Killer robots: ban negotiations falter in Geneva

Switzerland is at the forefront of robotics and artificial intelligence, two fields of research whose results can be used for both civilian and military purposes – but for which there are no international controls.

This content was published on June 23, 2022 – 09:45

According to a United Nations report, the Libyan government used a Kargu-2 quadcopter during the civil war in March 2020. The drone “hunted down” a human target without having received orders to do so. For the first time, a lethal autonomous weapon – a killer robot – had been used.

Developed using robotics and artificial intelligence (AI), such weapon systems require no human intervention. Autonomous drones, for example, are programmed to fly to a specific location, select an object and destroy the target without human control. The Libyan incident proves that killer robots can operate autonomously. Yet unlike weapons of mass destruction, there are no specific international treaties or measures that prohibit or regulate these weapons and technologies.

What controls exist for weapons of mass destruction?

Weapons of mass destruction are weapons with far greater destructive power than conventional weapons. They include nuclear, biological and chemical weapons (ABC weapons), which are capable of killing large numbers of people and destroying the environment within a very short time.

For weapons of mass destruction, international law provides disarmament and non-proliferation treaties, among them the Nuclear Non-Proliferation Treaty, the Biological Weapons Convention and the Chemical Weapons Convention. Their goal is to prevent the proliferation of nuclear weapons and to ban biological and chemical weapons worldwide.

There are also four politically binding multilateral regimes under which participating states coordinate and adjust their export controls: the Nuclear Suppliers Group (NSG), the Australia Group (AG), the Missile Technology Control Regime (MTCR) and the Wassenaar Arrangement (WA). Switzerland participates in all four.

Source: SECO


Opinions differ on whether this constitutes a gap in the law. When asked, the Federal Department of Foreign Affairs (FDFA) replied that international humanitarian law applies to all weapons and technologies, including new types of weapons systems such as autonomous systems. “So there is no legal vacuum regarding the use of robotics, artificial intelligence and other digital technologies in armed conflict.”

But this view is not shared unanimously by the international community. “Some states feel that existing law is not enough,” says Laura Bruun, a specialist in new military and security technologies at the Stockholm International Peace Research Institute (SIPRI). International humanitarian law certainly applies to all types of weapons, but the use of AI-controlled military technology is not explicitly addressed. Depending on how the law is interpreted, there may be a legal void, experts believe.

Laura Bruun adds that EU and UNESCO rules on the ethical use of AI concern civilian, not military, applications. With the advance of new technologies such as artificial intelligence, it is becoming increasingly difficult to distinguish between civilian and military uses. Moreover, these technologies are very easy to disseminate – for example via email or open-source AI software – which further complicates control efforts.

“Of course, international humanitarian law applies to the use of such weapons, but new rules of international law are needed to take account of new technologies,” says Elisabeth Hoffberger-Pippan, a security researcher and legal expert at the German Institute for International and Security Affairs in Berlin.

Deadlocked negotiations in Geneva

A possible ban on autonomous weapons systems has been under discussion at the UN in Geneva since 2017. Switzerland supports the negotiations: while it opposes a total ban, it favours regulation, control and restrictions. Last year the Swiss mission to the UN joined a group of countries calling for legally binding measures to control lethal autonomous weapons.

But things are not moving forward: Russia rejects almost any proposal for controls, and even boycotted the last round of talks in March because of the war in Ukraine. Israel, the United States, Turkey, Britain and South Korea likewise do not want binding controls on autonomous weapons systems, arguing that international humanitarian law is sufficient to ensure their responsible use.

The UN committee’s last meeting will take place in July, and experts do not expect major progress. Behind closed doors, states are already speaking of the failure of the Geneva talks. When asked, the FDFA also indicated that there is currently no agreement between states on an international instrument.

“It is likely that not all states will continue to support the Geneva process, because it is not the only avenue,” says Elisabeth Hoffberger-Pippan. She therefore expects that alternative bodies will be asked to discuss rules on autonomous weapons systems.

Why don’t states want a ban?

If Switzerland, like most states, does not want a complete ban on autonomous weapons, it is for both economic and diplomatic reasons, explains Stephen Herzog of the Center for Security Studies at the Swiss Federal Institute of Technology (ETHZ) in Zurich. Switzerland fears that its exports would suffer: the country is, after all, at the forefront of global advances in robotics and artificial intelligence.

Elisabeth Hoffberger-Pippan only partly understands this fear. For the moment, the question is one of controlling the use of autonomous weapons systems under international law, not of export controls, the researcher explains. But in her view, many countries’ fear that a complete ban would hamper research is justified: “Investors will ask: what is the point of spending money if the resulting innovations cannot be used?” A real challenge, especially for the United States, but also for many other major military powers.

The United States argues that autonomous weapons should be tested before they can be banned: only then could it be determined whether these weapons can actually be put to good use. Some countries even see advantages in autonomous weapons: the party possessing them can reduce its casualties and personnel costs.

An example of a life-saving application: during an exercise at the criminal police office in Baden-Württemberg, a robot defuses a fake explosive device. Keystone / Marijan Murat

The Swiss government also opposed a complete ban in 2017 for similar reasons. According to Bern, a ban would risk prohibiting potentially useful systems, such as those that help avoid collateral damage among the population. That is why, according to Laura Bruun, discussions on controlling civilian and military applications need to go hand in hand. “Acknowledging that the line between these two uses is becoming increasingly blurred would be the first step towards technology control.”

Elisabeth Hoffberger-Pippan has noticed a paradigm shift with regard to drones: although they were once heavily criticised, they are increasingly accepted internationally, even by the public. In the war in Ukraine, for example, Ukrainian soldiers have used civilian drones on a large scale alongside military drones, giving them an unexpected advantage against Russia. Admittedly, the use of drones in the fight against terrorism remains a very delicate issue, both legally and morally, and Russia’s war of aggression in Ukraine is only partially comparable. But the example shows that drones are not always rejected by society. There may also be opportunities for the judicious use of weapons systems that operate largely autonomously.

“Times are changing and militaries are moving towards modernisation and thus a better understanding of technology,” explains Elisabeth Hoffberger-Pippan. It is therefore quite possible that public opinion on autonomous weapons will evolve as well.

What is a dual use item?

When a product or technology can be used for both civilian and military purposes, it is called dual-use. The problem: an invention such as nuclear technology can, on the one hand, bring civilian benefits in the form of nuclear power plants (themselves controversial) or medical treatments, and on the other hand destroy lives. This does not mean banning the technology and research in the field altogether, but it does require a responsible approach.

The Wassenaar Arrangement is a grouping of states whose purpose is to help prevent destabilising accumulations of conventional weapons and dual-use goods.

Source: SECO


According to JTI standards


Further: Certified by SWI Journalism Trust Initiative
