A report debunks the ‘killer robot’ fantasy

The killer robot, embodied in the figure of the Terminator, fuels an imagination caught between immortality, transhumanism and apocalyptic visions. This lurid fantasy continues to dominate the debate over the autonomy of weapons systems. In the name of morality and law, civil society condemns the opening of a Pandora’s box, predicts the arrival of robots on the battlefield, and calls for a preventive ban on lethal autonomous weapons systems (SALA, by their French acronym).

Analysis. Lethal robots are not yet subject to any restrictions

For their part, arms manufacturers and the military have long been working on the use of artificial intelligence and robotics in weapons systems. Automation already applies to a variety of functions, such as navigation, observation, reconnaissance, target acquisition and fire control. Depending on a weapon system’s level of automation, a continuum runs from a fully tele-operated armed system to an autonomous armed system operating without human supervision, the latter still being a matter of science fiction.

Semantic confusion

In a study published on Tuesday, May 24 (1), Laure de Rochegonde, a researcher at the French Institute of International Relations (IFRI), urges readers to “move beyond any kind of Manichaeism”, while noting “the existence of two parallel debates” that are increasingly disconnected. The first, moral and political, concerns the preventive regulation or prohibition of SALA. The second, technical and military, concerns the possible and desirable degree of effective autonomy. The semantic confusion surrounding the very concept of autonomy, moreover, is detrimental to a rational approach.


“Fiction suggests that an autonomous weapons system could ‘self-select’ its targets and set its own missions beyond human control,” the researcher stresses. Yet according to experts, it is rather a form of human-supervised autonomy that should emerge in the coming decades where firing decisions are concerned. “Technological progress will be such that we will be able to do without humans in the loop,” Laure de Rochegonde emphasizes. “With the accelerating pace of warfare, human intelligence will no longer be fast enough to respond adequately.”

An intermediate category

In France, the ethics committee established in 2019 by Florence Parly, then Minister of the Armed Forces, drew a distinction between SALA, a red line set by the authorities that rules out their development and use, and an intermediate category of weapons, termed “lethal weapons systems integrating autonomy” (SALIA). This opens the way for weapons that could indeed kill but would be “incapable of acting alone without human control, of changing their rules of engagement or of taking lethal initiatives”.

Three programmes involve autonomous weapons systems: the Future Combat Air System (SCAF), the future battle tank (MGCS) and the future mine countermeasures system (SLAM-F). Like Israel, South Korea, Turkey, Iran, Pakistan, the United Kingdom and Estonia, France also intends to make progress in this field.

Multilateral approach

On the regulatory front, Paris advocates a multilateral approach within the UN framework of the Convention on Certain Conventional Weapons (CCW). France holds a middle position between the “disarmer” states in favour of a preventive ban on SALA, allied with the NGO coalition behind the “Stop Killer Robots” campaign, and states favouring laissez-faire, such as the United States.

In contrast, other states advocate an ad hoc process, considered more conducive to a preventive ban, along the lines of the Ottawa Convention banning anti-personnel mines and the Oslo Convention on cluster munitions.
