Will Google respect the 3 laws of robotics?

… it will then be a matter of human decision-making, of ethical and technological choices; and with Google at the center of the equation, it would take a clever mind to predict whether the right card will be played.

"Two things fill the mind with ever new and increasing admiration and awe, the oftener and more steadily we reflect on them: the starry heavens above me and the moral law within me." Through this long sentence by Immanuel Kant it is possible to approach the future challenges of science, as filtered through futurism, in two important areas of technological development: space exploration and robotics. The first expresses a superhuman desire to explore the horizon, one that will eventually drive the species out of its homeland, whether or not that proves necessary for its survival. And exploration is unique in that it carries with it the unknown and the unsettling as much as the desired: we will go, but where we will go we do not yet know, nor whether humanity, its laws, its customs and its habits will survive the journey.

Robotics, on the other hand, is far more tangible and far more mundane, almost invisible even to the attentive observer of his time. Machine improvements have become so commonplace that when the escalator at the Mairie de Montreux station speeds up, or when new transparent doors at the Navigo terminal open in an elegant silence worthy of science fiction, the imagination still prefers the spectacular to the quiet mechanics actually at work. Street furniture is not the only technical achievement ignored on a daily basis: communication has become instantaneous and omnipresent, and kilometers have lost their power of mental separation. Voice commands can be given to a smartphone, which carries them out more or less efficiently. We have forgotten how automated the kitchen has become; it has been a long time since anyone marveled at the microwave or the various appliances programmed to prepare meals without fuss.

Three basic laws

These technological advances, which would have astonished people a century ago, are miraculous arrivals of 2014 that have become familiar, almost invisible to the eye that sees them every day without ever questioning how they work. Yet that same eye is far more critical of the robotics of the imagination: the robot that assembles electronic devices on the production line, the one that drives autonomously over rough terrain, the one that kills from the sky, the one that takes humanoid form to assist people where they reach their limits; and finally, the one that dares to bear the name of artificial intelligence.

The first of these laws states that a robot “may not injure a human being or, through inaction, allow a human being to come to harm”

And yet the proto-robots of everyday life (pressure cookers, iPhones, escalators, automatic ticket punchers) are no less robotic than the models to come. This is why ethical questions have been urgently raised by researchers and philosophers who have realized that robots are not a future but a present; and this is why it would be wise to understand how the future of this science is being played out in the shadow of takeover bids and financial markets.

When the robot psychologist Susan Calvin, the protagonist of many of Isaac Asimov’s short stories, is asked about the dangers of robots, her answer is almost always the same: a robot cannot be dangerous. It cannot, because the deepest level of its brain, the core of its thinking, is programmed to obey the three laws of robotics, which are essential for establishing a healthy relationship between robots and humans. The first of these laws states that a robot “may not injure a human being or, through inaction, allow a human being to come to harm”. The second law states that a robot “must obey the orders given to it by human beings, except where such orders would conflict with the First Law”. Finally, the third law requires that a robot “protect its own existence as long as such protection does not conflict with the First or Second Law”.
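Read as a decision rule, the three laws form a strict lexicographic priority: the First Law always outranks the Second, which outranks the Third. A minimal sketch in Python (all names and flags here are hypothetical illustrations, not drawn from Asimov) of how that ordering could select the least objectionable action:

```python
from typing import List, NamedTuple

class Action(NamedTuple):
    """A candidate action, scored against each of the three laws."""
    name: str
    harms_human: bool      # First Law: would this action (or inaction) harm a human?
    disobeys_order: bool   # Second Law: does it disobey a human order?
    endangers_self: bool   # Third Law: does it endanger the robot's own existence?

def choose(actions: List[Action]) -> Action:
    # Lexicographic comparison of boolean tuples enforces the priority:
    # First Law violations dominate Second Law violations, which dominate Third.
    return min(actions, key=lambda a: (a.harms_human, a.disobeys_order, a.endangers_self))

options = [
    Action("obey order, harming a bystander", harms_human=True, disobeys_order=False, endangers_self=False),
    Action("refuse the order, stay safe", harms_human=False, disobeys_order=True, endangers_self=False),
    Action("obey order at cost of own chassis", harms_human=False, disobeys_order=False, endangers_self=True),
]
print(choose(options).name)  # -> obey order at cost of own chassis
```

The tuple ordering makes the hierarchy explicit: sacrificing itself (Third Law) is preferable to disobedience (Second Law), and both are preferable to harming a human (First Law).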

In the short story “Evidence”, Calvin echoes Kant, extending the definition of the three laws, or at least giving these principles a humanist foundation. “The three laws constitute the essential guiding principles of a good many of the world’s ethical systems,” she says. “Of course, every human being is supposed, in principle, to have the instinct of self-preservation. […] Likewise, every good human being with a social conscience and a sense of responsibility must defer to established authority: his doctor, his boss, his government, his psychiatrist, his colleagues, even when they interfere with his comfort or his safety.” These first two laws thus cover the most basic moral principles, and Dr. Calvin concludes: “Every good human being must love his neighbor as himself, risking his own life to save another.” If anyone behaves that way, “he may be a robot, but he may also simply be a very good man.”

This very brief passage in Asimov’s work takes on particular importance if one considers how well the author anticipated the real challenges of current and future robotics. Susan Calvin’s tirade not only assumes that robotics must rest on ethical principles; it clearly implies that it is humans, good or not, who will decide how robotics fits into the future of humanity. One only has to step aside and watch a saga like Star Wars, a series like Doctor Who, or a universe like that of Terminator to understand that robots are not systematically driven by universal values of good.

Skynet has outgrown its creator and seeks to destroy it; the Trade Federation’s droids obey the order to kill the Jedi and anything that stands in their way; and let us barely mention the Cybermen, whose only philosophy is the complete assimilation of humanity into their great robotic empire.

In the military field, the question of drone responsibility has already arisen: ethics committees fight to keep the decision to open fire in human hands, while the total automation of combat sometimes tempts military staff and representatives of arms companies. But this borderline case, which pushes the permission to kill to the point of challenging the morality of a robot facing a firing order, appears only in war, where, by sad definition, the rules of civil society are suspended. To focus only on this domain is to downplay the importance of robotics in the development of civilian technology, and to assume that the reflections applicable to military drones cannot be extended to a wider field, one that concerns humanity as a whole.

Two things fill the mind with ever new and increasing admiration and awe, the oftener and more steadily we reflect on them: the starry heavens above me and the moral law within me.

Critique of Practical Reason, Immanuel Kant
