By Bayley Weese
Humans conceived the idea of artificial life as far back as 1818, when Mary Shelley published her novel Frankenstein, about a famous monster created from an unorthodox science experiment. We have come a long way since then, and our fascination with artificial intelligence and robots has only increased. In the world today, both technology and robots surround us – aiming to improve both the quality and the efficiency of the lives we live. Just this past month, the European Parliament’s legal affairs committee considered what these advances in robotic technology could mean in the legal world.
The legal affairs committee described the need to create a specific legal status for robots called “electronic persons,” and discussed the necessity of incorporating “kill switches” into their designs. The committee goes as far as to suggest creating an entire European agency that would be in charge of responding to technological changes and advancements in robotics. Europe considers addressing a robot’s legal status imperative, recommending that “the most sophisticated autonomous robots could be established as having the status of electronic persons with specific rights and obligations, including that of making good any damage they cause.” Although making robots legal persons may appear to equate them with humans, the committee makes it clear that this is not so – declaring outright in its report that a robot is not a human and will never be a human.
The United States, on the other hand, has made no advancements toward recognizing robots as legal entities – but it should. With the rise of both technology and artificial life, robots will very likely be humanity’s first interaction with intelligent, non-human life forms, and despite being of our own creation, these robots may have the capacity to outdo humans both mentally and physically. While the European proposal gives “rights and obligations” to these new electronic persons, it does not specify what rights and obligations are included beyond an obligation to fix any damage they cause. Because Europe is not outfitted with a constitution similar to that of the United States, it may not have to consider what exact rights and obligations robots would have, while the United States would and should.
The United States Constitution provides the American people with a variety of rights, such as the right to free speech and the right to due process. But which ones should be afforded to robots if the United States were to adopt such a legal policy? The European model calls for an obligation for an electronic person, or robot, to make right any wrongs it has caused. This does not translate neatly to the United States: before persons here can be made to make right any wrongs they have caused, they are entitled to a fair trial, to have counsel appointed, and more – rights that do not necessarily exist in the European model. Because there would be so much more to consider in the United States, we should follow Europe’s lead and begin the discussion around artificial intelligence and robots becoming legal entities sooner rather than later.
Overall, Europe is making the right decision in choosing to discuss and enact legal policy and laws regarding the legal status of robots with artificial intelligence. Technology will only continue to expand in the immediate future, and eventually humanity will be faced with robots and artificial intelligence it could never have dreamed of. Because the United States legal system innately comes with more rights and obligations than European systems, it would be wise for us to begin thinking about robotics in legal terms – before it’s too late.
Bio: Bayley is a 2L at Suffolk University Law School with a concentration in Trial and Appellate Advocacy. She is a Staff Member of the Journal of High Technology Law and a Team Member of Suffolk’s National Health Law Moot Court Trial Team.
Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.