Tesla CEO Elon Musk, Google DeepMind co-founder Mustafa Suleyman, and Universal Robots founder Esben Østergaard are among the 116 founders of robotics and artificial intelligence companies who signed a recent open letter to the UN asking for a ban on lethal autonomous weapons systems (LAWS), more colloquially known as "killer robots."
The letter reads in part:
“Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”
The letter, which represents the first time representatives of the robotics and AI industry have voiced a stance on LAWS, was released at the opening of the International Joint Conference on Artificial Intelligence (IJCAI 2017) in Melbourne, Australia. It came in the wake of the cancellation of a UN meeting of a group of governmental experts (GGE) to discuss LAWS that was to take place from August 21-25. The meeting was canceled because some states had failed to pay their financial contributions to the UN. The letter calls on the High Contracting Parties to redouble their efforts at the meeting, which has been rescheduled for November 13-17.
|The Modular Advanced Armed Robotic System (MAARS) by QinetiQ is just one of a growing number of autonomous weapons being actively developed or deployed on the battlefield. (Image source: QinetiQ)|
The letter was organized by Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales in Sydney, Australia, who also headed up a 2015 open letter calling for a ban on autonomous weapons. To date, that letter has been signed by over 20,000 people, including AI and robotics researchers as well as Stephen Hawking, Steve Wozniak, and Noam Chomsky.
“Nearly every technology can be used for good and bad, and artificial intelligence is no different,” Walsh said in a statement regarding the 2017 letter. “It can help tackle many of the pressing problems facing society today: inequality and poverty, the challenges posed by climate change, and the ongoing global financial crisis. However, the same technology can also be used in autonomous weapons to industrialize war. We need to make decisions today choosing which of these futures we want.”
The UN first convened a meeting on LAWS in 2014, but since then there has been no progress toward any actual ban or regulation targeted specifically at autonomous weapons. The issue was first brought to international attention in 2012 with the release of a report from Human Rights Watch and Harvard Law School's International Human Rights Clinic titled Losing Humanity: The Case Against Killer Robots. The report recommends three actions for the international community to take against LAWS:
"Prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument."
"Adopt national laws