UN mulling rules to govern autonomous killer robots

Published 16 May 2014

On Tuesday, delegates from several international organizations and governments around the world began the first of many rounds of talks dealing with what some call “lethal autonomous weapons systems” (LAWS), and others call “killer robots.” Supporters of LAWS say the technology offers life-saving potential in warfare, as these robots are able to get closer than troops to assess threats without letting emotions interfere in their decisions. This is precisely what concerns critics of the technology. “If we don’t inject a moral and ethical discussion into this, we won’t control warfare,” said one of them.

“I urge delegates to take bold action,” said Michael Moeller, head of the UN Conference on Disarmament. “All too often international law only responds to atrocities and suffering once they have happened. You have the opportunity to take pre-emptive action and ensure that the ultimate decision to end life remains firmly under human control,” he said.

According to Agence France-Presse, the four days of discussions in Geneva have focused on technological developments, along with the ethical and sociological issues that must be addressed as military researchers will soon be able to design robots capable of terminating targets without human control. Participants in the meeting included members of UN organizations, the International Committee of the Red Cross, and non-governmental organizations.

In a November 2013 report by UN secretary-general Ban Ki-moon, titled Protection of Civilians in Armed Conflict, Ban raised questions about the ability of killer robots to operate in accordance with international humanitarian and human rights law. “Is it morally acceptable to delegate decisions about the use of lethal force to such systems? If their use results in a war crime or serious human rights violation, who would be legally responsible? If responsibility cannot be determined as required by international law, is it legal or ethical to deploy such systems?” Ban added, “although autonomous weapons systems have not yet been deployed and the extent of their development as a military technology remains unclear, discussion of such questions must begin immediately and not once the technology has been developed and proliferated.”

The meeting in Geneva is expected to be the first in a series of measures taken to address the use of killer robots on the battlefield. Experts predict that military researchers could produce the first generation of killer robots within twenty years. “Lethal autonomous weapons systems are rightly described as the next revolution in military technology, on par with the introduction of gunpowder and nuclear weapons,” Pakistan’s UN ambassador Zamir Akram told the meeting.

Some diplomats at the meeting agree that the goal is not to ban the rapidly advancing technology, but to set standards on legitimate uses. “We need to keep in mind that these are dual technologies and could have numerous civilian, peaceful and legitimate uses. This must not be about restricting research in this field,” said French ambassador Jean-Hugues Simon-Michel, chairman of the talks. Similar technology is being developed for fire-fighting and bomb disposal.

Supporters of robot weapons say the technology offers life-saving potential in warfare, as they are able to get closer than troops to assess threats without letting emotions interfere in their decisions. This is precisely what concerns critics. “If we don’t inject a moral and ethical discussion into this, we won’t control warfare,” said Jody Williams, winner of the 1997 Nobel Peace Prize for her campaign for a land-mine ban treaty.