Can robots commit war crimes?

Published 29 February 2008

As the move toward autonomous killing machines continues (robots that spot, identify, and kill on their own, without human intervention), questions arise about the moral, ethical, and legal aspects of this trend

We wrote a few days ago about the coming of decision-making killer robots, that is, robots capable of spotting, identifying, and killing human beings (enemy soldiers and terrorists, it is hoped) on their own. There is disagreement, though, about what, exactly, an “autonomous killing machine” is. A participant at a recent conference on technology and war said that a landmine could be regarded as such a machine.

As we ponder the definition of an autonomous killing machine, we may want to consider this question as well: Can robots commit war crimes? Barrister and engineer Chris Elliot explained his thoughts on the legality of future “intelligent” weapons under international, criminal, and civil law. He started by suggesting that as systems become more autonomous, they become capable of actions that are not, in legal terms, “foreseeable.”

At that point, he suggested, it would be hard to blame a human for the machine’s actions. “We’re getting very close to the point where the law may have to recognize that we can’t always identify an individual; perhaps an artificial system can be to blame.”

After that provocative suggestion, he said that it would currently be illegal for any state to deploy a fully autonomous system. “Weapons intrinsically incapable of distinguishing between civilian and military targets are illegal,” he said.

Only when war robots can pass a “military Turing test” could they legally be let off the leash, Elliot added. “That means an autonomous system should be no worse than a human at taking decisions [about valid targets].” The original Turing test uses conversation to see whether a human can tell man from machine; this test would instead use decisions about legitimate targets.

How such a test might be administered, Elliot didn’t say. “Unless we reach that point, we are unable to [legally] deploy autonomous systems. Legality is a major barrier.”
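Elliot offered no protocol, but his criterion, a system “no worse than a human at taking decisions [about valid targets],” suggests one simple reading: score machine and human decisions against the same set of legally labeled scenarios and require the machine’s error rate not to exceed the human’s. The Python sketch below illustrates only that reading; the Scenario record, the decision labels, and the pass rule are our assumptions, not anything Elliot or any legal body has specified.

```python
# Hypothetical sketch of how a "military Turing test" might be scored.
# The Scenario records, decision labels, and pass criterion are all
# illustrative assumptions, not a real or proposed protocol.
from dataclasses import dataclass


@dataclass
class Scenario:
    description: str
    lawful_target: bool  # ground truth under the laws of armed conflict


def error_rate(decisions: list[bool], scenarios: list[Scenario]) -> float:
    """Fraction of scenarios where the engage/hold decision was wrong."""
    wrong = sum(d != s.lawful_target for d, s in zip(decisions, scenarios))
    return wrong / len(scenarios)


def passes_military_turing_test(machine: list[bool],
                                human: list[bool],
                                scenarios: list[Scenario]) -> bool:
    """Elliot's criterion read literally: the autonomous system must be
    no worse than the human baseline at identifying valid targets."""
    return error_rate(machine, scenarios) <= error_rate(human, scenarios)


# Toy usage: the machine matches the human on two scenarios and improves
# on the third, so it clears the "no worse than a human" bar.
scenarios = [
    Scenario("armed combatant in the open", True),
    Scenario("ambulance marked with a red cross", False),
    Scenario("civilian carrying farm tools", False),
]
human_decisions = [True, False, True]     # human wrongly engages scenario 3
machine_decisions = [True, False, False]
print(passes_military_turing_test(machine_decisions, human_decisions, scenarios))  # True
```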