Germany Warns: AI Arms Race Already Underway

Research by the Center for Strategic and International Studies showed that Azerbaijan had a massive edge in loitering munitions, with more than 200 units of four sophisticated Israeli designs. Armenia had a single domestic model at its disposal.

Other militaries took note.

“Since the conflict, you could definitely see a certain uptick in interest in loitering munitions,” said Franke. “We have seen more armed forces around the world acquiring or wanting to acquire these loitering munitions.”

Drone Swarms and “Flash Wars”
This is just the beginning. Looking ahead, AI-driven technologies such as swarming will come into military use — enabling many drones to operate together as a lethal whole.

“You could take out an air defense system, for example,” said Martijn Rasser of the Center for a New American Security, a think tank based in Washington, D.C.

“You throw so much mass at it and so many numbers that the system is overwhelmed. This, of course, has a lot of tactical benefits on a battlefield,” he told DW. “No surprise, a lot of countries are very interested in pursuing these types of capabilities.”

The scale and speed of swarming open up the prospect of military clashes so rapid and complex that humans cannot follow them, further fueling an arms race dynamic.

As Ulrike Franke explained: “Some actors may be forced to adopt a certain level of autonomy, at least defensively, because human beings would not be able to deal with autonomous attacks as fast.”

This critical factor of speed could even lead to wars that erupt out of nowhere, with autonomous systems reacting to each other in a spiral of escalation. “In the literature we call these ‘flash wars’,” Franke said, “an accidental military conflict that you didn’t want.”

A Move to “Stop Killer Robots”
Bonnie Docherty has made it her mission to prevent such a future. A Harvard Law School lecturer, she is an architect of the Campaign to Stop Killer Robots, an alliance of nongovernmental organizations demanding a global treaty to ban lethal autonomous weapons.

“The overarching obligation of the treaty should be to maintain meaningful human control over the use of force,” Docherty told DW. “It should be a treaty that governs all weapons operating with autonomy that choose targets and fire on them based on sensor inputs rather than human inputs.”

The campaign has been focused on talks in Geneva under the umbrella of the UN Convention on Certain Conventional Weapons, which seeks to control weapons deemed to cause unjustifiable suffering.

It has been slow going. The process has yielded a set of “guiding principles,” including that autonomous weapons remain subject to international humanitarian law, and that humans bear ultimate responsibility for their use. But these simply form a basis for more discussions.

Docherty fears that the consensus-bound Geneva process may be thwarted by powers that have no interest in a treaty.

“Russia has been particularly vehement in its objections,” Docherty said.

But it’s not alone. “Some of the other states developing autonomous weapon systems such as Israel, the US, the United Kingdom and others have certainly been unsupportive of a new treaty.”

Time for a Rethink?
Docherty is calling for a new approach if the next round of Geneva talks due later this year makes no progress. She has proposed “an independent process, guided by states that actually are serious about this issue and willing to develop strong standards to regulate these weapon systems.”

But many are wary of this idea. Germany’s foreign minister has been a vocal proponent of a ban, but he does not support the Campaign to Stop Killer Robots.

“We don’t reject it in substance — we’re just saying that we want others to be included,” Heiko Maas told DW. “Military powers that are technologically in a position not just to develop autonomous weapons but also to use them.”

Maas does agree that a treaty must be the ultimate goal. “Just like we managed to do with nuclear weapons over many decades, we have to forge international treaties on new weapons technologies,” he said. “They need to make clear that we agree that some developments that are technically possible are not acceptable and must be prohibited globally.”

What Next?

But for now, there is no consensus. For Franke, the best the world can hope for may be norms around how technologies are used. “You agree, for example, to use certain capabilities only in a defensive way, or only against machines rather than humans, or only in certain contexts,” she said.

Even this will be a challenge. “Agreeing to that and then implementing that is just much harder than some of the old arms control agreements,” she said.

And while diplomats tiptoe around these hurdles, the technology marches on.

“The world must take an interest in the fact that we’re moving toward a situation with cyber or autonomous weapons where everyone can do as they please,” said Maas. “We don’t want that.”

For more, watch the full documentary Future Wars on YouTube.

Richard Walker is Chief International Editor at DW. This article is published courtesy of Deutsche Welle (DW).