ARGUMENT: AUTONOMOUS-WEAPONS MYTHS

Autonomous Weapon Systems: No Human-in-the-Loop Required, and Other Myths Dispelled

Published 28 May 2025

“The United States has a strong policy on autonomy in weapon systems that simultaneously enables their development and deployment and ensures they could be used in an effective manner, meaning the systems work as intended, with the same minimal risk of accidents or errors that all weapon systems have,” Michael Horowitz writes.

When Pentagon officials, the think tank world, and various world leaders refer to autonomous weapon systems, they often cite a U.S. military policy requirement that doesn’t even exist.

Michael C. Horowitz writes in War on the Rocks that Autonomy in Weapon Systems, Department of Defense Directive 3000.09, published in 2012 and updated in 2023, governs the Pentagon’s deployment and use of semi-autonomous and autonomous weapon systems.

Horowitz writes:

An autonomous weapon system is a weapon system that, once activated, can select and engage targets without further intervention by an operator. A semi-autonomous weapon system is something like the precision-guided weapons of today. Most prominently, the policy requires that, for some kinds of autonomous weapon systems, senior Defense Department leaders have to do two extra rounds of review, on top of the usual checks all weapon systems go through. This happens once before the system is approved to enter the acquisition pipeline and again before it’s used in the field. The reviews use a simple checklist, based on rules that already exist, to make sure any proposed autonomous weapon system works as it should and follows U.S. law.

Unfortunately, there are myths about current U.S. policy on autonomy in weapon systems that are creating imaginary — and then real — barriers to the U.S. military developing and deploying greater autonomy. And I should know, since the office I worked in at the Pentagon rewrote the updated directive during the Biden administration.

Horowitz highlights the three main myths:

• Myth #1: Fully Autonomous Weapon Systems Are Prohibited

• Myth #2: Humans Must Be in the Tactical Loop

• Myth #3: There Are Limits on Research and Development, Prototyping, and Experimentation on Autonomous Weapon Systems

Horowitz concludes:

The United States has a strong policy on autonomy in weapon systems that simultaneously enables their development and deployment and ensures they could be used in an effective manner, meaning the systems work as intended, with the same minimal risk of accidents or errors that all weapon systems have. Department of Defense Directive 3000.09 should reinforce confidence that any autonomous weapon systems the U.S. military develops and fields would enhance the capabilities of the military and comply with international humanitarian law and the law of armed conflict. Addressing these myths can help turn that into a reality.

The Trump administration could, of course, decide to revise or even replace the directive, but at present it still governs policy on autonomy in weapon systems. Currently, policy requires additional review of some kinds of autonomous weapon systems, but does not prohibit anything or require a human in the loop. Instead, the requirements in the directive are an aggregation of the requirements that all weapon systems need to meet to ensure they can be used effectively in ways that enhance the ability of the United States military to achieve its objectives in a war. Thus, following the requirements does not place an undue burden on any military service that wishes to develop an autonomous weapon system. The service just needs to prove the system can be used effectively and legally, like any other weapon system.

However, these continuing misinterpretations of Department of Defense policy threaten to undermine the adoption of autonomy in weapon systems with responsible speed. Moving forward, the Department of Defense should more clearly communicate to its stakeholder communities that defense policy does not prohibit or restrict autonomous weapon systems of any sort. It only requires that some autonomous weapon systems go through an additional review process on top of the reviews that all weapon systems are required to undergo.

The Department of Defense should also direct officials across the services to discuss the importance of human responsibility for the use of force, rather than the need for a human in the loop, given the way the conflation of tactical and operational loops can quickly lead to confusion.

In addition, the existence of the directive provides a reminder to senior leaders to take an extra look at autonomous weapon systems that might otherwise raise eyebrows or where operators might have initial hesitation about using them. By ensuring that capabilities go through the review process, the Department of Defense can increase trust and confidence among warfighters in ways that would make their end use, if needed, more effective.

Finally, the directive also sends a strong signal internationally. In concert with the Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy, the directive provides a role model for capacity building as countries make their own policy decisions about incorporating autonomy into their weapon systems, building on lessons learned from the Russo-Ukrainian War or elsewhere.

 
