February 06, 2023
NOTEWORTHY: DoD Autonomous Weapons Policy
Originating Component: Office of the Under Secretary of Defense for Policy
Effective: January 25, 2023
Releasability: Cleared for public release. Available on the Directives Division Website at https://www.esd.whs.mil/DD/.
Reissues and Cancels: DoD Directive 3000.09, “Autonomy in Weapon Systems,” November 21, 2012
Approved by: Kathleen H. Hicks, Deputy Secretary of Defense
Purpose: This directive:
• Establishes policy and assigns responsibilities for developing and using autonomous and semi-autonomous functions in weapon systems, including armed platforms that are remotely operated or operated by onboard personnel.
The old Directive said, “manned and unmanned platforms.” This is a slight change, both in substance and style. The new specification of “armed platforms” clarifies that the Directive only applies to armed systems, not unarmed vehicles, such as a surveillance drone that does not carry weapons. The terms “unmanned” and “manned” are scrapped in favor of “remotely operated” and “operated by onboard personnel,” respectively.
The Directive has many minor changes like this throughout. I won’t flag all of them; I’ll limit my comments to the most significant changes. For a side-by-side comparison of the new and old versions, see https://draftable.com/compare/eiPezMBRKKXB
• Establishes guidelines designed to minimize the probability and consequences of failures in autonomous and semi-autonomous weapon systems that could lead to unintended engagements.
• Establishes the Autonomous Weapon Systems Working Group.
The new Directive establishes an “Autonomous Weapon Systems Working Group,” whose responsibilities are delineated in section 5. One of the main purposes of the working group is to support the review and approval process, including advising parts of DoD on whether a weapon system requires approval. This addition puts the bureaucratic machinery in place to support the review process.
SECTION 1: GENERAL ISSUANCE INFORMATION
1.1. APPLICABILITY.
a. This directive applies to:
(1) OSD, the Military Departments, the Office of the Chairman of the Joint Chiefs of Staff (CJCS) and the Joint Staff, the Combatant Commands, the Office of Inspector General of the Department of Defense, the Defense Agencies, the DoD Field Activities, and all other organizational entities within the DoD.
(2) The design, development, acquisition, testing, fielding, and employment of autonomous and semi-autonomous weapon systems, including guided munitions that are capable of automated target selection.
(3) The application of lethal or non-lethal, kinetic or non-kinetic, force by autonomous or semi-autonomous weapon systems.
b. This directive does not apply to:
(1) Autonomous or semi-autonomous cyberspace capabilities.
(2) Unarmed platforms, whether remotely operated or operated by onboard personnel, and whether autonomous or semi-autonomous.
(3) Unguided munitions.
(4) Munitions manually guided by the operator (e.g., laser- or wire-guided munitions).
(5) Mines.
(6) Unexploded explosive ordnance.
(7) Autonomous or semi-autonomous systems that are not weapon systems.
1.2. POLICY.
a. Autonomous and semi-autonomous weapon systems will be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.
This language has *not* changed. The phrase “appropriate levels of human judgment” has been a centerpiece of the U.S. government’s position in international discussions on autonomous weapons at the United Nations. It would have been very significant if the DoD had changed this language, but they didn’t.
(1) Systems will go through rigorous hardware and software verification and validation (V&V) and realistic system developmental and operational test and evaluation (T&E) in accordance with Section 3. Training, doctrine, and tactics, techniques, and procedures (TTPs) applicable to the system in question will be established. These measures will provide sufficient confidence that autonomous and semi-autonomous weapon systems:
This language changed from “ensure” to “provide sufficient confidence”, a softer requirement.
(a) Function as anticipated in realistic operational environments against adaptive adversaries taking realistic and practicable countermeasures.
(b) Complete engagements within a timeframe and geographic area, as well as other relevant environmental and operational constraints, consistent with commander and operator intentions. If unable to do so, the systems will terminate the engagement or obtain additional operator input before continuing the engagement.
This is new. The original language required only a time constraint on engagements. The new language adds a geographic constraint and leaves open the possibility of other environmental and operational constraints as well.
This paragraph means that the DoD cannot field autonomous weapons that are unbounded in time and geography.
(c) Are sufficiently robust to minimize the probability and consequences of failures.
(2) Consistent with the potential consequences of an unintended engagement or unauthorized parties interfering with the operation of the system, physical hardware and software will be designed with appropriate:
(a) System safety, anti-tamper mechanisms, and cybersecurity in accordance with DoD Instruction (DoDI) 8500.01 and Military Standard 882E.
(b) Human-machine interfaces and controls.
(c) Technologies and data sources that are transparent to, auditable by, and explainable by relevant personnel.
This is new. One of the new technological developments since the original Directive has been the deep learning revolution, which kicked off in 2012. Deep learning systems, which use deep neural networks, often have problems with transparency and explainability. This new language tracks requirements in the DoD AI Ethical Principles.
(3) For operators to make informed and appropriate decisions regarding the engagement of targets, the human-machine interface for autonomous and semi-autonomous weapon systems will:
(a) Be readily understandable to trained operators, such as by clearly indicating what actions operators need to perform and which actions the system will perform.
(b) Provide transparent feedback on system status.
(c) Provide clear procedures for trained operators to activate and deactivate system functions.
b. Persons who authorize the use of, direct the use of, or operate autonomous and semi-autonomous weapon systems will do so with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and applicable rules of engagement (ROE). The use of AI capabilities in autonomous or semi-autonomous weapon systems will be consistent with the DoD AI Ethical Principles, as provided in Paragraph 1.2.f.
This is new. One of the important DoD policy developments since 2012 was the establishment of the DoD AI Ethical Principles, which were released in 2020. Prior to this updated Directive, it was unclear how the autonomy Directive and the AI Ethical Principles related to one another. One of the things DoD had to do in the new Directive was answer that question, which this section does.
c. With the exception of systems intended to be used in a manner that falls within the policies in Paragraphs 1.2.d.(1) through 1.2.d.(4), autonomous weapon systems,
Now we get into the heart of the Directive. This section, 1.2.c, and the following one, 1.2.d, outline in broad strokes the review and approval process and which kinds of weapons must be reviewed. The essence of the review process is largely unchanged, but the way that the review process is explained has changed, presumably with the goal of increasing clarity.
including weapon systems with both autonomous and semi-autonomous modes of operation, must be approved by the Under Secretary of Defense for Policy (USD(P)), the Under Secretary of Defense for Research and Engineering (USD(R&E)), and the Vice Chairman of the Joint Chiefs of Staff (VCJCS) before formal development.
One of the organizational changes the new Directive had to account for was the split of AT&L into A&S and R&E, which happened in 2018. At its core, the Directive gives guidance to internal DoD components on what actions they need to take with respect to autonomy in weapons. That means that AT&L duties outlined in the old Directive needed to be divvied up between A&S and R&E.
In the old Directive, USD(AT&L) was one of the three approval authorities, alongside USD(P) and VCJCS, at both phases of the review process: before formal development and again before fielding. This section separates those responsibilities, with USD(R&E) being part of the approval process before formal development and USD(A&S) before fielding.
They must be approved again by the USD(P), the Under Secretary of Defense for Acquisition and Sustainment (USD(A&S)), and the VCJCS before fielding. These requirements for approval are supplementary to the requirements in other applicable policies and issuances. Autonomous weapon systems requiring these senior approvals in accordance with Section 4 of this directive before formal development and again before fielding include:
(1) Autonomous weapon systems that have not previously been reviewed and approved in accordance with this directive, including autonomous weapon systems that are modifications of an existing non-autonomous weapon system.
(2) Modified versions of previously approved autonomous weapon systems whose system algorithms, intended mission sets, intended operational environments, intended target sets, or expected adversarial countermeasures substantially differ from those applicable to the previously approved weapon systems so as to fall outside the scope of what was previously approved in the senior review. Such modified systems require a new senior review and approval before formal development and again before fielding.
This is new and a particularly consequential addition. This paragraph not only clarifies that previously approved weapons must be re-approved when modified, it also gives guidance about what kinds of modifications require re-approval.
d. The senior review described in Paragraph 1.2.c is not required for weapon systems intended to be used in the manner described in Paragraphs 1.2.d.(1) through 1.2.d.(4). These will be considered for approval in accordance with applicable policies and issuances, such as applicable issuances related to the Defense Acquisition System. Weapon systems that do not require the senior review provided in Paragraph 1.2.c are:
(1) Semi-autonomous weapon systems used to apply lethal or non-lethal, kinetic or non-kinetic, force without any modes of operation in which they are intended to function as an autonomous weapon system.
Many weapon systems, such as the Aegis combat system on U.S. Navy ships and the Army’s Patriot air and missile defense system, have modes of operation with varying levels of autonomy. This language is new and clarifies that if a weapon has an autonomous mode, it must be reviewed.
(2) Operator-supervised autonomous weapon systems used to select and engage materiel targets for local defense to intercept attempted time-critical or saturation attacks for:
“Operator-supervised” replaces “Human-supervised” in the old Directive. “Operator” is defined in the glossary as “A person who operates a platform or weapon system.”
The old Directive said, “targets, with the exception of selecting humans as targets.” In its place, the new Directive simplifies that language to “materiel targets”, the U.S. military’s term for equipment, not people. The substance remains the same. This section authorizes supervised autonomous weapons that target objects (for example, aircraft, vehicles, rockets, missiles) but not people.
(a) Static defense of installations with personnel, including networked defense where the autonomous weapon system is not co-located with the installation.
This is new and expands the set of supervised autonomous weapons that are permitted. The old Directive required that a supervised autonomous weapon be co-located with people, giving them physical control over the weapon (for example, the ability to manually shut it down in the event of a malfunction). Now, networked defense is permitted even if the weapon is not co-located with military personnel.
(b) Onboard and/or networked defense of platforms with onboard personnel.
(3) Operator-supervised autonomous weapon systems used to select and engage materiel targets for defending operationally deployed remotely piloted or autonomous vehicles and/or vessels.
This is new. Similar to the addition of networked defense in the previous section, this eliminates the requirement for humans to be co-located with the autonomous weapon. Under the new Directive, the DoD can place a supervised autonomous weapon, such as an Aegis, on an uncrewed (“remotely piloted or autonomous”) vehicle without any special approval. The old Directive required senior-level approval before doing so. Interestingly, the caveat “for local defense to intercept attempted time-critical or saturation attacks” (1.2.d.(2)) that applies to defending human-occupied installations and platforms is not included for supervised autonomous weapons defending uncrewed vehicles.
(4) Autonomous weapon systems used to apply non-lethal, non-kinetic force against materiel targets in accordance with DoDD 3000.03E.
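Taken together, Paragraphs 1.2.c and 1.2.d amount to a decision procedure for whether a weapon system triggers the senior review. As an illustrative sketch only (the Directive contains no such formalization, and every field and function name below is invented for clarity), the logic might be modeled as:

```python
# Illustrative model of the senior-review trigger in Paragraphs 1.2.c-1.2.d.
# All names are invented for this sketch; nothing like this appears in the
# Directive itself.

from dataclasses import dataclass

@dataclass
class WeaponSystem:
    autonomous: bool                  # has any autonomous mode of operation
    operator_supervised: bool         # an operator supervises engagements
    targets_materiel_only: bool       # selects only materiel (non-human) targets
    local_defense_role: bool          # local defense against time-critical or saturation attacks
    defends_uncrewed_platforms: bool  # defends remotely piloted or autonomous vehicles/vessels
    non_lethal_non_kinetic: bool      # applies only non-lethal, non-kinetic force

def requires_senior_review(ws: WeaponSystem) -> bool:
    """True if the system needs approval by USD(P), USD(R&E)/USD(A&S), and
    the VCJCS before formal development and again before fielding."""
    if not ws.autonomous:
        return False  # 1.2.d.(1): no autonomous mode, so no senior review
    if ws.operator_supervised and ws.targets_materiel_only:
        if ws.local_defense_role:
            return False  # 1.2.d.(2): defense of installations or crewed platforms
        if ws.defends_uncrewed_platforms:
            return False  # 1.2.d.(3): defense of uncrewed vehicles/vessels
    if ws.non_lethal_non_kinetic and ws.targets_materiel_only:
        return False  # 1.2.d.(4), per DoDD 3000.03E
    return True  # everything else falls under 1.2.c
```

Note how the sketch mirrors the commentary above: an autonomous weapon that targets people, or one with no supervised defensive role, falls through every exemption and triggers the review.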
e. International sales or transfers of autonomous and semi-autonomous weapon systems will be approved in accordance with existing technology security and foreign disclosure requirements and processes in accordance with DoDD 5111.21.
f. The design, development, deployment, and use of AI capabilities in autonomous and semi-autonomous weapon systems will be consistent with the DoD AI Ethical Principles and the DoD Responsible Artificial Intelligence Strategy and Implementation Pathway. The DoD AI Ethical Principles, as adopted in the February 21, 2020 Secretary of Defense Memorandum, are:
This section is new and connects the Directive to the other two key pieces of foundational DoD guidance relating to AI and autonomy, the DoD AI Ethical Principles and the DoD’s AI strategy published in 2022. The ethical principles are restated below.
(1) Responsible.
DoD personnel will exercise appropriate levels of judgment and care, while remaining responsible for the development, deployment, and use of AI capabilities.
(2) Equitable.
The DoD will take deliberate steps to minimize unintended bias in AI capabilities.
(3) Traceable.
The DoD’s AI capabilities will be developed and deployed such that relevant personnel possess an appropriate understanding of the technology, development processes, and operational methods applicable to AI capabilities, including with transparent and auditable methodologies, data sources, and design procedures and documentation.
(4) Reliable.
The DoD’s AI capabilities will have explicit, well-defined uses, and the safety, security, and effectiveness of such capabilities will be subject to testing and assurance within those defined uses across their entire life cycles.
(5) Governable.
The DoD will design and engineer AI capabilities to fulfill their intended functions while possessing the ability to detect and avoid unintended consequences, and the ability to disengage or deactivate deployed systems that demonstrate unintended behavior.
SECTION 2: RESPONSIBILITIES
2.1. USD(P).
The USD(P):
a. Provides policy oversight for developing and employing autonomous and semi-autonomous weapon systems.
b. Receives requests for approval of systems submitted in accordance with Paragraph 1.2.c, and in coordination with the USD(A&S) or USD(R&E) and the VCJCS, reviews and considers for approval such systems.
This is new and addresses a practical question the old Directive left open: whom a DoD component should approach to start the review process. The new Directive clarifies that requests for approval should be sent to USD(P).
c. Issues guidance to help implement this directive, and reviews, as necessary, the appropriateness of such guidance given the continual advancement of new technologies and changing warfighter needs.
This is new and clarifies that one of USD(P)’s responsibilities is to issue guidance on implementing the Directive. Given how many of the Directive’s changes relate to clarifying processes, it seems safe to assume there was a demand signal inside the Department for greater guidance on how to actually put the review process into practice. This language makes USD(P) responsible for issuing that guidance.
d. Approves the DoD position on international sales or transfers of autonomous and semi- autonomous weapon systems in accordance with existing technology security and foreign disclosure requirements and processes.
e. Supervises and assigns a chair for the Autonomous Weapon Systems Working Group, provides necessary logistical and administrative support for the working group, approves the charter for the working group, and provides guidance and terms of reference as needed.
This is new, consistent with the creation of the Autonomous Weapon Systems Working Group in the new Directive.
2.2. USD(A&S).
The USD(A&S):
a. In coordination with the USD(P) and the VCJCS, reviews and considers for approval weapon systems submitted before fielding in accordance with Paragraph 1.2.c.
b. Ensures that DoD guidance relating to the Defense Acquisition System includes a requirement to document the determination that an autonomous or semi-autonomous weapon system is intended to be used in a manner that falls within the policies in Paragraphs 1.2.d.(1) through 1.2.d.(4), and therefore does not require senior approval in accordance with this directive. This documentation should occur before formal development and again before fielding, regardless of the acquisition pathway that is applicable to that weapon system.
This is new and addresses another lingering question from the old Directive: how it fit into DoD’s acquisition system. Previously, there was no requirement that a weapon system moving through the acquisition process be evaluated for whether it needs to go through the review outlined in the Directive. Essentially, there was no box to check in the acquisition process. This section directs USD(A&S) to, in essence, create that box, forcing DoD components to ask whether a weapon requires review. This helps ensure that no weapon system requiring review slips through the cracks of the system.
2.3. USD(R&E).
The USD(R&E):
a. Oversees establishment of standards and evaluation metrics for developmental testing, safety certification, and reliability assessment of autonomous and semi-autonomous weapon systems, with particular attention to the risk of unintended engagements or operational interference by unauthorized parties.
b. Oversees establishment of science and technology and research and development priorities for autonomy in weapon systems, including the development of new methods of V&V and T&E and the establishment of minimum thresholds of risk and reliability for the performance of autonomy in weapon systems.
This is new. This will help answer the question of how safe and reliable an autonomous weapon needs to be in order to be approved.
c. Oversees formulation of concrete, testable requirements for all non-AI elements of autonomous and semi-autonomous weapon systems.
d. Collaborates with the Chief Digital and Artificial Intelligence Officer (CDAO) to formulate concrete, testable requirements for implementing the DoD AI Ethical Principles and the DoD Responsible AI Strategy and Implementation Pathway.
This is another addition to the Directive to connect it to other work DoD has done on autonomy and AI since 2012, including the DoD AI Ethical Principles, DoD’s AI strategy, and the new CDAO.
e. Oversees and evaluates the developmental testing of autonomous and semi-autonomous weapon systems to assess the risk of failures.
f. Develops and maintains workforce certification processes, talent management, and curricula to support T&E and V&V of autonomous and semi-autonomous weapon systems by DoD personnel.
This is new and a welcome focus on workforce and talent management. DoD can’t do the T&E and V&V that the Directive calls for without qualified personnel.
g. In coordination with the USD(P) and the VCJCS, reviews and considers for approval weapon systems submitted before entering formal development in accordance with Paragraph 1.2.c.
h. Coordinates with the Director, Operational Test and Evaluation (DOT&E) and the appropriate Secretary of a Military Department or Commander, United States Special Operations Command (USSOCOM) to provide for monitoring to identify and address when changes to the system design or operational environment require additional T&E to provide sufficient confidence that the system will continue to avoid unintended engagements and resist interference by unauthorized parties.
This is new and addresses some of the problems of brittleness that plague AI and autonomous systems. Additional T&E may be necessary to account for changes in the system or its operating environment, even after it is fielded.
2.4. UNDER SECRETARY OF DEFENSE FOR PERSONNEL AND READINESS.
In accordance with DoDD 1322.18, the Under Secretary of Defense for Personnel and Readiness oversees and establishes policy for:
a. Individual military training programs for the Total Force relating to autonomous and semi-autonomous weapon systems.
b. Individual and functional training programs for military personnel and the collective training programs of military units and staffs relating to autonomous and semi-autonomous weapon systems.
2.5. DOT&E.
The DOT&E:
a. Oversees development of realistic operational T&E standards for autonomous and semi-autonomous weapon systems, including requirements for data collection and standards for T&E of any changes to the system following initial operational T&E (IOT&E), in accordance with Paragraph 1.2.a.(1) and Section 3.
b. Evaluates whether autonomous and semi-autonomous weapon systems under DOT&E oversight have met standards for rigorous V&V and T&E in realistic operational conditions, including potential adversary action, to provide sufficient confidence that the probability and consequences of failures have been minimized.
c. Establishes standards for data collection post-fielding and monitoring and assessment by programs.
d. Coordinates with the USD(R&E) and the appropriate Secretary of a Military Department or Commander, USSOCOM to provide for monitoring to identify and address when changes to the system design or operational environment require additional T&E to provide sufficient confidence that the system will continue to avoid unintended engagements and resist interference by unauthorized parties.
These sections are new and, similar to the new language under USD(R&E), cover monitoring systems after fielding to assess their performance and undertake additional T&E, if needed.
e. Reviews and approves operational and live fire test plans for autonomous and semi-autonomous weapon systems for Major Defense Acquisition Programs and programs designated for DOT&E oversight.
2.6. GENERAL COUNSEL OF THE DEPARTMENT OF DEFENSE (GC DOD).
In accordance with DoDD 5000.01, DoDD 2311.01, DoDD 5145.01, and, where applicable, DoDD 3000.03E, the GC DoD provides for guidance on, and coordination of, significant legal issues in autonomy in weapon systems. The GC DoD also coordinates on the review of the legality of weapon systems submitted in accordance with Paragraph 1.2.c.
2.7. ASSISTANT TO THE SECRETARY OF DEFENSE FOR PUBLIC AFFAIRS.
The Assistant to the Secretary of Defense for Public Affairs coordinates on the development of guidance on public affairs matters concerning autonomous and semi-autonomous weapon systems and the use of such guidance and approves final guidance release.
2.8. CDAO.
The CDAO:
The Chief Digital and Artificial Intelligence Officer (CDAO) was created in 2022. This section is new and assigns responsibilities to the CDAO pertaining to autonomy in weapons. This is another area where DoD is connecting the Directive with other AI-related initiatives it has taken since 2012.
a. Monitors and evaluates AI capabilities in and cybersecurity for autonomous and semi-autonomous weapon systems, in accordance with Paragraph 1.2.a.(2)(a) of this directive and DoDI 8500.01, and advises the Secretary of Defense on such matters.
b. Collaborates with the USD(R&E) to formulate concrete, testable requirements for implementing the DoD AI Ethical Principles and the DoD Responsible AI Strategy and Implementation Pathway.
c. Establishes policy and issues guidance on definitions of requirements and testability for AI-enabled systems to implement and demonstrate adherence to the DoD AI Ethical Principles and the DoD Responsible AI Strategy and Implementation Pathway.
d. Issues guidance on T&E practices for AI capabilities in autonomous or semi-autonomous weapon systems.
e. Coordinates with the USD(R&E) and DOT&E on developing and using common tools and infrastructure for T&E and V&V of AI capabilities in autonomous or semi-autonomous weapon systems.
SECTION 3: VERIFICATION AND VALIDATION AND TESTING AND EVALUATION OF AUTONOMOUS AND SEMI-AUTONOMOUS WEAPON SYSTEMS
Regardless of the acquisition pathway or OSD T&E oversight status for a given weapon system, to ensure autonomous and semi-autonomous weapon systems function as anticipated in realistic operational environments against adaptive adversaries and are sufficiently robust to minimize failures:
This is new. DoD has many different acquisition pathways with varying degrees of oversight from OSD. This clause ensures that regardless of how a weapon system is acquired, even if it normally wouldn’t require OSD involvement, the Directive’s V&V and T&E requirements still apply.
a. Systems will go through rigorous hardware and software V&V and realistic system developmental and operational T&E, including analysis of unanticipated emergent behavior.
(1) Hardware and software V&V will include iterative cyber T&E in accordance with DoDI 5000.89, to verify that the weapon system is resilient and survivable in contested cyberspace.
(2) Systems incorporating AI capabilities will go through rigorous developmental and operational T&E to verify and validate that the AI is robust according to design requirements.
b. T&E of systems incorporating AI capabilities will include testing to confirm that their autonomy algorithms can be rapidly reprogrammed on new input data.
These portions are new. DoD has evolved its understanding of T&E for cyber and AI considerably over the last decade.
c. After IOT&E, as directed by the DOT&E, system data will be collected and any further changes to the system will undergo appropriate V&V and T&E to ensure that critical safety features have not been degraded.
(1) System software will be tested using best-available DoD means and methods to validate that critical safety features have not been degraded. Automated testing tools, such as modeling and simulation, will be used whenever feasible. The testing will identify any new operating states and other relevant changes in the autonomous or semi-autonomous weapon system.
The old Directive had specific guidance on methods for testing system software. The new Directive gives DoD flexibility on testing methods, an important change given how fast-moving the technology is.
(2) As directed by the DOT&E:
(a) Each new or revised operating state will undergo appropriate and tailored additional T&E to characterize the system behavior in that new operating state.
(b) Changes to the state transition matrix may require whole system follow-on operational T&E.
d. In coordination with the USD(R&E) and DOT&E, the owning Component will provide for monitoring to identify and address when changes to the system design or operational environment require additional T&E to provide sufficient confidence that the system will continue to avoid unintended engagements and resist interference by unauthorized parties.
SECTION 4: GUIDELINES FOR REVIEW OF CERTAIN AUTONOMOUS WEAPON SYSTEMS
4.1. Autonomous weapon systems intended to be used in a manner that falls outside the policies in Paragraphs 1.2.d.(1) through 1.2.d.(4) must be approved by the USD(P), USD(R&E), and VCJCS before formal development and by the USD(P), USD(A&S), and VCJCS before fielding. If the weapon system in question is to be developed and then fielded by DoD, it will need to undergo both reviews and receive approvals. A review is not needed if the weapon system is covered by a previous approval for formal development or fielding. Requests for senior review and approval should be submitted to USD(P), attention to the Director of the Emerging Capabilities Policy Office.
This is new and answers a question the old Directive left open: whom a DoD component should approach to start the review and approval process.
a. An autonomous weapon system that is a variant of an existing weapon system previously approved through this review will not be covered by previous approval if changes to the system algorithms, intended mission set, intended operational environments, intended target sets, or expected adversarial countermeasures substantially differ from those applicable to the previously approved weapon system so as to fall outside the scope of what was previously approved in the senior review. Such systems will require a new senior review before their formal development and again before fielding.
b. An autonomous weapon system that is a modification of an existing weapon system not previously approved through this review requires the senior review described in Paragraph 1.2.c unless it is intended to be used in a manner that falls within the policies in Paragraphs 1.2.d.(1) through 1.2.d.(4).
c. Before a decision to enter formal development, the USD(P), USD(R&E), and VCJCS will verify that:
(1) The system design incorporates the necessary capabilities to allow commanders and operators to exercise appropriate levels of human judgment over the use of force in the envisioned planning and employment processes for the weapon.
This is new. As is the case with other weapons, how autonomous weapons are used can have a significant impact on their safety and even their lawfulness. The weapon should be considered in the context of an intended use.
(2) The system is designed to complete engagements within a timeframe and geographic area, as well as other applicable environmental and operational parameters, consistent with commander and operator intentions. If unable to do so, the system will terminate engagements or obtain additional operator input before continuing the engagement.
(3) The combination of the system’s design and concept of employment (e.g., its target selection and engagement logic and other relevant processes or measures) accounts for risks to non-targets, consistent with commander and operator intent.
This requirement is new and an important addition. The essence of an autonomous weapon is that it selects and engages targets on its own, and much of the concern about autonomous weapons is the risk that such a weapon attacks something other than its intended target. This clause addresses that risk specifically.
(4) The system design, including system safety, anti-tamper mechanisms, and cybersecurity in accordance with DoDI 8500.01, addresses and minimizes the probability and consequences of failures.
(5) Plans are in place for V&V and T&E to establish system reliability, effectiveness, and suitability under realistic conditions, including possible adversary actions, to a sufficient standard consistent with the potential consequences of an unintended engagement or unauthorized parties interfering with the operation of the system.
(6) For systems incorporating AI capabilities, plans are in place to ensure consistency with the DoD AI Ethical Principles and the DoD Responsible AI Strategy and Implementation Pathway.
(7) A preliminary legal review of the weapon system has been completed in coordination with the GC DoD and in accordance with DoDD 5000.01, DoDD 2311.01 and, where applicable, DoDD 3000.03E.
d. Before fielding, the USD(P), USD(A&S), and VCJCS will verify that:
(1) System capabilities, human-machine interfaces, doctrine, TTPs, and training have been demonstrated to allow commanders and operators to exercise appropriate levels of human judgment over the use of force and to employ systems with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and ROE that are applicable or reasonably expected to be applicable.
(2) System safety, anti-tamper mechanisms, cyber survivability, operational resilience, and cybersecurity capabilities have been implemented in accordance with DoDI 5000.83, the Joint Capabilities Integration and Development System Manual, and DoDI 8500.01 to minimize the probability and consequences of failures. A monitoring regime is in place to identify and address changes in operational environment, data inputs, and use that could contribute to such failures.
This is new and yet another area where the Directive adds post-fielding requirements to monitor for changes that could cause failures.
(3) V&V and T&E:
(a) Assess system performance, capability, reliability, effectiveness, and suitability under realistic conditions, including possible adversary actions, consistent with the potential consequences of unintended engagement or unauthorized parties interfering with the operation of the system.
(b) Have demonstrated that the system can be reprogrammed with sufficient rapidity to enable timely correction of any unintended system behaviors that may be observed or discovered during future system operations.
This is new and is another area where DoD requires that the system have the flexibility to be upgraded to account for changes or new failure modes uncovered during operations.
(4) Adequate training, TTPs, and doctrine are available, periodically reviewed, and used by system operators and commanders to understand the functioning, capabilities, and limitations of the system’s autonomy in realistic operational conditions.
(5) System design and human-machine interfaces are readily understandable to trained operators, provide transparent feedback on system status, and provide clear procedures for trained operators to activate and deactivate system functions.
(6) For systems incorporating AI capabilities, the deployment and use of the AI capabilities in the weapon system will be consistent with the DoD AI Ethical Principles and the DoD Responsible AI Strategy and Implementation Pathway.
(7) A legal review of the weapon system has been completed, in coordination with the GC DoD, and in accordance with DoDD 5000.01, DoDD 2311.01, and, where applicable, DoDD 3000.03E.
4.2. In cases of urgent military need, the USD(P), USD(A&S), USD(R&E), or VCJCS may request a Deputy Secretary of Defense waiver of the requirements in this section and Paragraph 1.2.c.
The new Directive retains this section, which effectively acts as an escape clause from the approval process entirely. If there is an urgent need, the Deputy Secretary of Defense can waive the requirement for a review. This clause existed in the old Directive, although there is a subtle change in the new one.
The old Directive stated that “USD(P), USD(AT&L), and CJCS” may request a waiver. The new Directive replaces “and” with “or” and pushes the level of authority from CJCS down to VCJCS. In theory, this means that any one of the approving authorities can ask the Deputy Secretary for the waiver. They don’t all need to agree.
4.3. Figure 1 illustrates the senior review process and can help determine whether a weapon system needs to undergo senior review.
This flowchart is new. It’s very helpful!
SECTION 5: AUTONOMOUS WEAPON SYSTEM WORKING GROUP
5.1. GENERAL.
The Autonomous Weapon System Working Group will:
a. Support the USD(P), the USD(R&E), and the VCJCS in considering the full range of relevant DoD interests during the review of autonomous weapon systems before formal development.
b. Support the USD(P), the USD(A&S), and the VCJCS in considering the full range of relevant DoD interests during the review of autonomous weapon systems before fielding.
The working group is not a decision-making body, but is tasked with supporting the decision-makers during the review process.
c. When requested by appropriate representatives of the Secretaries of the Military Departments; the Commander, USSOCOM; or, when applicable, a Director of a Defense Agency or a DoD Field Activity:
This is a list of the entities in DoD that might develop a weapon system subject to review.
(1) Advise whether a given weapon system requires senior-level approval in accordance with this directive.
One of the problems with the old Directive is that it outlined a review process but wasn't clear about how to start that process, or even whom to ask whether a weapon system needed to go through it at all. The new Directive establishes an Autonomous Weapon System Working Group that can advise other parts of the DoD on whether their weapon requires review and, if so, how to move it through the process.
(2) Help identify and advise on addressing potential issues presented by a given weapon system during a potential senior-level review in accordance with this directive.
5.2. MEMBERSHIP.
In addition to representatives of the USD(P), the Autonomous Weapon System Working Group will consist of representatives of each of the following officials. All members of the working group will be full-time Federal Government employees, permanent part-time Federal Government employees, or Service members on active duty. The parent organizations for the representatives will be responsible for any expenses, to include travel-related expenses, associated with participation in the working group:
a. USD(A&S).
b. USD(R&E).
c. GC DoD.
d. CDAO.
e. DOT&E.
f. CJCS representatives from:
(1) Director for Strategy, Plans and Policy (Joint Staff J5).
(2) Director, Command, Control, Communications and Computers/Cyber, Chief Information Officer (Joint Staff J6).
(3) Director for Force Structure, Resources and Assessment (Joint Staff J8).
(4) Legal Counsel to the Chairman of the Joint Chiefs of Staff.
The working group membership includes a wider set of organizations in DoD than those actually tasked with deciding the review. This is fairly typical for internal DoD processes to ensure that all relevant stakeholders have a voice in the decision.
Glossary
Autonomous Weapon System
A weapon system that, once activated, can select and engage targets without further intervention by an operator. This includes, but is not limited to, operator-supervised autonomous weapon systems that are designed to allow operators to override operation of the weapon system, but can select and engage targets without further operator input after activation.
The core of the definition of an autonomous weapon has *not* changed. Consistent with other parts of the Directive, “human” has been replaced with “operator” in this definition as well.
Failure
An actual or perceived degradation or loss of intended functionality or inability of the system to perform as intended or designed. Failure can result from a number of causes, including, but not limited to, human error, faulty human-machine interaction, malfunctions, communications degradation, software coding errors, enemy cyber-attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, or unanticipated situations on the battlefield. For the purposes of this issuance, minimizing the probability and consequences of failure means reducing the probability and consequences of unintended engagements to acceptable levels while meeting mission objectives and does not mean achieving the lowest possible level of risk by never engaging targets.
This is new. It specifies that “minimizing” failures means reducing them to “acceptable levels” while still accomplishing the mission. Zero failures is not expected.
Fielding
Making a weapon system available for, or placing it into, operational use (rather than testing, exercises, or experiments), regardless of the acquisition approach employed for the weapon system, including major defense acquisition programs, middle tier acquisitions, or prototyping efforts such as joint concept technology demonstrations.
“Fielding” was not defined in the old Directive. Consistent with other parts of the Directive, this clarifies that requirements relating to “fielding” apply regardless of the acquisition approach used.
Formal Development
Begins at “Milestone B,” as described in Paragraph 3.10 of DoDI 5000.85, in the case of major defense acquisition programs. For cases other than major defense acquisition programs, begins after the preliminary design review that correlates with the end of the technology maturation and risk reduction phase.
“Formal development” also was not defined in the old Directive. This is another revision that adds clarity to the review process.
Operator-Supervised Autonomous Weapon System
An autonomous weapon system that is designed to provide operators with the ability to intervene and terminate engagements, including in the event of a weapon system failure, before unacceptable levels of damage occur.
Other than replacing “human” with “operator”, this definition has *not* changed.
Operator
A person who operates a platform or weapon system.
Here “operator” is defined as “a person”. (No need to worry about bots controlling bots!)
Semi-Autonomous Weapon System
A weapon system that, once activated, is intended to only engage individual targets or specific target groups that have been selected by an operator. This includes:
• Weapon systems that employ autonomy for engagement-related functions including, but not limited to, acquiring, tracking, and identifying potential targets; cuing potential targets to operators; prioritizing selected targets; timing of when to fire; or providing terminal guidance to home in on selected targets, provided that operator control is retained over the decision to select individual targets and specific target groups for engagement.
• “Fire and forget” or lock-on-after-launch homing munitions that rely on TTPs to maximize the probability that the only targets within the seeker’s acquisition basket when the seeker activates are those individual targets or specific target groups that have been selected by an operator.
The substance of this definition has *not* changed.
Specific Target Group
A discrete group of potential targets, such as a particular flight of enemy aircraft, a particular formation of enemy tanks, or a particular flotilla of enemy vessels. A general class of targets or a specific type of target, such as a particular model of tank or aircraft, does not constitute a specific target group.
This definition is new. The term “specific target groups” comes up in the definition of a semi-autonomous weapon and has been a source of considerable confusion. This new definition is intended to clarify the term and, by extension, how much autonomy a weapon can have and still be classified as a semi-autonomous weapon system. This is, in my opinion, the most significant clarification in the Directive, since it gets right to the core of the question, “what is an autonomous weapon?”
In January, the U.S. Defense Department (DoD) released an updated version of its policy on autonomous weapons, DoD Directive 3000.09, Autonomy in Weapon Systems. This was the first major policy update since 2012.
In this CNAS Noteworthy, Vice President and Director of Studies Paul Scharre breaks down the new Directive and what it means for the U.S. military’s approach to lethal autonomous weapon systems. Dr. Scharre led the DoD working group that drafted the original DoD Directive 3000.09 in 2012.
Special thanks to CNAS Project Assistant Noah Greene for providing valuable research support.