In New Paper, Professor Ashley Deeks Examines Risks of Machines Initiating War
War could unfold so rapidly in the future that nations may need to rely on machines and artificial intelligence to make split-second decisions to repel or carry out attacks.
In her paper, Deeks urges Congress to create laws ensuring strong oversight to minimize risks associated with the use of AI in the resort to force. Both in the U.S. and abroad, governments should bear in mind the potential challenges from using AI systems in initiating force, she says.
Deeks is a senior fellow with UVA’s Miller Center and directs the Law School’s National Security Law Center. In a Q&A, she answers questions about the realities of delegating war initiation to machines and the ethical and policy implications surrounding such decisions.
What inspired you to explore the topic of delegating war initiation to machines?
I recently wrote a longer article that examined national security delegations by the president, which often are classified but which can raise constitutional concerns. For example, President Eisenhower delegated the authority to launch nuclear weapons to seven military officials, and more recently presidents have delegated to the military the authority to do things such as launch offensive cyber operations without presidential approval.
These delegations raise interesting legal questions: Are they constitutional? May Congress limit them? If not, can it at least mandate that the president report these delegations to Congress? The delegations may raise concerns because they can dilute civilian command over the use of military force; lead to situations in which the president’s agent does not act in a way that reflects the president’s intent; and obscure the actual decisionmaker’s identity in a particular case.
At the end of that article, I briefly consider whether using autonomous systems to conduct self-defense actions may actually be a type of national security delegation — only to a machine, rather than to a military official. This shorter article picks up and explores that idea. For example, a state might decide to allow its nuclear command and control system to make autonomous judgments about when to launch a nuclear weapon in response to a perceived imminent attack. Or a state could allow its cyber systems to respond autonomously to certain attacks on its military installations, setting off pre-placed implants that cause physical damage to the attacker. Any state that considers introducing significant autonomy into systems like this needs to assess whether and how the use of autonomy in war initiation would comport with its domestic laws governing delegations of decisions to use force.
Can you explain what “hyperwar” means and why it might necessitate the use of autonomous systems?
Delegating War Initiation to Machines
Virginia Public Law and Legal Theory Research Paper No. 2024-60
78 Australian Journal of International Affairs 148 (forthcoming 2024)
10 Pages Posted: 11 Sep 2024
Date Written: February 26, 2024
Abstract
The use of autonomy to initiate force, which states may begin to view as necessary to protect against hypersonic attacks and other forms of “hyperwar,” may effectively constitute a delegation of war-initiation decision making to a machine. Yet legal questions about whether and when the leader of a country may delegate their decision making to others – and normative questions about whether they should do so – can be complicated. Any state that intends to introduce significant autonomy into such systems should assess whether and how the use of autonomy in war initiation comports with its domestic laws and norms that govern the delegation of the use of force.