Descent Into the Matrix: The Death of Law, Accountability, and Morality in Warfare

author Steven W. Becker
journal RIDP (ISSN: 0223-5404)
volume 2025
issue International perspectives on AI: challenges for judicial cooperation and international humanitarian and criminal law
section Special Report
date of publication Jan. 28, 2026
language English
page 13
abstract

The author argues for a complete ban on the use of fully autonomous weapon systems in warfare. The newest AI models have shown a proclivity to lie, deceive, and even blackmail in order to preserve themselves when faced with removal or replacement. In light of this behaviour, there is no way to guarantee that such lethal weapon systems will remain neutral toward their human creators, not least because of the unknowns of deep reinforcement learning. Given these uncertainties, scholars have proposed establishing a new legal regime to address the criminal conduct of such lethal weapon systems during warfare. Harkening back to the earlier era of state responsibility, commentators suggest that war torts could help to fill the gap, provide compensation to victims, and incentivise governments and contractors to curtail their use and manufacture of fully autonomous weapon systems. Toward this end, an International Court of Claims could be established to adjudicate these war tort claims. The author also proposes that any such enabling legislation include a ‘citizen suit’ provision, similar to those contained in environmental statutes, to give victims and their families legal standing to sue for damages.