ISBN: 978-0-69126-514-8

Amer Alnajar

Kennesaw State University


King's well-sourced book draws on 126 interlocutors across defense, government, think tanks, and the tech industry in the US, UK, and Israel. His thesis, that AI augments rather than automates warfare and that the consequential transformation is organizational, provides a corrective to the speculative futurism dominating this field. The concept of a ‘military-tech complex’, updating Eisenhower's framework for an era in which private technology firms sit inside operational kill chains, is a genuine contribution.

Yet King's own evidence from Gaza exposes a tension his argument does not resolve: the Gospel system increased targeting from 50–100 targets a year to 100 a day, and Lavender escalated this to 428 daily targets, with officers given just 20 seconds to confirm whether a target was male before authorizing strikes (pp. 125–127). At that volume and speed, the human role narrows to ratifying what the algorithm has already selected. One would expect automation bias, a well-documented phenomenon in human-factors research, to receive sustained treatment in a book that so meticulously documents its operation, but King mentions it once (p. 8) and never applies it to his own cases.

The book also barely touches on international humanitarian law, accountability, or governance proposals, reporting sobering civilian casualty figures yet leaving the normative work to others. King contends that even when commanders defer to AI, ‘that is a collective human choice’ (p. 152), and the sociological insight is genuine. But choosing to defer and exercising meaningful oversight are not the same thing, and whether that distinction carries normative weight before algorithmic tempo renders it academic is the question this book opens but does not answer.