Science, Technology and Industry (Spotlight Brief 1/21)

Clausewitzian friction and autonomous weapon systems

Source: Comparative Strategy – Jan 21

Friction is a constant companion to the human activity of war. Much effort goes into reducing this friction, with autonomous systems and weapons among the latest proposed solutions. It is a tempting argument: an autonomous system can be free of human frailty and error, pursuing a single mission with relentless mechanical application. As with all solutions, though, these systems introduce their own sources of friction – including some that may unintentionally contribute to escalation or uncertainty. Compounding this, the simplest mitigation – retaining a human in or on the loop – undermines a force's capability, especially against a threat that itself employs AI. New capabilities almost inevitably introduce new complexities; it follows that AI and autonomous weapons may not be the 'fog-clearing' panacea many claim.

Further reading:

‘Illiteracy, Not Morality, Is Holding Back Military Integration of Artificial Intelligence’, National Interest, 15 Feb 21

‘Artificial intelligence and ML in workplace: major trends for 2021’, Analytics Insight, 22 Jan 21

‘The Four Mistakes That Kill Artificial Intelligence Projects’, Forbes, 16 Dec 20

‘War machines: can AI for war be ethical?’, The Cove, 14 Dec 20

‘Will Artificial Intelligence Ever Live Up to Its Hype?’, Scientific American, 04 Dec 20

The views expressed in this article and subsequent comments are those of the author(s) and do not necessarily reflect the official policy or position of the Australian Army, the Department of Defence or the Australian Government.