Strategy (Spotlight Brief 6/21)
The content in this article is an extract from Spotlight Brief 6/21.
Cyber Threats and Vulnerabilities to Conventional and Strategic Deterrence
Joint Force Quarterly – Jul 2021
Information technology has contributed to a transformation in military capabilities and power. An information technology counter-revolution is now underway, whereby digitally dependent militaries have become increasingly vulnerable to new types of threats. Mark Montgomery and Erica Borghard break down how adversaries of the United States are attempting to leverage these vulnerabilities to gain strategic advantage. The United States’ military modernisation policy has produced highly technologically advanced capabilities and, with them, a large attack surface in cyberspace ripe for exploitation. The computerised and networked nature of weapon systems has multiplied the access points for cyber intrusions and attacks that could hold key infrastructure at risk and compromise mission assurance. Montgomery and Borghard identify four categories of vulnerability: technical vulnerabilities in fielded weapon systems, technical vulnerabilities across networked platforms, supply chain vulnerabilities, and non-technical vulnerabilities stemming from information operations. They conclude by unpacking several specific measures to address the most pressing of these concerns.
Related:
‘The Information Technology Counter-Revolution: Cheap, Disposable, and Decentralized,’ War on the Rocks, 19 Jul 21
‘The Challenge of Educating the Military on Cyber Strategy,’ War on the Rocks, 25 Jun 21
‘Hacked Drones and Busted Logistics are the Cyber Future of Warfare,’ Brookings, 04 Jun 21
‘Data Poisoning: When Attackers Turn AI and ML against You,’ Security Intelligence, 21 Apr 21
‘Extending Human Performance through Technology: The Promise of JADC2,’ Air Force, 09 Dec 20
Autonomous Weapons Systems and the Contextual Nature of Hors de Combat Status
Information – May 2021
Who is and who is not a combatant is becoming ever harder to define, thanks to the longer range and greater variety of weaponry in service. In this article, Umbrello and Wood seek to reconcile Autonomous Weapon Systems with the concept of hors de combat in international law. A substantial portion of their article is devoted to establishing precisely who should be considered hors de combat (out of combat). They argue for a broad (and somewhat counterintuitive) understanding of hors de combat, whereby any combatant who cannot harm their adversary falls into this category. Umbrello and Wood acknowledge that such an interpretation requires a nuanced, contextualised, case-by-case evaluation. They contend that current Autonomous Weapon Systems lack the technical sophistication to make such fine-grained evaluations, so any attempt to remove humans from the decision-making process risks violating international law. Umbrello and Wood also highlight that as Autonomous Weapon Systems become more capable, they will increasingly be impervious to attempts to harm them. If their relational understanding of hors de combat is accepted, most combatants facing Autonomous Weapon Systems would in future be deemed hors de combat, raising a complex ethical concern.
Related:
‘Down Is Not Always Out: Hors De Combat in the Close Fight,’ Articles of War, 08 Jul 21
‘Artificial Intelligence and Automated Systems Legal Update 1Q21,’ Gibson Dunn, 23 Apr 21
‘Down Is Not Always Out: An Infantry Leader’s Guide to Persons Hors De Combat Under the Law of War,’ Infantry, 01 Mar 21
‘Drones and War: The Impact of Advancement in Military Technology on Just War Theory and the International Law of Armed Conflict,’ Ethics and International Affairs, Sep 20
Clausewitzian Friction and Autonomous Weapon Systems
Comparative Strategy – Jan 2021
Gardner investigates the impact that Autonomous Weapon Systems will have on friction in war. Friction is one of Clausewitz’s most intuitively accessible concepts, encompassing all of the factors that distinguish war in practice from war in theory. The presence of humans in war necessarily generates a great deal of friction. Humans suffer from diverse physical and cognitive limitations: the human brain has finite processing capacity and must often grapple with a shortage of accurate and relevant information. The magnitude and effects of these frailties increase under the intense stresses, pressures, emotions, fatigue, and responsibilities of combat. Gardner notes that the advent of Autonomous Weapon Systems will decrease the role of humans in war, and considers whether this will lead to a reduction in friction. He ultimately concludes that Autonomous Weapon Systems will alleviate some aspects of friction but amplify others. Friction will persist due to the limitations of artificial intelligence in recognising objects and understanding context. Focusing on context, Gardner cautions that Autonomous Weapon Systems could drastically magnify the risk of inadvertent escalation in conflict because of their inability to appreciate the (potentially disastrous) consequences of their actions. The article concludes that friction is an enduring structural feature of war, persisting due to the limitations of humans, the shortcomings of machines, and the interaction between the two.
Related:
‘Clausewitz and Centres of Gravity: Turning the Esoteric into Practical Outcomes,’ Grounded Curiosity, 13 Jun 21
‘Clausewitz and the Strategic Deficit,’ Wavell Room, 21 May 21
‘The Role of Bias in Artificial Intelligence,’ Forbes, 04 Feb 21
‘Strategy, War, and the Relevance of Carl von Clausewitz,’ Military Strategy Magazine, 01 Dec 20
‘Can We Make Our Robots Less Biased Than We Are?’ The New York Times, 22 Nov 20
‘Catalytic Nuclear War’ in the Age of Artificial Intelligence & Autonomy: Emerging Military Technology and Escalation Risk between Nuclear-Armed States
Journal of Strategic Studies – Jan 2021
Johnson argues that the possibility of catalytic nuclear war has increased due to advances in artificial intelligence and Autonomous Weapon Systems. Nuclear war is catalytic when it is set in motion by the deliberate action of a third party, which deceptively induces nuclear war between states to gain some form of perceived advantage. Johnson posits several ways a non-state actor could use or exploit artificial intelligence or Autonomous Weapon Systems to generate escalation pathways of reaction and retaliation that end in a nuclear exchange. Technology augmented with artificial intelligence could facilitate digital jamming, spoofing, malware attacks, data pollution, deepfake propaganda, false alarms, the reverse engineering of algorithms, and the spread of malevolent disinformation. Johnson employs fictional cases to demonstrate how these technologies could be harnessed to manipulate governments into resorting to nuclear weapons. The final section of the article recommends specific measures to reduce the risk of catalytic nuclear escalation, including enhanced training, consulting independent sources to verify and corroborate threat assessments, collective monitoring of events, bilateral and multilateral data exchanges, and ensuring that humans retain a fundamental role in any decision to deploy nuclear weapons.
Related:
‘Talks for New US-Russia Arms Deal to Stir Up Old Bugaboo,’ The Middletown Press, 27 Jul 21
‘The Future of War and Deterrence in an Age of Autonomous Weapons,’ TRT World, 17 Jun 21
‘Renewed US-Russia Nuke Pact Won’t Fix Emerging Arms Threats,’ Independent, 28 Jan 21
‘Russia’s Impact on US National Interests: Preventing Nuclear War and Proliferation,’ Russia Matters, 21 Jan 21
‘U.S. Nuclear Weapons Agency Hacked by Suspected Russians,’ Fortune, 18 Dec 20
National Power After AI
Center for Security and Emerging Technology – Jul 2021
An enormous amount of work is currently devoted to determining what artificial intelligence can do for nations and what its effects will be across a variety of fields. Every aspect of national strategy will change: the economy, defence, education and health, among others. Nevertheless, little consideration has been given to what comes next, to what happens after AI has been rolled out. Here, CSET attempts to answer that question through the lens of international competition. Comparing the effect to the environmental change of an ice age, the report stresses that we must do better at scoping possible outcomes to best position the nation. Simply using AI to improve existing processes may not work; AI may fundamentally change what a nation wants as much as what it can do. The report also highlights that AI offers different nations different things, meaning that a deep understanding of the aims and interests of allies and partners is all the more important.
Related:
‘Rethink “Power” in a New Era of “Great Power Competition”,’ National Interest, 28 Aug 21
‘Big Tech’s Stranglehold on Artificial Intelligence Must Be Regulated,’ Foreign Policy, 11 Aug 21
‘The Promise and Perils of Artificial Intelligence Partnerships,’ Observer Research Foundation, 23 Jun 21
‘Techtonic 2.0: National Artificial Intelligence Summit,’ Department of Industry, Science, Energy and Resources, 18 Jun 21
‘Artificial Intelligence and National Security,’ Institute on Global Conflict and Cooperation, 16 Apr 21
The views expressed in this article and subsequent comments are those of the author(s) and do not necessarily reflect the official policy or position of the Australian Army, the Department of Defence or the Australian Government.