
Upgrading the Army’s Fires Lethality

DOI: doi.org/10.61451/2675095

How the Australian Army Can Harness the Firepower Advantages of the Fourth Industrial Revolution

Introduction

Since 2011, the world has undergone a Fourth Industrial Revolution (4IR), which has heralded worldwide advances in artificial intelligence (AI), automation, and robotic technologies.[1] These disruptive technologies are gradually altering the character of warfare towards what AI entrepreneur Amir Husain describes as ‘hyper war’, where battles are waged entirely at machine speed.[2] The question of how 4IR technologies can advance indirect fires and targeting capabilities within the Australian Army is an important facet of this innovation. The incorporation of 4IR will have the greatest significance for two aspects of Army capability. Firstly, autonomous weapons can help offset the Army’s indirect firepower deficiencies as it readies itself for large-scale combat operations (LSCO). Secondly, AI can enhance the Army’s targeting capabilities by providing improved situational awareness, kill-chain responsiveness, and strike integration. Harnessing the potential of these advancements will, however, require Australia to resist the temptation to view 4IR as a panacea for all of the military challenges it may face in the 21st century. It will also require significant effort by the Army to effectively integrate such technologies into its inventory, including the implementation of appropriate control measures.

This article is divided into three sections. The first section describes how 4IR technologies can help bridge the Army’s artillery firepower deficiencies as it readies for LSCO. The second section describes the aspects of targeting that can be significantly enhanced through the integration of 4IR programs. The final section discusses the political, ethical and procedural challenges currently impeding the integration of 4IR technologies.

The Australian Army’s Firepower Deficiency

As outlined in the 2023 Defence Strategic Review (DSR), the Australian Army once again faces a heightened probability of high-intensity conflict.[3] Should this risk be realised, it is likely that the Army will need to deliver greater concentrations of indirect fire than it can currently generate. Currently, the Army’s organic indirect fire support capability is based on 36 towed howitzers and a smattering of 81 mm mortar platoons. In LSCO terms, this represents a modest firepower complement. For comparison, the 8th Australian Division coordinated approximately 70 howitzers of various types during the battle for Singapore in 1942.[4] Similarly, the most effective Australian brigades in the Pacific battles of World War II were supported by two artillery regiments, with a third in reserve.[5] Even during a counter-insurgency campaign, the company defence at Long Tan involved over 30 guns firing a total of 3,400 rounds in 24 hours.[6] A heavy reliance on close air support during the Army’s more recent combat operations has diminished the perceived importance of artillery to the Australian Defence Force (ADF). However, should LSCO occur, the availability of air support will be uncertain, and the demand for the Army’s limited indirect fire support will likely escalate. Compounding this limitation, it is likely that materiel support from Australia’s allies would be slow to arrive should conflict occur concurrently in the Indo-Pacific region. This reality was borne out during World War II, when Australia struggled to receive required armaments due to the higher priority placed by its allies on the demands of the European theatre of operations.[7] Although the Ukraine conflict has revived Western arms production, the expansion of the West’s military industrial output will still take several years to reach the volumes necessary to sustain LSCO.[8] Therefore, the stark disparity between the Army’s current on-hand artillery and that which it has historically fielded during conflict represents a deficiency that will restrict Australia’s capacity to conduct contemporary LSCO. To bridge this gap, the Army should seek to develop options for fire support augmentation that are feasible in light of ongoing recruitment, retention, industrial and fiscal challenges.

Loitering Autonomous Weapons

To upscale its firepower quickly and efficiently, the Australian Army has the option to acquire lethal autonomous weapons (LAWs), commonly known as drones. The combination of expendability, affordability and availability of LAWs makes them a viable option to address deficiencies in the Army’s artillery delivery systems and ammunition. For one thing, LAWs can be fielded rapidly and inexpensively, making them suitable for high battlefield attrition. For example, since 2022 the Australian company SYPAQ has been supplying Ukraine with 100 Corvo Precision Payload Delivery System drones per month, at a cost of US$3,500 per aircraft.[9] Another Australian company, DefendTex, produces the D40 ‘low cost’ loitering munition, which has a range of 20 kilometres and carries a 40 mm grenade warhead with enough yield to render a howitzer inoperable with a direct hit.[10] The low cost of drones can be contrasted with the relatively high price of artillery ammunition. Specifically, the Army’s Project LAND 17 Phase 1C.2 contract acquired a mere 2,504 rounds for US$148 million.[11] This averages approximately US$59,000 per round, many times the cost of one loitering drone. Further, to obtain near-precision accuracy, each artillery round requires a precision guidance kit that costs around US$20,000. While LAWs like the D40 carry a smaller explosive payload than artillery shells, their point target accuracy is far higher than the unguided effects achievable by conventional artillery munitions. Additionally, for heavily defended targets, LAWs can conduct saturation attacks in which several drones simultaneously strike a target to overwhelm its defences, as demonstrated by Iran’s drone strikes against Saudi Arabia’s Patriot-defended Abqaiq oil refinery in 2019.[12]
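
To make the cost comparison concrete, the short calculation below reproduces the arithmetic using only the figures cited above: US$148 million for 2,504 rounds, roughly US$20,000 per precision guidance kit, and US$3,500 per Corvo drone. It is a back-of-envelope sketch rather than a costing model, and the variable names are illustrative only.

```python
# Back-of-envelope comparison using only the figures cited in the text (US$).
CONTRACT_COST = 148_000_000      # LAND 17 Phase 1C.2 ammunition contract
ROUNDS_ACQUIRED = 2_504          # rounds acquired under that contract
PGK_COST = 20_000                # precision guidance kit, per round
DRONE_COST = 3_500               # Corvo PPDS loitering drone, per aircraft

cost_per_round = CONTRACT_COST / ROUNDS_ACQUIRED
cost_per_guided_round = cost_per_round + PGK_COST

print(f"Average cost per round:             ${cost_per_round:,.0f}")
print(f"Cost per precision-guided round:    ${cost_per_guided_round:,.0f}")
print(f"Drones affordable per guided round: {cost_per_guided_round / DRONE_COST:.0f}")
```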

Given the many advantages of LAWs, their integration into the Australian Army should focus on two key areas: firstly as a platoon-level fires supplement, and secondly to support deep shaping fires. In the first role, LAWs could provide manoeuvre forces with an additional precision fires asset that is low cost and readily available. For example, the D40 loitering drone could be carried by infantry and armoured personnel to strike targets that would normally only be within range of mortars or howitzers. This capability would reduce the demand for artillery and mortar fire, thereby improving artillery survivability and decreasing ammunition consumption. The United States Marine Corps’ current experimentation with Switchblade loitering munitions at the platoon level speaks to the potential of this technology.[13] Furthermore, should the Australian Army be committed to amphibious combat in the future, it will likely face significant logistical challenges in deploying heavy fire support assets like tanks and howitzers. To address this challenge, the integration of loitering munitions within manoeuvre teams would provide an immediate fire support option when howitzers, tanks and mortars are unavailable.

Loitering munitions also have the potential to address the Army’s inability to conduct deep fires with its own artillery. The term deep fires refers to effects delivered more than 30 kilometres beyond the forward line of own troops. Due to the limited range of its howitzers, the Army has not typically trained for deep shaping operations using its own artillery. Instead, it has relied on the Royal Australian Air Force and the Royal Australian Navy. It is fair to assume, however, that on operations a conventional adversary would have the capacity to deny Australian air and naval forces the opportunity to shape the land battle. It is reasonable to predict that in this situation the Army would be unable to support the close fight while simultaneously providing shaping and interdiction fires. The Army’s forthcoming acquisition of the High Mobility Artillery Rocket System (HIMARS) will give it the capacity to achieve some level of deep shaping; however, the capability is unlikely to fully address its tactical requirements. This is because the DSR has flagged that HIMARS will be primarily focused on strategic deterrence-by-denial tasks.[14] The result is a firepower gap at the divisional level. This gap is made more challenging by Australia’s reliance on foreign manufacturers for artillery components. Acquisition of such equipment would inevitably become vulnerable to supply chain disruption if tensions were to escalate.[15]

The opportunity to use LAWs to achieve tactical deep shaping effects presents efficiencies to the Army as it allows the small fleet of howitzers to be concentrated on the tactical close fight, and HIMARS to be focused on strategic deterrence. Drones such as Israel’s Harpy and Germany’s HERO boast endurances spanning hours and ranges nearing 100 kilometres, making them an ideal deep shaping instrument within the decentralised littorals of the Pacific.[16] Azerbaijan’s use of LAWs in 2020 to systematically destroy Armenia’s air defence network within 48 hours offers a striking example of the deep shaping potential of LAWs.[17] Moreover, the ability to pre-program and evasively manoeuvre groups of LAWs enables large areas of the battlefield to be held at risk, aiding the divisional screen and covering force battles. Finally, swarms of small, inexpensive drones are more difficult to target than expensive air defence systems, making LAWs effective in contested airspaces.

Automated Artillery Systems

The development of automated artillery systems, such as Sweden’s Archer and Germany’s Remote Controlled Howitzer (RCH) platforms, represents another important 4IR innovation.[18] These systems are distinguishable from their fully crewed counterparts in that they leverage robotisation and automation to undertake functions normally performed by humans, such as loading, laying and firing. Such systems offer several benefits to the Army, as has been clearly demonstrated in high counter-battery threat environments such as Ukraine.[19] In such settings, artillery is at greatest risk when it is firing and when it is moving into hides.[20] Automated artillery lowers the casualty risk by reducing the number of personnel exposed to counter-battery fire. Automated machines are also unaffected by human limitations such as hunger, fatigue and loss of morale. Further, they can continue to function at times when human crews may be suppressed by enemy fire.[21] Given these characteristics, automated systems are well placed to complement a larger fleet of crewed platforms because they can deliberately draw out enemy counterfires and sensors without risking casualties among friendly troops.

While the promise of casualty mitigation holds considerable appeal, automated systems do have their limitations. For one, they are complex and therefore likely to be more expensive than crewed weapons. Furthermore, while automated systems have the potential to lower the danger to gun crews, the risk would likely be redirected to the larger teams of maintainers needed to support the weapons.[22] Reducing this risk would depend on the establishment of hides where maintenance and resupply could be conducted in relative security. Notwithstanding these caveats, automated artillery technology is worth serious consideration as designs mature and reliability improves.

Targeting

While 4IR technology can improve fire support hardware, it can also greatly improve how militaries prioritise, locate and engage targets according to their military worth. This process, known as targeting,[23] is a joint function performed by teams of highly skilled multidisciplinary specialists using a variety of technical systems. The Army’s recent formation of 10 Brigade and its acquisition of a HIMARS-based long-range strike capability will see it become increasingly involved in targeting.[24] As the Army develops the skill set to conduct this function, it must also consider how 4IR technologies can give its targeting capability a competitive edge. There are presently three primary areas where the Army could leverage 4IR technologies to enhance its support to joint targeting: situational awareness, kill-chain responsiveness, and strike coordination.

Situational Awareness. One of the greatest challenges for personnel involved in the targeting cycle is to maintain the situational awareness necessary to detect targets. Historically this has been achieved through electronic, acoustic, seismic and visual detection methods, the fidelity of which is inevitably limited by range and meteorological conditions.[25] 4IR technologies have revolutionised wide-area surveillance by transcending traditional sensor limitations. For instance, the US has created AI-generated maps that instantaneously track environmental events, such as bushfires and climate shifts, on a global scale.[26] Similarly, the Ukrainian Armed Forces are spearheading the use of autonomous software that simultaneously fuses feeds from drones, social media and intelligence sources into a single multi-layered picture of the battlefield.[27] Additionally, within the next 12 months the US will transition its ground moving target indicator capability from aircraft to AI-enabled satellites, a development that will significantly improve how ground forces can be detected and tracked around the world.[28] These examples demonstrate how 4IR technologies will transform battlefield situational awareness, greatly aiding decision superiority.[29]
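
A highly simplified sketch of the fusion concept described above is set out below: single-source reports are weighted by an assumed level of trust in each source type and combined into one picture per grid square. The source types, weights and detection records are invented for illustration and do not represent any fielded system.

```python
from dataclasses import dataclass
from collections import defaultdict

# Illustrative trust weights per source type; these values are assumptions for the sketch.
SOURCE_WEIGHT = {"drone_video": 0.9, "sigint": 0.8, "social_media": 0.4}

@dataclass
class Detection:
    source: str        # e.g. "drone_video"
    grid: str          # map grid square the report refers to
    target_type: str   # e.g. "artillery", "armour"
    confidence: float  # 0.0-1.0 as reported by the source

def fuse(detections: list[Detection]) -> dict:
    """Combine single-source reports into a weighted picture per grid square."""
    picture = defaultdict(float)
    for d in detections:
        weight = SOURCE_WEIGHT.get(d.source, 0.2)   # unrecognised sources get low trust
        picture[(d.grid, d.target_type)] += weight * d.confidence
    return dict(picture)

reports = [
    Detection("drone_video", "GR 1234 5678", "artillery", 0.8),
    Detection("social_media", "GR 1234 5678", "artillery", 0.6),
    Detection("sigint", "GR 2233 4455", "armour", 0.7),
]
for (grid, kind), score in sorted(fuse(reports).items(), key=lambda kv: -kv[1]):
    print(f"{grid}: possible {kind} (fused score {score:.2f})")
```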

The Army’s ability to harness such technologies will require high levels of interoperability with the joint intelligence community. The Army must forge resilient liaison networks with the Australian Geospatial-Intelligence Organisation and the Australian Signals Directorate, underpinned by realistic combat-oriented training. It will also need to effectively integrate semi-autonomous programming, machine learning and deep learning into its future battle management systems.

Kill-Chain Responsiveness. Another key challenge in targeting is the time taken to progress from initial target detection through to post-strike assessment, a sequence otherwise known as the kill chain.[30] Augmented intelligence programs can accelerate certain aspects of the kill chain to reduce the total closure time. For instance, the Tactical Intelligence Targeting Access Node program can allow tactical nodes to aggregate vast quantities of raw data from secure and open-source media to identify targets for potential engagement across a battlespace.[31] Once targets are found, other programs such as Watchbox can process, exploit and disseminate (PED) target information to engagement decision-makers.[32] Following a strike, the process of battle damage assessment (BDA) can be expedited using change detection software that autonomously senses variations on the Earth’s surface, with convolutional neural networks then processing inputs from cyber, visual and electromagnetic sensors (such as satellites) to provide a summary of the effects delivered across wide areas.[33] Finally, machine learning programs can catalogue the effectiveness of different weapon combinations, as they are used during tactical engagements, to shape subsequent targeting priorities.[34] While these technologies are still in their infancy, Australia should be swift to seize upon their potential as they mature.
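
The change-detection step of BDA can be illustrated with a toy example: compare pre-strike and post-strike imagery of the same area and flag pixels that have shifted beyond a threshold for analyst review. The arrays and threshold below are invented for the sketch; operational tools fuse far richer sensor inputs, as noted above.

```python
import numpy as np

def detect_change(pre: np.ndarray, post: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    """Return a boolean mask of pixels whose intensity changed beyond the threshold."""
    diff = np.abs(post.astype(float) - pre.astype(float))
    return diff > threshold

# Toy 'imagery': a 5x5 patch of normalised pixel intensities, before and after a strike.
pre_strike = np.full((5, 5), 0.6)
post_strike = pre_strike.copy()
post_strike[1:4, 1:4] = 0.1          # a hypothetical crater darkens the centre of the patch

mask = detect_change(pre_strike, post_strike)
print(f"{mask.mean():.0%} of the patch changed -> flag for analyst review")
```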

Strike Coordination. Finally, AI can greatly improve how the Army coordinates its long-range strikes with the joint force. This can be achieved using autonomous AI and machine-learning programs that digitally integrate numerous joint force firing systems into a single shared network to increase the number of potential kill-chain pathways. One such program is Fires Synchronisation to Optimise Response in Multi-Domain Operations (FIRESTORM), in use by the US Army. FIRESTORM ingests data from numerous sensors and friendly units to rapidly produce strike recommendations for decision-makers.[35] During testing, FIRESTORM successfully ingested a sensor feed, conducted target recognition, updated the digital common operating picture, and produced a strike recommendation, all in 32 seconds.[36] Another US program, Joint All-Domain Command and Control (JADC2), seeks to harness AI and machine learning to ‘extract, consolidate and process only the relevant data and information’ from a vast array of joint force sensors and information sources.[37] Programs such as these will become increasingly critical to the Army as it seeks to synchronise its newly acquired long-range strike systems with the ADF’s joint missile fleet, which will soon include the Naval Strike Missile and Tomahawk.[38] This will ultimately enhance the Army’s ability to contribute to joint force kill chains, which in turn may reduce risk to the ADF’s more vulnerable naval and air strike platforms.
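
At its core, a strike recommendation engine of this kind performs weapon-target pairing: it filters the joint force options that can actually reach and service the target, then ranks them against criteria such as responsiveness. The sketch below illustrates that logic only; the shooter names, ranges, timings and the single ranking criterion are assumptions, not a representation of FIRESTORM or JADC2.

```python
from dataclasses import dataclass

@dataclass
class Shooter:
    name: str
    max_range_km: float
    time_to_fire_s: float   # assumed time from tasking to rounds or missile away
    rounds_available: int

def recommend(shooters: list[Shooter], target_range_km: float) -> list[Shooter]:
    """Rank shooters that can reach the target, fastest responders first."""
    feasible = [s for s in shooters
                if s.max_range_km >= target_range_km and s.rounds_available > 0]
    return sorted(feasible, key=lambda s: s.time_to_fire_s)

# Hypothetical joint-force options against a target 65 km away.
options = [
    Shooter("M777 battery", 30, 120, 40),
    Shooter("HIMARS troop", 300, 300, 6),
    Shooter("Loitering munition team", 100, 600, 8),
]
for rank, s in enumerate(recommend(options, target_range_km=65), start=1):
    print(f"{rank}. {s.name} (fires in ~{s.time_to_fire_s:.0f} s)")
```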

Challenges to Adoption

Notwithstanding its inherent battlefield value, the integration of 4IR technologies into the Australian Army’s fires capability will pose numerous challenges. These include integration of the technology within existing fires and targeting capabilities, implementation of appropriate control measures on autonomous weapons and kill chains, and the risk of overdependence. Given the likely difficulties in resolving these matters, 4IR technologies should not be regarded as a panacea for all of the military challenges that the Army may face in the 21st century. Rather, new technologies offer the potential to augment conventional fires and targeting capabilities.

Integration. The introduction of fully autonomous technologies into military applications is a recent phenomenon, and their risk–reward balance is yet to be properly determined. Against this uncertain backdrop, an arms race is being waged between prominent military powers for superiority in 4IR-enabled weapons and software.[39] Global powers such as China and Russia favour fully autonomous systems.[40] Unbounded by human control, such systems have the potential to deliver the most capable weapons. By contrast, the US and its Western allies generally favour semi-autonomous systems, where human control is retained over every engagement.[41] This preference is driven largely by concerns regarding the potential indiscriminate effects of automated weapon systems. While bound by such ethical concerns, many Western nations nevertheless worry that they may fall behind in the emerging global arms race. As a result, they remain reluctant to join international weapons agreements that require a ‘human in the loop’.[42] The climate of competition surrounding the acquisition and use of autonomous technologies will make it difficult for Australia to determine how far it should automate its own fires and targeting capabilities. Decisions will become even more challenging as newer and more potent 4IR-enabled applications emerge. Ultimately, Australia must balance its ethical obligations with the need to field a fighting advantage.

Control Measures. Divesting aspects of battlefield decision-making to an algorithm introduces the risk that an autonomous program might deviate from acceptable rules, such as the laws of armed conflict. This risk is particularly relevant to LAWs because they use machine learning to designate certain objects as threats.[43] It is especially challenging to generate input control measures that can reliably ensure a target is not misidentified, because the dynamic battlefield conditions in which LAWs must discern valid targets are theoretically infinite.[44] The International Committee of the Red Cross (ICRC) seeks to address this challenge by recommending that lawful autonomous engagements be confined to specific target types within defined collateral damage parameters.[45] While many Western nations agree with the ICRC’s recommendations, no international consensus exists. It is reasonable to expect that this lack of general agreement will cause delays and complications in weapons design and subsequent adoption by Western nations. Indeed, war ethics theorist Ross Bellaby highlights this problem as one of the fundamental challenges obstructing the adoption of autonomous weapons.[46]
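
The kind of pre-engagement gate implied by the ICRC recommendation can be sketched as a simple rule: an engagement proceeds only if the classified target is on an approved list, classification confidence is high, the estimated collateral damage sits within the authorised parameter, and a human has approved the strike. Every target type, threshold and parameter name below is an assumption for illustration, not a statement of any nation’s rules of engagement.

```python
# Illustrative pre-engagement gate reflecting the ICRC-style limits discussed above.
# Approved target types, confidence floor and collateral limit are invented values.
APPROVED_TARGET_TYPES = {"self-propelled artillery", "air defence radar"}
MIN_CONFIDENCE = 0.95
MAX_COLLATERAL_ESTIMATE = 0          # no estimated civilian harm permitted

def engagement_authorised(target_type: str,
                          confidence: float,
                          collateral_estimate: int,
                          human_approval: bool) -> bool:
    """Return True only if every control measure is satisfied, including a human decision."""
    return (target_type in APPROVED_TARGET_TYPES
            and confidence >= MIN_CONFIDENCE
            and collateral_estimate <= MAX_COLLATERAL_ESTIMATE
            and human_approval)

print(engagement_authorised("air defence radar", 0.97, 0, human_approval=True))   # True
print(engagement_authorised("air defence radar", 0.97, 0, human_approval=False))  # False
```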

In addition to controls on battlefield decision-making, control measures must also be applied to automated targeting programs. A key risk relates to the reliability of the data on which such programs are based. Data is the ammunition of automated kill chains, and the quality of the data a program ingests will determine how well it performs. In testing, machine-learning programs have been found to underperform where intelligence data is scant and the enemy is actively feeding misinformation designed to fool the algorithm.[47] This is why many Western nations maintain that meaningful human control over automated weapons is an essential control measure.[48] In any effort to introduce automated targeting, Australia will need the capacity to implement adequate control measures over technology that is rapidly evolving and has not yet been fully tested and evaluated.

The Risk of Overdependence. The use of 4IR technologies will only deliver a battlefield advantage if it is governed by human adaptability and judgement. To guard against overdependence on 4IR technologies, the Army must focus on building effective human–machine teams, not machine-centric teams.[49] Indeed, the Army must consider 4IR technologies as augmentations to a predominantly human-centric kill chain. To do otherwise is to invite disaster for several reasons. For example, potential adversaries such as China employ systems destruction warfare (SDW) techniques designed to target cyber and information systems, making automated software a likely target.[50] The Army must therefore retain the capacity to fall back on rudimentary methods of targeting. Guarding against overdependence is also critical from a moral standpoint. Only humans can steward the moral dimension that enshrines the ethical conduct of war. An algorithm cannot fully account for the ‘atmosphere of war’, defined by Clausewitz as being characterised by ‘danger, physical exertion, intelligence, and friction’.[51] Tim McFarland, a leading academic on the ethics of autonomous weapons, rightly posits that ‘trust’ is an unfit substitute for ‘control’ in the application of autonomous weapons.[52]

Conclusion

The Army’s indirect fires and targeting capabilities stand to benefit greatly from 4IR technologies. As the Army readies for LSCO, autonomous weapons such as loitering drones offer an inexpensive means of offsetting its artillery deficiency, allowing its howitzers to concentrate on close combat and its HIMARS to focus on strategic deterrence. Automated artillery can also reduce crew exposure to counter-battery fire. In addition to hardware, 4IR technologies can greatly aid how the Army contributes to joint targeting. Augmented intelligence programs are transforming ground force situational awareness by significantly improving sensor fidelity and persistence. Automated software can also expedite kill-chain responsiveness by accelerating processes such as PED and BDA. Finally, machine-learning programs can help the Army to better integrate its long-range precision fires into wider joint force kill chains. Integrating these advancements will, however, be challenged by the need for Australia to define an autonomous weapons strategy that aligns with its ethical standards. The Army will also be confronted by the need to enforce control measures that mitigate collateral damage risks. Finally, the Army will have to guard against overdependence on programs that are both vulnerable to SDW and not yet advanced enough to navigate the complexities of war. Regardless of the challenges ahead, the Australian Army’s ability to deliver ranged lethality in a future high-intensity conflict will be heavily influenced by how well it can achieve a measured but timely adoption of 4IR-enabled technologies.

About the Author

Major Jason Kirkham is a Battery Commander serving in the 4th Regiment, Royal Australian Artillery. His major appointments include Instructor-in-Gunnery at the School of Artillery, Brigade Fire Support Officer, and targeting officer in the HQ 1st Division Joint Fires and Effects Coordination Centre. He holds a master’s degree in War Studies from UNSW, and enjoys practising, reading about and writing about joint fires and the profession of arms.

Endnotes


[1] TX Hammes, ‘Technological Change and the Fourth Industrial Revolution’, in George P Shultz et al. (eds), Beyond Disruption: Technology’s Challenge to Governance (Stanford: Hoover Institution, 2018), p. 40; Jean-Marc Rickli, ‘The Strategic Implications of Artificial Intelligence for International Security’, in Al Naqvi and Mark Munoz (eds), Handbook of AI and Robotic Process Automation (London: Cambridge University Press, 2022), p. 46.

[2] Amir Husain, ‘AI Is Shaping the Future of War’, PRISM 9, no. 3 (2021): 53.

[3] Department of Defence, Defence Strategic Review (Canberra: Commonwealth of Australia, 2023), p. 17.

[4] David Horner, The Gunners: A History of Australian Artillery (St Leonards: Allen & Unwin, 1995), p. 185.

[5] Ibid., p. 406.

[6] Ibid., p. 406.

[7] Gerhard Weinberg, A World at Arms: A Global History of World War II (Chapel Hill: University of North Carolina, 2005), p. 657.

[8] Aditya Bhan, ‘The Ukraine War’s Impact on Western Arms Production’, Observer Research Foundation, 27 December 2022, at: https://www.orfonline.org/expert-speak/the-ukraine-wars-impact-on-western-arms-production (accessed 28 March 2024).

[9] Mia Jankowicz, ‘Ukraine Is Fielding New $3,500 “Cardboard” Drones against Russia’, Business Insider, at: https://www.businessinsider.com/ukraine-is-using-a-cheap-flat-pack-cardboard-drone-australia-2023-8 (accessed 1 January 2024).

[10] ‘Unmanned Systems-Air’, DefendTex Unmanned Aerial Vehicles, at: https://www.defendtex.com/uav/ (accessed 7 August 2024).

[11] Julian Kerr, ‘Making the M777 More Lethal’, Australian Defence Magazine, at: https://www.australiandefence.com.au/defence/land/making-the-m777-more-lethal (accessed 10 January 2024).

[12] Frank Verrastro and Andrew Stanley, ‘Attack on Saudi Oil Infrastructure: We May Have Dodged a Bullet, at Least for Now’, Center for Strategic and International Studies, September 2019, at: https://www.csis.org/analysis/attack-saudi-oil-infrastructure-we-may-have-dodged-bullet-least-now (accessed 28 December 2023).

[13] Sean Harper, ‘Organic Precision Fires for Marine Infantry’, U.S. Naval Institute, June 2022, at: https://www.usni.org/magazines/proceedings/2022/jhuarperne/organic-precision-fires-marine-infantry (accessed 30 March 2024).

[14] Department of Defence, Defence Strategic Review, p. 19.

[15] Ling Chen and Miles Evers, ‘Wars without Gun Smoke’, International Security 48 (2023): 201.

[16] Lisa Parks and Caren Kaplan, Life in the Age of Drone Warfare (North Carolina: Duke University Press, 2017), p. 270; Brennan Deveraux, ‘Loitering Munitions Are the Future of Division Shaping Operations’, Real Clear Defence, March 2023, at https://www.realcleardefense.com/2023/03/06/loitering_munitions_are_the_future_of_shaping_operations_885485.html (accessed 20 September 2023).

[17] Zhirayr Amirkhanyan, ‘A Failure to Innovate: The Second Nagorno-Karabakh War’, Parameters 52 (2022): 122; Justin Bronk with Nick Reynolds and Jack Watling, The Russian Air War and Ukrainian Requirements for Air Defence (London: RUSI, 2022), pp. 4–5.

[18] ‘The RCH 155 Is a Combination of Automated Artillery Firepower and Protected Wheeled Mobility’, KMW+NEXTER Defense Systems (KNDS), at: https://www.knds.de/en/systems-products/wheeled-vehicles/artillery/rch-155 (accessed 26 March 2024); Robert Dougherty, ‘Automated Artillery Achieves World First Fire-while-Moving Capability’, Defence Connect, 26 September 2023, at: https://www.defenceconnect.com.au/land/12856-automated-artillery-achieves-world-first-fire-while-moving-capability (accessed 4 January 2024).

[19] Sam Cranny-Evans, ‘Russia’s Artillery War in Ukraine: Challenges and Innovations’, Royal United Services Institute, 9 August 2023, at: https://www.rusi.org/explore-our-research/publications/commentary/russias-artillery-war-ukraine-challenges-and-innovations (accessed 27 March 2024).

[20] Mykhaylo Zabrodskyi, Jack Watling, Oleksandr V Danylyuk and Nick Reynolds, Preliminary Lessons in Conventional Warfighting from Russia’s Invasion of Ukraine: Feb–Jul 2022 (London: RUSI, 2022), p. 39.

[21] Kadir Alpaslan Demir, ‘Killer Robots and Armed Forces Transformation for the Robotic Era’, Defence and Strategy 21 (2021): 12.

[22] Jack Watling, ‘Automation Does Not Lead to Leaner Land Forces’, War on the Rocks, 7 February 2024, at: https://warontherocks.com/2024/02/automation-does-not-lead-to-leaner-land-forces (accessed 20 March 2024).

[23] Department of the Army, Army Targeting, Field Manual 3-60 (Washington: Army Publishing Directorate, 2023), p. 1.

[24] Albert Palazzo, Resetting the Australian Army: Negotiating the 2023 Defence Strategic Review, Australian Army Research Centre Occasional Paper No. 16 (Commonwealth of Australia, 2023), p. 10.

[25] Alan Smith, Do unto Others: Counter Bombardment in Australia’s Military Campaigns (Newport: Big Sky Publishing, 2011), chapter 21, paragraph 77.

[26] Jeanne Chircop, ‘AI Revolutionizes Mapping Updates and Accuracy’, National Geospatial-Intelligence Agency, at: https://www.nga.mil/news/AI_Revolutionizes_Mapping_Updates_and_Accuracy.html (accessed 21 December 2023).

[27] Olga Tokariuk, ‘Ukraine’s Secret Weapon—Artificial Intelligence’, CEPA, 20 November 2023, at: https://cepa.org/article/ukraines-secret-weapon-artificial-intelligence (accessed 2 January 2024).

[28] Courtney Albon, ‘Spy Agency to Prototype Ground Moving-Target Tracking from Space’, C4ISR Net, 19 April 2023, at: https://www.c4isrnet.com/battlefield-tech/space/2023/04/19/spy-agency-to-prototype-ground-moving-target-tracking-from-space (accessed 5 January 2024). 

[29] Roy Lindelauf, Herwin Meerveld and Marie Postma, ‘Leveraging Decision Support in the Russo-Ukrainian War: The Role of Artificial Intelligence’, Atlantisch Perspectief 47, no. 1 (2023): 38.

[30] Jennifer Rooke, ‘Shortening the Kill Chain with Artificial Intelligence’, AutoNorms, 28 November 2021, at: https://www.autonorms.eu/shortening-the-kill-chain-with-artificial-intelligence (accessed 7 January 2024).

[31] Doug Graham, ‘Army Developing Faster, Improved Data “Kill Chain” for Lethal and Non-lethal Fires’, U.S. Army website, 22 April 2023, at: https://www.army.mil/article/263145/army_developing_faster_improved_data_kill_chain_for_lethal_and_non_lethal_fires (accessed 1 October 2023).

[32] R Ridger, ‘Winning the Counterland Battle by Enabling Sensor-to-Shooter Automation’, Air Land Sea Application Center, 1 November 2021, at: https://www.alsa.mil/News/Article/2822476/winning-the-counterland-battle-by-enabling-sensor-to-shooter-automation (accessed 1 December 2021).

[33] Ioannis Kotaridis and Georgios Benekos, ‘Integrating Earth Observation IMINT with OSINT Data to Create Added-Value Multisource Intelligence Information: A Case Study of the Ukraine–Russia War’, Security and Defence Quarterly 43, no. 3 (2023): 13; Michael O’Gara, ‘AI and Integrated Fires’, in SJ Tangredi and G Galdorisi (eds), AI at War: How Big Data, Artificial Intelligence, and Machine Learning Are Changing Naval Warfare (Naval Institute Press, 2021), p. 225; Tyler Knight, ‘Machine Learning to Detect Battle Damage Using Satellites’, Defense Systems Information Analysis Center, October 2022 at: https://dsiac.org/technical-inquiries/notable/machine-learning-tools-to-detect-battle-damage-using-satellite-images (accessed 1 October 2023).

[34] Rajesh Uppal, ‘US Army Doctrine Thrust on Multi-domain Operations Employing AI to Counter A2AD’, International Defense Security & Technology, 15 May 2023, at: https://idstch.com/military/army/us-army-doctrine-thrust-on-multi-domain-operations-employing-ai-to-counter-a2-ad (accessed 21 December 2023).

[35] Graham, ‘Army Developing Faster, Improved Data “Kill Chain” for Lethal and Non-lethal Fires’. 

[36] Uppal, ‘US Army Doctrine Thrust on Multi-domain Operations Employing AI to Counter A2AD’.

[37] Chip Downing, ‘JADC2: Accelerating the OODA Loop with AI and Autonomy’, RTI website, 29 August 2023, at: https://www.rti.com/blog/jadc2-the-ooda-loop#:~:text=JADC2%20capabilities%20will%20leverage%20Artificial,information%20from%20the%20sensing%20infrastructure (accessed 20 March 2024).

[38] Nigel Pittaway, ‘Strike Further and Harder—Australia’s Precision Strike Capability’, Australian Defence Magazine 31, no. 4 (2023): 26–30.

[39] Edward Moore Geist, ‘It’s Already Too Late to Stop the AI Arms Race—We Must Manage It Instead’, Bulletin of Atomic Scientists 72, no. 5 (2016): 319.

[40] Forrest Morgan et al., Military Applications of Artificial Intelligence (Santa Monica CA: RAND Corporation, 2020), p. 74.

[41] David Vergun, ‘U.S. Endorses Responsible AI Measures for Global Militaries’, U.S. Department of Defense website, at: https://www.defense.gov/News/News-Stories/Article/Article/3597093/us-endorses-responsible-ai-measures-for-global-militaries (accessed 10 January 2024).

[42] Tania Rabesandratana, ‘Europe Moves to Compete in Global AI Arms Race’, Science 360, no. 6388 (2018): 474.

[43] Paul Scharre, Autonomous Weapons and Operational Risk: Ethical Autonomy Project (Washington: Center for a New American Security, 2016), p. 2.

[44] Martin Hagström, ‘Military Applications of Machine Learning and Autonomous Systems’, in Vincent Boulanin (ed.), The Impact of Artificial Intelligence on Strategic Stability and Nuclear Risk: Volume I Euro-Atlantic Perspectives (Stockholm: Stockholm International Peace Research Institute, 2019), p. 37.

[45] Vincent Boulanin, Neil Davison, Netta Goussac and Moa Peldán Carlsson, Limits on Autonomy in Weapon Systems (Stockholm: Stockholm International Peace Research Institute, 2020), p. 37.

[46] Ross Bellaby, ‘Can AI Weapons Make Ethical Decisions?’, Criminal Justice Ethics 40 (2020): 92.

[47] Peter Layton, Fighting Artificial Intelligence Battles, Joint Studies Paper Series No. 4 (Canberra: Centre for Defence Research, 2021), p. 14.

[48] Angel Gomez de Agreda, ‘Ethics of Autonomous Weapons Systems and Its Applicability to Any AI Systems’, Telecommunications Policy 44 (2020): 9.

[49] Alex Neads, David Galbreath and Theo Farrell, From Tools to Teammates: Human-Machine Teaming and the Future of Command and Control in the Australian Army, Australian Army Occasional Paper No. 7 (Canberra: Australian Army Research Centre, 2021), p. 14.

[50] Jeffrey Engstrom, Chinese Confrontation and Systems Destruction Warfare (Santa Monica CA: RAND Corporation, 2018), p. 13.

[51] Carl von Clausewitz, On War, ed. and trans. Michael Howard and Peter Paret (Princeton: Princeton University Press, 1984), p. 122.

[52] Tim McFarland, ‘Reconciling Trust and Control in the Military Use of Artificial Intelligence’, International Journal of Law and Information Technology 30 (2023): 472–483, 473.