
Manoeuvreist Doctrine in the Age of Autonomy

Single quad rotor drone in centre of image with sky background. Image by andri333 from Pixabay.


The Ukraine war has thrust small drones into the spotlight as an integral component of contemporary combined arms warfare. Cheap, plentiful, and attritable commercial off-the-shelf (COTS) drones (sometimes referred to as ‘wedding drones’ with equal parts respect and derision)[i] have proven their utility for military roles such as dropping explosive ordnance onto targets and performing intelligence, surveillance, and reconnaissance (ISR) activities.[ii] Both of these capability effects pose a threat to forces in manoeuvre and have a profound effect on how commanders must implement manoeuvre doctrine. To meet this emerging threat, the armed forces of free societies must recognise the disruptive nature of drone technology and develop and deploy the necessary countermeasures.[iii] With greater awareness of the emerging capabilities of COTS drones, armed forces may also discover techniques to use them in ways that achieve devastating, asymmetric effects on adversaries accustomed to conventional doctrine.[iv]

The computer processing power available on COTS drone hardware is rapidly increasing. New, massively parallel computing hardware enables drones to perform AI inference, interpreting visual data rather than just sensing it.[v] With this capacity, platforms will increasingly take over command and control (C2) activities[vi] such as autonomously updating warfighters’ common operational picture and directing loitering munitions to targets.[vii] The combined effect of these autonomous technologies may allow an adversary to significantly disrupt the conduct of combined arms manoeuvre, which traditionally requires massed infantry and armour formations to strike at a single point to achieve breakthrough. Such manoeuvre may no longer be possible, at least while an adversary’s COTS drones can guide effectors such as loitering munitions or indirect fire artillery onto such formations.[viii] Commanders must consider the doctrinal implications of both current and evolving COTS drone technologies and procure countermeasures which may preserve their ability to conduct manoeuvre.[ix]

Technical Description of COTS Drones

Despite their civilian origin, COTS airframes and their included electronics bring unprecedented capabilities to the warfighter acting at the tactical level.[x] Drones such as those made by DJI are cheap, easy to operate with little training, and come equipped with a comprehensive suite of sensors and hardware not dissimilar to those in consumer smartphones. Use of such drones has practically entered the doctrine of both Ukrainian and Russian combatants.[xi]

The basic sensor suite of a COTS drone includes GPS, a barometric pressure sensor for improving altitude accuracy, a magnetometer for determining compass heading, and a camera gimbal.[xii] Advanced models (such as those made by DJI, Skydio, Parrot, and Autel Robotics) now frequently possess a highly capable optical and digital zoom camera.[xiii] In addition, such advanced models often carry night vision or thermal cameras.[xiv]

Case Study from the Russo-Ukrainian War

While advanced sensors enhance the overall effect of a drone on the battlefield, even basic models can pose a serious threat to combined arms forces massed together for manoeuvre. In May 2022, a Russian armoured column attempted a crossing of the Siverskyi Donets river.[xv] The bid proved disastrous; the Institute for the Study of War estimates that 485 of the 550 soldiers who took part in the crossing were killed or wounded, and that the attempt cost more than 80 Russian vehicles.[xvi] A Ukrainian combat engineer claims on Twitter (now X) to have directed artillery fire onto this column using a budget drone model with minimal features, marketed as an alternative to the DJI Mavic series. Specifically, he maintains that he used the drone to confirm the presence or absence of targets at pre-calculated positions of the strategic river crossing. The nature of the operating theatre significantly limited the search space within which the drone needed to designate targets. If the claimant is to be believed, indirect fire artillery teams were prepared to devastate any such pre-determined river crossing location.[xvii]

It is easy to underestimate the devastating effects of precision indirect fire. The statistics from both historical and contemporary high-intensity conflict, however, are harrowing. A 1962 report from the Medical Department of the US Army shows that artillery and mortar fire accounted for 65 per cent of total battlefield casualties in the European and Mediterranean theatres during WWII.[xviii] In the current Russo-Ukrainian war, a paper in the Journal of the American College of Surgeons demonstrates that more than 70 per cent of injuries in Ukraine’s armed forces have been sustained from artillery and rocket barrages.[xix] Given the empirically outsized impact of artillery and other indirect fire on battlefield casualties, military commanders should consider carefully the novel use of COTS drones to direct such effects with greater precision.

Regarding the attempted Russian crossing of the Siverskyi Donets river: what if this devastating effect of a common COTS drone were possible more generally? What if it could be achieved on any kind of terrain, in any strategic situation? What would the implications be for manoeuvre doctrine?

As previously described, even basic COTS drones possess multiple sensors which let them determine their position and orientation with high accuracy. A key limitation, however, is that they traditionally have no way of determining the coordinates of whatever they are looking at on the ground.[xx] This limitation is entirely solvable with the right software.[xxi]

The Ukrainian army has created and adopted software systems which solve this challenge and put basic COTS drones at the forefront of the kill chain. Locally-developed software called Кропива (en: Kropyva) that runs on Android tablets allows soldiers to mark targets on a map and to distribute their locations to artillery units in real time,[xxii] often over a Starlink Internet connection.[xxiii] This system, along with another called GIS Arta, has been described as the ‘Uber for Artillery’, bringing the processing and efficiency advantages of algorithms to the practice of warfare.[xxiv] The particular software which integrates COTS drones into this system is an add-on for Кропива called FireFly. A demo on YouTube from Ukrainian developer Dima Kovalenko shows the centre of the viewfinder of a DJI drone in flight being used to instantly designate and mark a target in Кропива.[xxv] The capacity to instantly designate a target from a drone has allowed the Siverskyi Donets river crossing scenario to be replicated countless times all over Ukraine. Russian formations massed for manoeuvre are just as vulnerable to artillery in the fields, the hills and valleys – indeed anywhere in the landscape – as they are at a narrow chokepoint by the river.[xxvi]

Description of the Novel Terrain-raycast Technique

The technique that allows a COTS drone to be used for target designation in the manner described above is called terrain-raycasting. This technique allows even basic COTS drones to designate targets using just their passive sensors and public terrain data.[xxvii] In 2000, NASA and the US National Imagery and Mapping Agency (now the National Geospatial-Intelligence Agency) conducted the Shuttle Radar Topography Mission (SRTM), which used a space-borne C-band synthetic aperture radar instrument to determine the height of almost all terrain on Earth at a resolution of about 30 metres.[xxviii] This terrain data can be used as a kind of ‘digital dart board’: a ray is simulated digitally from the position and orientation of a drone’s camera and is then cast towards the terrain data until the ray intersects it. The point of intersection is the designated target’s location.[xxix]
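The ‘digital dart board’ idea can be sketched in a few lines of code. The following Python sketch is purely illustrative (it is not drawn from any fielded system): it marches a ray forward from an assumed camera position and direction over a synthetic 30-metre height grid until the ray dips below the terrain surface.

```python
import math

CELL_SIZE = 30.0  # metres per terrain cell, matching SRTM's ~30 m resolution

def terrain_raycast(dem, origin, direction, step=5.0, max_range=5000.0):
    """March a ray from the drone camera until it falls below the terrain.

    dem       -- 2D list of terrain heights in metres, indexed [row][col]
    origin    -- (x, y, z) camera position in local metres
    direction -- unit vector (dx, dy, dz) the camera is looking along
    Returns the first sample point at or below the terrain, or None.
    """
    x, y, z = origin
    dx, dy, dz = direction
    travelled = 0.0
    while travelled < max_range:
        x += dx * step
        y += dy * step
        z += dz * step
        travelled += step
        col = int(x // CELL_SIZE)
        row = int(y // CELL_SIZE)
        if not (0 <= row < len(dem) and 0 <= col < len(dem[0])):
            return None  # ray left the terrain tile without intersecting
        if z <= dem[row][col]:
            return (x, y, dem[row][col])  # intersection: target location
    return None

# Synthetic flat plateau at 100 m; drone at 400 m looking 45 degrees down.
dem = [[100.0] * 64 for _ in range(64)]
hit = terrain_raycast(dem, origin=(0.0, 0.0, 400.0),
                      direction=(math.sqrt(0.5), 0.0, -math.sqrt(0.5)))
```

A real implementation would work in a geodetic coordinate system, interpolate between DEM cells, and shrink the step size near the intersection for sub-cell accuracy; the fixed-step march above is the simplest form of the idea.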

Data processing improvements have allowed a ray to be simulated accurately for any pixel within a drone camera’s image (not just the centre) using camera intrinsics modelling.[xxx] Currently, this technology allows a software operator to designate a target anywhere within a drone image with just a tap of the finger, and to share the calculated location to a unified battle management network just as expediently.[xxxi]
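The pixel-to-ray step rests on the standard pinhole camera model. The sketch below is illustrative only: the focal length and image dimensions are hypothetical, and a real system would also correct for lens distortion and rotate the resulting ray from the camera frame into a world frame before casting it at the terrain.

```python
import math

def pixel_to_camera_ray(u, v, fx, fy, cx, cy):
    """Back-project pixel (u, v) to a unit-length ray in the camera frame.

    fx, fy -- focal lengths in pixels; cx, cy -- principal point.
    Standard pinhole model: x = (u - cx)/fx, y = (v - cy)/fy, z = 1,
    then normalise to unit length.
    """
    x = (u - cx) / fx
    y = (v - cy) / fy
    z = 1.0
    norm = math.sqrt(x * x + y * y + z * z)
    return (x / norm, y / norm, z / norm)

# Hypothetical 4000x3000 sensor with a 2800-pixel focal length.
ray_centre = pixel_to_camera_ray(2000, 1500, 2800, 2800, 2000, 1500)
ray_corner = pixel_to_camera_ray(0, 0, 2800, 2800, 2000, 1500)
```

The ray through the principal point is simply the camera's boresight, while corner pixels yield rays angled away from it; this is why a tap anywhere in the image can be converted into a terrain-raycast.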

What Comes Next

Soon, COTS drones will contain the computer processing power necessary to identify and locate objects of interest (such as tanks, artillery, and anti-air defences) without the assistance of a human operator.[xxxii] This will substantially shorten the time required for each cycle of the ‘observe, orient, decide, and act’ (OODA) loop which is critical to outcomes on the battlefield.[xxxiii] Object detection (a subset of the computer vision discipline) has improved considerably with the advent of machine learning and massively parallel computing hardware. Frameworks such as You Only Look Once (YOLO) allow accurate object detection neural nets to be trained on inexpensive computer hardware such as consumer NVIDIA GPUs (massively parallel PC hardware originally intended for gaming).[xxxiv],[xxxv] Once trained, these neural nets can be deployed for object detection inference on much less capable computer hardware, including hardware increasingly embedded in drones.[xxxvi],[xxxvii]
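One standard post-processing step in detectors of this family is non-maximum suppression, which collapses the many overlapping candidate boxes a network emits into a single detection per object. The minimal Python sketch below illustrates the idea only; it is not taken from any particular YOLO implementation.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def non_max_suppression(detections, iou_threshold=0.5):
    """Keep the highest-confidence box among heavily overlapping candidates.

    detections -- list of (confidence, box) pairs from the detector head.
    """
    remaining = sorted(detections, key=lambda d: d[0], reverse=True)
    kept = []
    while remaining:
        best = remaining.pop(0)
        kept.append(best)
        remaining = [d for d in remaining
                     if iou(d[1], best[1]) < iou_threshold]
    return kept

# Two overlapping candidates for one vehicle, plus one separate detection.
raw = [(0.91, (100, 100, 200, 200)),
       (0.85, (110, 105, 205, 195)),
       (0.70, (400, 400, 480, 470))]
final = non_max_suppression(raw)  # the 0.85 duplicate is suppressed
```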

When discussing a computer processor’s performance for AI object detection, the most useful metric is the number of floating point operations (the representation computers use for decimal numbers) it can perform per second (FLOPS). This metric is closely tied to AI performance because floating point math is the basic operation in evaluating the weights and biases of an artificial neural network against a given input.[xxxviii] Historically, computer processors have contained only a handful of individual processing cores, limiting overall throughput.[xxxix] A new class of massively parallel processors is proliferating; these contain many hundreds or thousands of simple cores that can perform math operations in parallel, making them significantly faster at AI inference for object detection within images.[xl],[xli],[xlii],[xliii] For example, the 10th-generation Intel Core i3 processor in the early 2020 Apple MacBook Air may perform only 0.343 trillion floating point operations per second (TFLOPS) with 4 cores.[xliv],[xlv] By contrast, the consumer Skydio 2 drone contains an NVIDIA Jetson TX2 compute module capable of 1.33 TFLOPS with 256 cores,[xlvi],[xlvii] all while requiring significantly less weight and power. A state-of-the-art YOLO object detection model variant requires roughly 0.843 trillion floating point operations per image, allowing this drone model to locate objects of interest within an image roughly once every second.[xlviii] The upcoming Skydio X10 drone contains an NVIDIA Jetson Orin[xlix] capable of at least 20 TFLOPS with 512 cores, allowing this UAS to detect objects in real-time, full motion video.[l]
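The relationship between processor throughput and inference rate is simple division. The sketch below reproduces the arithmetic implied by the figures above; note it gives an optimistic upper bound, since real pipelines lose throughput to memory bandwidth and pre- and post-processing.

```python
def max_inference_rate(device_tflops, model_tflops_per_image):
    """Upper-bound images per second, assuming perfect utilisation.

    device_tflops          -- sustained throughput of the processor (TFLOPS)
    model_tflops_per_image -- floating point operations one forward pass needs
    """
    return device_tflops / model_tflops_per_image

# Figures from the text: a 0.843-TFLOP model on two Jetson modules.
jetson_tx2_fps = max_inference_rate(1.33, 0.843)   # roughly 1.6 images/second
jetson_orin_fps = max_inference_rate(20.0, 0.843)  # roughly 24 images/second
```

This is why the step from the TX2 to the Orin matters doctrinally: the same model moves from roughly one detection pass per second to full-motion-video rates.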

In 2023, Theta Informatics conducted a test of a custom YOLO object detection AI model trained to recognise the Russian T-72 Main Battle Tank. The test was performed on a dataset of 932 images never seen by the model during training. The results were striking: the model achieved a precision of 97.2 per cent, indicating that false positives accounted for only 2.8 per cent of detected objects, and a recall of 97 per cent, indicating that only 3 per cent of objects were missed by the detector.[li]
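Precision and recall follow directly from counts of true positives, false positives, and false negatives. The counts in the sketch below are illustrative only, chosen to reproduce the quoted percentages; they are not figures from the cited test.

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Standard object detection metrics.

    precision -- fraction of reported detections that were real objects
    recall    -- fraction of real objects the detector actually found
    """
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Illustrative counts: 970 correct detections, 28 false alarms, 30 misses.
p, r = precision_recall(970, 28, 30)  # p ~ 0.972, r = 0.97
```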

The combination of automated object detection from COTS drones’ visual feeds with terrain-raycast geolocation will be a potent system for commanders of manoeuvre forces to contend with in the near future.[lii] Such systems will allow cooperative ‘swarming’ UASs to plan their own search areas, identify areas and objects of interest, and relay them to human operators for final review before a loitering munition or indirect fire shooter closes the kill chain.[liii] Such capability would make any attempt at manoeuvre largely impossible without devastating consequences.[liv]


For free societies to prepare against COTS drone technology, they should seek to understand it, integrate it into training exercises, and develop reliable and effective electronic and kinetic countermeasures that can deny an adversary’s ability to use such capabilities for reconnaissance and the deployment of munitions. The Ukraine war has proven how seamlessly COTS drones integrate into, and change the dynamics of, contemporary combined arms warfare.[lv] Such technology is not so different in scope or impact from the barbed wire, machine guns, and heavy artillery which disrupted pre-war conventional doctrine in the First World War.[lvi],[lvii] Commanders should weigh this history when preparing for the future.


[ii]      Andrew E. Kramer and David Guttenfelder, "From the Workshop to the War: Creative Use of Drones Lifts Ukraine," The New York Times, August 11, 2022.

[iii]     Stephen Fidler, "The Conflict in Ukraine Offers Old—and New—Lessons in 21st-Century Warfare," The Wall Street Journal, February 23, 2023.

[iv]     Jason Beaubien, "In the Russia-Ukraine War, Drones Are One of the Most Powerful Weapons," NPR, July 30, 2022.

[v]      Tim Vehling, "How Edge AI Advancements Will Drive the Next Generation of Drone Innovation," IoT For All, February 1, 2022.

[vii]    Alex Horton and Serhii Korolchuk, "In Ukraine, Explosive DIY Drones Give an Intimate View of Killing," The Washington Post, October 4, 2023.

[ix]     Shaan Shaikh, Tom Karako, and Michelle McLoughlin, "Countering Small Uncrewed Aerial Systems: Air Defense by and for the Joint Force,” CSIS Missile Defense Project, November 2023.

[x]      W.G. Dunlop, "All-out drone war in Ukraine points to future," Tech Xplore, January 13, 2023.

[xi]     Kerry Chávez, "Learning on the Fly: Drones in the Russian-Ukrainian War," Arms Control Association, January/February 2023.

[xii]    Flynt, Joseph. “What Sensors Do Drones Use?” 3D Insider. April 18, 2019.

[xiii]   Spires, Joshua. “The Best Consumer Drones of 2020 - DJI, Autel, Skydio.” DroneDJ, May 22, 2020.

[xiv]   Murison, Malek. “Comparing DJI's Thermal and Night Vision Drones” DJI Enterprise Insights, June 27, 2022.

[xvi]   Stepanenko, Kateryna, and Frederick W. Kagan. "Russian Offensive Campaign Assessment, May 14" Institute for the Study of War. May 14, 2022.

[xviii] Heaton, Leonard D., James Boyd Coates Jr., and James C. Beyer. 1962. "WOUND BALLISTICS" Office of the Surgeon General, Department of the Army, Washington, D.C. Accessed March 17, 2021.

[xix]   Epstein, Aaron, et al. "Putting Medical Boots on the Ground: Lessons from the War in Ukraine and Applications for Future Conflict with Near-Peer Adversaries" Journal of the American College of Surgeons 237, no. 2 (2023): 364-373.

[xx]    Flynt, Joseph. “What Sensors Do Drones Use?” 3D Insider. April 18, 2019.

[xxi]   Krupczak, Matthew. “Theta-Limited README” GitHub, Theta Limited, August 8, 2023.

[xxiii] Jones, Grace, Janet Egan and Eric Rosenbach. “Advancing in Adversity: Ukraine’s Battlefield Technologies and Lessons for the U.S.” Policy Brief, Belfer Center for Science and International Affairs, Harvard Kennedy School, July 31, 2023. 

[xxiv]  Bruno, Mark. "‘Uber For Artillery’ - What is Ukraine's GIS Arta System?" The Moloch, August 24, 2022.

[xxv]   Kovalenko, Dima. "Кропива + FireFly через WiFi" YouTube video, 3:53. July 15, 2022.

[xxvii]  Krupczak, Matthew. “Theta-Limited README” GitHub, Theta Limited, August 8, 2023.

[xxviii] EOS Project Science Office at NASA's Goddard Space Flight Center. "Shuttle Radar Topography Mission" Last modified October 23, 2019.

[xxix]  Krupczak, Matthew. “Theta-Limited README” GitHub, Theta Limited, August 8, 2023.

[xxx]   Zhang, Z. “A Flexible New Technique for Camera Calibration.” IEEE Transactions on Pattern Analysis and Machine Intelligence. Vol. 22, No. 11, 2000, pp. 1330–1334.

[xxxi]  Razumov, A.N., Kryukov, G.A., Kuznetsov, A.N. "I Live, I Fight, I Win: Rules of Life In War" Translated by Lethal Minds Journal, January 3, 2023.

[xxxii] Wu, Xin, Wei Li, Danfeng Hong, Ran Tao, and Qian Du. “Deep Learning for UAV-based Object Detection and Tracking: A Survey.” IEEE Geoscience and Remote Sensing Magazine 10, no. 1 (2022): 91-124.

[xxxiii] Boyd, John R. "Destruction and Creation." U.S. Army Command and General Staff College, September 3, 1976. PDF file.

[xxxiv] Chen, Zhu, José Ángel González González, Rogelio Bustamante-Bello, and Carlos Francisco Moreno-García. 2021. "Object Detection, Distributed Cloud Computing and Parallelization." MDPI. Accessed November 24, 2023.

[xxxv]  Wang, Chien-Yao, Alexey Bochkovskiy, and Hong-Yuan Mark Liao. "YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors" arXiv preprint arXiv:2207.02696 (2022).

[xxxvi]        Wu, Xin, Wei Li, Danfeng Hong, Ran Tao, and Qian Du. “Deep Learning for UAV-based Object Detection and Tracking: A Survey” IEEE Geoscience and Remote Sensing Magazine 10, no. 1 (2022): 91-124.

[xxxvii] Skydio. "X10 Technical Specifications" Accessed November 24, 2023.

[xxxviii]      Nothing but AI. "Benchmarking Deep Learning Operations Per Second (FLOPS): Unleashing the True Power." Nothing but AI. Accessed December 10, 2023.

[xxxix] Engineering CSP. "Unfolding the Story of Computer Processor Evolution" Engineering CSP. Accessed December 10, 2023.

[xl]     Nothing but AI. "Benchmarking Deep Learning Operations Per Second (FLOPS): Unleashing the True Power." Nothing but AI. Accessed December 10, 2023.

[xli]    Engineering CSP. "Unfolding the Story of Computer Processor Evolution" Engineering CSP. Accessed December 10, 2023.

[xlii]   Backlink Works. "The Evolution of PC CPUs: From Single Core to Multi-core Processors" Topics on SEO & Backlinks. Accessed December 10, 2023.

[xliii]  Wikipedia. "Multi-core processor" Wikipedia. Accessed December 10, 2023.

[xliv]  Casey, Henry T. "MacBook Air review (Intel, early 2020)" Tom's Guide, February 01, 2021. Accessed December 10, 2023.

[xlv]   GadgetVersus. "Intel Core i3-10100 GFLOPS Performance." Intel Core i3-10100 performance in the Geekbench 4 SGEMM benchmark. Accessed December 10, 2023.

[xlvi]  Skydio. "Skydio 2+ Enterprise" Accessed November 24, 2023.

[xlvii] NVIDIA. "Jetson TX2 for Next-Level Performance" Accessed November 24, 2023.

[xlviii] Wang, Chien-Yao, Alexey Bochkovskiy, and Hong-Yuan Mark Liao. "YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors." arXiv preprint arXiv:2207.02696 (2022).

[xlix]  Skydio. "X10 Technical Specifications" Accessed November 24, 2023.

[l]       NVIDIA. "Jetson Orin for Next-Gen Robotics" Accessed November 24, 2023.

[li]      Theta Informatics LLC. "Press Release – Naissance AI Model Prototype for Detection of the T-72 Main Battle Tank" November 25, 2023. Accessed November 25, 2023.

[lii]     Krupczak, Matthew. “Theta-Limited README” GitHub, Theta Limited, August 8, 2023.

[liii]    Crumley, Bruce. "Department of Defense to Pick Replicator Drones by Mid-December" DroneDJ, November 22, 2023.

[lv]     Stephen Fidler, "The Conflict in Ukraine Offers Old—and New—Lessons in 21st-Century Warfare" The Wall Street Journal, February 23, 2023.

[lvi]    "Military Technology in World War I" Library of Congress. Accessed November 24, 2023.

[lvii]   Storz, Dieter. "Artillery." 1914-1918-online: International Encyclopedia of the First World War. Accessed November 24, 2023.

The views expressed in this article and subsequent comments are those of the author(s) and do not necessarily reflect the official policy or position of the Australian Army, the Department of Defence or the Australian Government.
