Emerging Technologies – The Primal Strategic Challenge
To understand technological change, look for the enduring patterns and concepts
When Facebook was launched almost 15 years ago with the goal of connecting people, no one could have imagined that it would become a tool for spreading hate speech or undermining elections. Yet this is precisely what happened. The problem was greatly amplified by the birth of the iPhone a few years later, with its encryption, portability, and selection of downloadable social media ‘apps’. While these problems have been evident for some time, and have been extensively studied and debated, lasting solutions have remained elusive.
This example illustrates the primal challenge of dealing with technological change in the 21st century: the absence of a reference point. How do we comprehend, contextualise, and conceptualise the changes wrought by emerging technologies, which are converging and being applied in completely unforeseen ways? These changes are best described as ‘discontinuous’; that is, not only does the past offer no meaningful guidance for the future, but the effects of a particular technology also cannot be predicted with confidence. This makes responding to such changes exceedingly difficult, at a time when the stakes for organisations could not be higher.
How to understand and respond to these issues is a critically important consideration for the Australian military. Indeed, its long-term effectiveness depends on how well it does so. While the future is hard to predict, the example at the start of this article illustrates several ‘truisms’ associated with emerging technologies that can act as a reference point for military planners:
- Technological change is ‘labour-augmenting’ and ‘skills-biased’. Innovations such as robotics, artificial intelligence, nanotechnology, big data, and cloud computing are transforming the industrial landscape and increasing the productive capacity of individuals, allowing fewer people to produce more output. Additionally, those with the skills to work these technologies are attracting higher salaries, while those with lower skills are seeing their wages stagnate or decline. The resulting ‘jobless growth’ and income inequality are causing economic dislocation for millions of people, and have helped fuel economic nationalism and ethnic radicalism around the world. All this is occurring while these technologies are still in their early stages; their long-term implications (how they will evolve and be applied, what opportunities and challenges they will create, and how they will be controlled) cannot yet be fully comprehended.
- Many modern technologies are highly ‘democratised’: they are diffused widely throughout the world. Their costs typically fall over time, and they can be accessed by anyone with the means to pay for them. Additionally, many new technologies follow the ‘network economics’ model, in which a technology’s value increases as more people adopt it (think of WhatsApp). Due to these factors, the incentive to spread these technologies as widely as possible is very strong, and any advantage arising from possessing them is likely to be transient at best.
- Problems created by technologies cannot be defined in technological terms alone; they have socio-political, economic, environmental, and security dimensions as well. Solutions depend critically on how the problems are themselves defined, and how widely those definitions are shared and accepted. This is a highly politicised process: how problems are defined depends on the agendas and incentives of those defining them. In other words, ‘where you stand depends on where you sit’, and if contemporary social problems teach us anything, it is that people don’t always sit together. This was aptly illustrated by the controversy around Section 18C of the Racial Discrimination Act: its stated intent of curtailing hate speech invited criticism that it curtailed free speech. At a social level, judgments about what constitutes appropriate free speech remain highly subjective, even as the internet and communication technologies continue to expand and test the limits of what is considered appropriate.
- Technologies create ‘path dependence’. Path dependence simply means that history matters: decisions made at a particular time are shaped and limited by past decisions, even when the circumstances behind those past decisions are no longer relevant. In other words, it creates ‘lock-in effects’, most commonly illustrated by the adoption of particular technologies or standards. These are widely diffused and imitated over time, and give rise to associated skills, training, and processes, as well as physical and network infrastructure. Changing them later in response to changed circumstances may be infeasible due to excessively high costs, shortages of alternative skills, and ‘version incompatibility’. Path dependence has particular salience in military capability acquisition, where future acquisitions may be driven less by considerations of effectiveness and cost, and more by whether they can ‘talk to’ existing systems. It becomes even more dangerous if it locks the military into outdated modes of thinking, guarding against threats that no longer exist. As the celebrated economist John Maynard Keynes said, “The difficulty lies not so much in developing new ideas as in escaping from old ones.”
- As the case between the FBI and Apple illustrates, society’s laws and norms often lag behind the technologies themselves, as does its understanding of their implications. Technologies that are exploited for violent purposes are often controlled by large corporations and protected by advanced encryption and intellectual property laws. These technologies, and the global brands they underpin, are highly valued assets that corporations assiduously create and aggressively defend. To this end, corporations may even defy security services and courts seeking to investigate, disrupt, and prosecute violent activities that technology facilitates. If forced to cooperate, for example by decrypting their technologies and sharing clients’ personal data, corporations could respond by invoking not just relevant privacy laws, but also the fundamental property rights principle on which western capitalism itself rests. Security and defence agencies can push for new laws or new skills to respond to these problems, but that requires time, effort, and resources. In times of resource scarcity and hyper-partisan political discourse, this is easier said than done.
The foregoing discussion highlights some fundamental properties of emerging technologies, with a self-evident implication for military planning: do not look for lasting advantage based on technology. To deal effectively with the opportunities and challenges that technology brings, planners must look beyond STEM and into the humanities and social sciences (HASS). Plans for the future should include provisions for periodic review and revision, but given the highly contested process by which problems are defined, this is easier said than done; military planners should be prepared to participate in these contests, which will require substantial resources and even skills diversification. Before acquiring technologies, they should consider how ‘future-proof’ those technologies will be, and whether they will constrain future choices and decisions. Finally, the problem with technology is not that others will use it against us, but that our own societal institutions and values may hinder our timely responses when that happens.
The views expressed in this article and subsequent comments are those of the author(s) and do not necessarily reflect the official policy or position of the Australian Army, the Department of Defence or the Australian Government.