Detecting Information Warfare activities

The conduct of Information Warfare (IW)[1] and of operations in the Information Environment (IE)[2] is not a new phenomenon. Arguably the concepts, contemporary lexicon aside, are as old as conflict itself. IW operations are designed to have a cognitive effect, resulting in the successful imposition of one’s will. IW includes both words and actions; what matters is how those words or actions are processed, perceived or interpreted by the target audience, be they friendly, neutral or enemy.

As noted, the use of information to degrade, undermine, discredit, mislead or dupe is not a new occurrence. However, as a direct result of readily available technology, the extent to which information can affect audiences, whether the intended target or otherwise, is now significant. In the extreme, the rules-based global order and trust in democratic processes can be undermined. If in doubt, consider the Ukraine conflict or Russian interference in US elections.

IW influence activities interfere in public debate, make use of all available influence techniques and channels, exploit perceived societal vulnerabilities, and mimic legitimate channels such as journalism, public affairs, and trusted social media platforms. The Swedish Civil Contingencies Agency has identified six widely used information influence techniques: social and cognitive hacking (echo chambers), deceptive identities (shilling), technical exploitation (deep fakes), disinformation (fabrication, manipulation), malign rhetoric (straw man), and symbolic actions (leaking).

Detecting IW influence activities is not easy. Sometimes false information is used simply to cause confusion and chaos, by being injected randomly into conversations. At other times, the influence activity is part of a sophisticated and over-arching IW campaign, as has been conducted by countries such as China and Russia.

Three questions can assist in detecting false information: is the information designed to be deceptive, is it intended to cause harm, and how disruptive is it? To answer these questions, it is important to first understand the strategic narrative and the target audience.

A threat force IW influence activity that is part of a broader IW campaign will have its own strategic narrative and/or will harness the narratives of the target audience, manipulating them as required. Every society, and every sub-element of society, has its own unique narrative: stories used to understand or explain events. The target audience can be society at large, groups within society or individuals within society. Understanding which group the strategic narrative targets helps assess the likely impact, the possible originator and the purpose of the influence activity. Not an easy task, but a necessary start point.

In an IW influence activity, one of the most efficient ways to use misleading or inaccurate information to influence a decision is through harnessing the known biases of the target audience, exploiting their cognitive vulnerabilities. Biases often have a negative connotation, but the reality is all humans have biases which are a result of heuristics, and which can lead to both good and bad decision making.

The future Land Force can be confident that threat forces will have made an effort to understand its biases: as an organisation, as a sub-set of society and as individuals. To be in a position to answer the three aforementioned questions, the Land Force must have a level of self-awareness about its inherent biases and how they can be manipulated. There are many biases that affect the way we make decisions. Some common ones include availability, anchoring, over-confidence, confirmation bias and the bias blind-spot.

Consider the examples from the 2017 US intelligence report on the Russian influence activity relating to the US presidential election. The report found the interference included the use of bots, media-based disinformation, third-party intermediaries and trolls. The campaign used conflicting strategic narratives to exploit existing social divides in America. The strategic narratives, aimed at discrediting one candidate while strengthening the position of Russia’s preferred candidate, exploited many of the most common biases. For example, anecdotally the bias blind-spot led individuals on both sides to refuse to believe the manipulation was occurring, while confirmation bias saw individuals seizing on whichever side of the narrative they believed to be true. While it is easy to judge with the benefit of hindsight, over-confidence and the bias blind-spot are common for a reason, and any member of the Land Force is susceptible to similar manipulation.

If these IW influence activities were easy to detect, they would not be so successful.

[1] IW is defined here as encompassing Information Activities (IA), Cyber and Electromagnetic Activities (CEMA), and operations in the space domain.

[2] IE is defined here as the aggregate of individuals, organisations, or systems that collect, process, or disseminate information.

The views expressed in this article and subsequent comments are those of the author(s) and do not necessarily reflect the official policy or position of the Australian Army, the Department of Defence or the Australian Government.