Disinformation as a Means of Hybrid Warfare
In February 2020, Tedros Adhanom Ghebreyesus, Director-General of the World Health Organization (WHO), warned that the COVID-19 pandemic was accompanied by an "infodemic": while efforts were being made around the world to contain the spread and health consequences of the virus, a wave of false and misleading information spread alongside it. More recently, Russia has made extensive use of disinformation campaigns as a tool in its war against Ukraine, both to maintain domestic support for the war and to undermine the Ukrainian government and break international support for Ukraine. China likewise uses targeted disinformation campaigns, primarily to improve its standing in the world.
As a result, dealing with disinformation has risen on the political agenda and has become more visible to the general public. In the European Union in particular, there is a growing awareness of the potential consequences of disinformation. Disinformation is perceived not only as a problem of the digital space but also as a security challenge, because it is frequently employed as a method of hybrid warfare. To understand this, it is crucial to know more about disinformation campaigns: targeted, deliberately disseminated disinformation takes the form of planned campaigns, an accumulation of coordinated information operations that promise a certain benefit, for example for strategic or political purposes. Such campaigns seek to influence public and political opinion, manipulate emotions, create confusion, and sow mistrust; they aim to create chaos, disunity, and discord not only in the digital sphere but in society itself. Beyond that, they aim to erode political certainties and dissolve a socially shared concept of truth. In some cases, the reinforcement of existing sentiments can have serious consequences, including the use of violence. Disinformation campaigns are also used to disseminate government or partisan propaganda, attack the opposition, conduct smear campaigns, deflect or divert discussion or criticism away from important issues, and suppress the participation of certain individuals through personal attacks or harassment. They are therefore a tool frequently used by authoritarian states. Disinformation campaigns can be short-term, for example to influence elections, or long-term, for example to undermine trust in media or institutions.
It is important to recognize that many of the issues at the heart of such campaigns, such as polarization and distrust of media or government, predate social media and even the Internet itself. The problems caused by disinformation campaigns are therefore linked to broader political, social, civic, or media issues. Likewise, the more divided a society is, the more vulnerable it is to disinformation campaigns. Disinformation campaigns thus do not create such divisions; rather, they exploit existing conflicts of opinion and tensions. A disinformation campaign therefore often builds on a "rational core" of plausible, verifiable information or shared understandings, which is then reshaped through disinformation. Elements of such campaigns can include the addition of fabricated details, visual misinformation in the form of distorted images and videos, the use of rhetorical questions, and the deployment of bots or large numbers of purchased followers to convey a sentiment that does not reflect reality. The exploitation of emotions such as anger or fear is another classic tool. Crisis situations, which are characterized by significant confusion, are particularly conducive to this: disinformation can spread especially quickly in such a context. At the same time, trust in the media is declining worldwide, and a growing proportion of the population uses social media as a source of news and political information, so the dissemination of news is becoming increasingly independent of professional journalism. This makes Western democracies, and the institutions upon which they are based, particularly vulnerable to such attacks because of their pluralistic and open societies and the freedom of expression that comes with them.
In order to properly understand disinformation in the context of hybrid warfare, it is essential to explain hybrid warfare itself: hybrid warfare describes the combination of different means, types, and strategies of waging war in order to achieve strategic political objectives. A wide range of means is used to exert influence, both overtly and covertly, militarily as well as through civil, diplomatic, economic, propagandistic, and informal channels. Accordingly, the traditional division into the classical categories of military and civilian dissolves. On the one hand, hybrid warfare can describe the conduct of an armed conflict with the "admixture" of other aspects; on the other hand, military acts of war or armed conflict are not strictly necessary at all. This leads to a blurring of the boundaries between war and peace. At the same time, the mixing of military and civilian means makes it impossible to draw a clear line between internal and external security, and the clear categories of "friend," "enemy," and "neutral" dissolve. Ambiguity is thus a characteristic element of hybrid warfare and is deliberately used as a weapon. Plausible deniability of responsibility for specific military actions, up to and including participation in the war as a whole, is a crucial factor in hybrid warfare; especially in cyberspace, covering one's tracks and denying actions is easy.
The increasing combination and orchestrated use of such means, the growing relevance of civilian measures such as disinformation campaigns, and the associated expansion of the battlefield give hybrid warfare in the 21st century a new quality and make cyberspace the main domain of hybrid threats. Of course, disinformation campaigns are not a new phenomenon; they have always been an aspect of foreign policy and especially of warfare. Today, however, in the "information age" of the 21st century, disinformation exists in an unprecedented form. Through social media platforms such as Twitter and Facebook in particular, disinformation can be produced easily and spread with great reach. Digital technologies have thus fundamentally changed both access to information and the form it takes. As a result, the Internet and social media in particular are seen as "key game-changers in the weaponization of information."
A study by Lazer et al. found that in the months leading up to the 2016 U.S. election, Americans encountered an average of at least one to three stories from disinformation sources. While the effects of disinformation are still largely unknown and contested, there are numerous cases in which disinformation has had serious consequences. Although disinformation campaigns are a security challenge, "classic" military means of security policy are not an adequate response. Because such campaigns exploit gray areas below the threshold of clear acts of war, a unified response by the international community is difficult to achieve. This raises the urgent question of how strategies against disinformation campaigns can be implemented in the practice of democracies. Many different approaches to countering disinformation campaigns already exist, but no sufficiently successful strategy has yet been found; a "model solution" does not appear to be within reach. One way to deal with hybrid threats is to rely heavily on the concept of resilience: states and their societies should acquire the ability to recover from attacks as quickly as possible, with the goal of creating structures of reduced vulnerability. Nevertheless, it is clear that further and extensive research is needed to understand the challenge of disinformation campaigns and to find a targeted and appropriate way of dealing with this phenomenon, which is likely to increase rather than decrease in the future, especially in light of emerging technologies such as AI.