By Giulia Saccone - AI, Cyberspace and Space Desk
Introduction
The 21st century is characterised by a redefinition of the international security scenario, and warfare is no exception. In this decade, the rapid evolution of the cybersphere, Artificial Intelligence (AI) and disinformation techniques has led a wide range of experts to focus on the impact of these technologies on human decision-making, giving rise to a new concept: cognitive warfare.
While there is no univocal definition, the one provided in 2024 by NATO's Allied Command Transformation (ACT), with input from the Science and Technology Organisation (STO), reads as follows:
"Cognitive Warfare integrates cyber, information, psychological, and social engineering capabilities. These activities, conducted in synchronisation with other Instruments of Power, can affect attitudes and behavior by influencing, protecting, or disrupting individual and group cognition to gain advantage over an adversary."
According to the 2025 Chief Scientist Research Report, cognitive warfare can be classified as a standalone grey-zone, military, and social issue. In this type of conflict, technology is the catalyst for its reach and effectiveness, but it is also part of the solution to counteract cognitive campaigns, along with a deeper understanding of the threat actors, the information environment, and the social implications of the phenomenon. This definition served as a notable input for further studies, and especially as legitimisation of this new warfare domain, allowing the development of a corresponding doctrine and shifting the focus of psychological operations from content to effects.
In understanding cognitive warfare, we must point out the differences with its sibling: information warfare. Information warfare focuses on controlling disinformation and misinformation flows in their various forms, taking advantage of technologies without changing the nature of war. Cognitive warfare, by contrast, aims at eliciting a psychological reaction, leveraging both technology and neuroscience and involving information and activities that can take place online and offline.1 This distinction prevents cognitive warfare from being misread as "a rebrand of an old concept" and helps us understand how technology is exploited.
How Cognitive Warfare is conducted
To evoke precise reactions, cognitive warfare triggers the pathways that manage cognitive load, such as cognitive biases and heuristics (i.e., predicting outcomes by interpreting data inductively or through analogies), as well as emotional responses2. Within the OODA loop (observe-orient-decide-act), cognitive warfare techniques target the orientation step: the moment when information is filtered, analysed, and interpreted through prior experiences, analytical and synthetic strategies, and cultural features.
To achieve this, antagonistic actors expose targets to vivid, repetitive, and biased information that distorts heuristic reasoning, especially during uncertain times, causing people to misjudge the likelihood of events based on superficial similarities, neglect objective facts, and make anxious or irrational decisions3. These effects are exacerbated by the anchoring bias: the first piece of information a person is exposed to conditions all subsequent evaluations.
This bias may appear similar to the priming effect, which operates differently. Priming consists of repeatedly exposing an individual to an association between a subject and a certain set of characteristics, which, through association mechanisms, profoundly shapes the perception of that subject. This mechanism is effective in manipulating public opinion, since repeated exposure to the association leads the general public to overreact against the alleged antagonist, even when the subject does not clearly display that attribute4.
The diffusion of false narratives can also exploit confirmation bias: our tendency to privilege information that confirms our initial beliefs. This is particularly useful in radicalisation processes, which elicit an emotional response in the subject and deepen cognitive divides among groups, eroding the social cohesion that malign actors then exploit against institutions5.
The event that marked the beginning of the exploitation of cognitive warfare, and which well exemplifies its functioning, is the 2014 Russian annexation of Crimea. Russian forces instrumentalised historical facts and legal ambiguities, and exacerbated political divides through support of the Russian minority, which was leveraged to erode social cohesion, undermine institutions, and confuse international public opinion on the interpretation of events6. While Russia's information campaigns are among the most studied examples, cognitive influence operations are conducted by a wide range of state and non-state actors.
Technological enablers of cognitive warfare: AI, ICT infrastructures and cyberattacks to undermine trust
The shift from hybrid to cognitive warfare is enabled by the rising centrality of Information and Communication Technology (ICT) infrastructures in social processes, and by the advent of AI-powered data mining, algorithmic profiling, and deepfakes. In cognitive warfare, ICT infrastructures serve as vectors for infodemic campaigns, taking advantage of the rising use of social media as a main source of information.
A fundamental characteristic of ICT infrastructures is the speed and variety of news diffusion. This creates an infodemic environment that gradually weakens cognitive processes through information overload, creating uncertainty and a consequent regression to heuristic reasoning ruled by biases. Fake news proliferates on social media thanks to its algorithm-friendly design, which allows it to be omnipresent and to function as an anchor for distorted facts and narratives, moulding the perception of individuals who are constantly drawn into this cognitively overloading environment7.
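The engagement-driven dynamic described above can be illustrated with a deliberately simplified sketch. The weights, signals, and posts below are invented for illustration and do not reflect any platform's actual ranking system; the point is only that when a feed optimises for engagement, emotionally charged content tends to out-rank sober reporting:

```python
# Toy illustration (NOT any platform's real algorithm): an engagement-driven
# feed ranker. Posts that provoke more reactions score higher and are shown
# more, regardless of their accuracy.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    shares: int      # hypothetical engagement signals
    comments: int
    reactions: int

def engagement_score(post: Post) -> float:
    # Invented weighting: shares count most because they spread content further.
    return 3.0 * post.shares + 2.0 * post.comments + 1.0 * post.reactions

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-engagement posts first; truthfulness is not a ranking signal.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", shares=4, comments=10, reactions=30),
    Post("Outrage-bait rumour", shares=50, comments=80, reactions=200),
])
print(feed[0].text)  # the rumour tops the feed
```

Because repetition drives anchoring, the same loop compounds: content that is surfaced more gathers more engagement, which surfaces it further.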
Anonymity also plays a role in cognitive load: verifying facts is time-consuming, which raises the cognitive cost of scrutiny, so users tend to avoid it8. Another effective instrument is the use of social media influencers. Thanks to their friendliness, relatability, interaction frequency, and capacity to create parasocial relations similar to friendships9, along with their ability to convey emotion-driven yet credible messages, they can become enablers of confirmation bias and tools of cognitive warfare, as in the case of Russian interference in the 2024 Romanian elections.
The reach of malicious influencers and the pervasiveness of bots and troll farms are maximised by the increasing sophistication of AI-generated material,10 which is rapidly blurring the distinction between real and AI-generated content and profiles. Bots and troll farms were among the first applications of AI in cognitive warfare: their characteristic inflammatory language, addressed directly at users, makes them optimal tools for controlling the narrative. They are often deployed during geopolitical events to influence public support for electoral outcomes, consultative democratic processes, direct democracy, policy decisions, alliances, and traditional media11.

Indeed, AI is a perfect force multiplier of cognitive warfare: relentless data mining makes it possible to target individuals based on their preferred content12 and personalities at superhuman speed.
These data are then operationalised to produce information that targets and triggers each individual's personal biases; combined with the cognitive effects of an infodemic environment, this impairs the effective processing of external data and pushes us towards instinctive reactions13.
The emergence of the metaverse could be the next enhancer of cognitive warfare: by further blurring the border between digital and physical reality, it allows the collection of biometric data through wearable devices. Malicious actors can use these data to build a more precise profile of a user's reactions to certain stimuli and to modify the scenario in which the user is immersed, creating another domain for psychological operations14. However, despite the attention this prospect has received in the research on cognitive warfare, studies suggest that such predictions are not consistent with the current maturity and diffusion of the technology.15
Conclusions
The article has aimed to trace how ICT infrastructures, social media and AI operate on our cognitive functions within the context of cognitive warfare, affecting how information is filtered, analysed, and interpreted through prior experiences, analytical and synthetic strategies, and cultural features. The emergence of this new dimension of conflict has caught the attention of scholars from psychology, international relations, war studies, and numerous other fields, with the NATO ACT definition contributing to the conceptualisation and legitimisation of the phenomenon.
The cognitive domain has increasingly been described as a stand-alone type of warfare situated within the grey-zone spectrum, involving both the military and civil society. It distinguishes itself from information warfare in that it aims not only to control information flows but also to manipulate information in order to distort our perception of events. One of the earliest examples is the 2014 Russian annexation of Crimea. In that context, as in the 2016 American elections, distorted information was disseminated by exploiting the characteristics of ICT infrastructures: anonymity and rapid diffusion, as well as the algorithmic dynamics of social media and the use of troll farms to influence individuals' perception of reality.
These examples illustrate how digital technologies have enabled the expansion of cognitive warfare, further amplified by data mining and AI-driven personalisation, which are progressively blurring the distinction between authentic and fabricated content.
As digital ecosystems become increasingly central to political and social life, cognitive warfare is likely to become a persistent feature of geopolitical competition. This raises important questions for democratic resilience, including the need for stronger media literacy, improved platform governance, and more effective mechanisms to detect and counter coordinated influence operations.
References:
1. Hung, Tzu-Chieh, and Tzu-Wei Hung. "How China's Cognitive Warfare Works: A Frontline Perspective of Taiwan's Anti-Disinformation Wars." Journal of Global Security Studies 7, no. 4 (2022): ogac016. https://doi.org/10.1093/jogss/ogac016; Marsili, Marco. "Cognitive Warfare in Historical Perspective: From Cold War Psychological Operations to AI-Driven Information Campaigns." Preprint, Social Sciences, December 17, 2025. https://doi.org/10.20944/preprints202512.1596.v1.
2. Hung and Hung (2022).
3. Kim, Daeun. "Psychological Mechanisms of Cognitive Warfare on Decision-Making." 27, no. 2 (2025): 249-66.
4. Ibidem.
5. Ibidem; Hung and Hung (2022); Deppe, Christoph, and Gary S. Schaal. "Cognitive Warfare: A Conceptual Analysis of the NATO ACT Cognitive Warfare Exploratory Concept." Frontiers in Big Data 7 (November 2024): 1452129. https://doi.org/10.3389/fdata.2024.1452129.
6. Marsili (2025); Danet (2019).
7. Datta, Pratim, Mark Whitmore, and Joseph K. Nwankpa. "A Perfect Storm: Social Media News, Psychological Biases, and AI." Digital Threats: Research and Practice 2, no. 2 (2021): 1-21. https://doi.org/10.1145/3428157; Ferreira, Vinícius Marques Da Silva, Carlos Alberto Nunes Cosenza, Alfredo Nazareno Pereira Boente, et al. "Guerra cognitiva nas redes sociais: ameaças, desafios e implicações para a sociedade" ["Cognitive Warfare on Social Networks: Threats, Challenges and Implications for Society"]. ARACÊ 7, no. 3 (2025): 14287-303. https://doi.org/10.56238/arev7n3-240.
8. Datta et al. (2021).
9. Kim and Kim (2022).
10. Fenstermacher, Laurie H., David Uzcha, Kathleen G. Larson, Christine A. Vitiello, and Stephen M. Shellman. "New Perspectives on Cognitive Warfare." In Signal Processing, Sensor/Information Fusion, and Target Recognition XXXII, edited by Lynne L. Grewe, Erik P. Blasch, and Ivan Kadar. SPIE, 2023. https://doi.org/10.1117/12.2666777.
11. Paziuk, Andrii, Dmytro Lande, Elina Shnurko-Tabakova, and Phillip Kingston. "Decoding Manipulative Narratives in Cognitive Warfare: A Case Study of the Russia-Ukraine Conflict." Frontiers in Artificial Intelligence 8 (September 2025): 1566022. https://doi.org/10.3389/frai.2025.1566022; Da Silva et al. (2025).
12. Marsili (2025); Fenstermacher et al. (2023).
13. Meriläinen, Niina. "Artificial Intelligence as a Tool in Cognitive Warfare on Digital Platforms." International Conference on AI Research 5, no. 1 (2025): 306-12. https://doi.org/10.34190/icair.5.1.4353.
14. Marsili, Marco. "Guerre à la Carte: Cyber, Information, Cognitive Warfare and the Metaverse." Applied Cybersecurity & Internet Governance 2, no. 1 (2023): 1-11. https://doi.org/10.60097/ACIG/162861; Fenstermacher et al. (2023).
15. Liu, Zhiguo, Yan Huang, Junyu Mai, Wei Li, Zhipeng Cai, and Yingshu Li. "Is the Metaverse Really Coming to Fruition? A Survey of Applied Metaverse and Extended Reality." High-Confidence Computing, December 2025, 100376. https://doi.org/10.1016/j.hcc.2025.100376.












