March 11, 2026

AI, ICT infrastructures, and social media in today’s conflicts

By Giulia Saccone - AI, Cyberspace and Space Team

Introduction

The twenty-first century seems to be characterised by a redefinition of the international security scenario, and warfare is no exception. In this decade, the rapid evolution of the cybersphere, Artificial Intelligence (AI) and disinformation techniques has led a wide range of experts to focus on the impact of these technologies on human decision-making. This has led to the emergence of a new concept: cognitive warfare.

While there is no univocal definition, the one provided in 2024 by NATO's Allied Command Transformation (ACT), with input from the Science and Technology Organisation (STO), is as follows:

"Cognitive Warfare integrates cyber, information, psychological, and social engineering capabilities. These activities, conducted in synchronisation with other Instruments of Power, can affect attitudes and behavior by influencing, protecting, or disrupting individual and group cognition to gain advantage over an adversary."

According to the 2025 Chief Scientist Research Report, cognitive warfare can be classified as a standalone grey-zone, military and social issue. In this type of conflict, technology is the catalyst of its reach and effectiveness, but it is also part of the solution to counteract cognitive campaigns, along with a deeper understanding of the threat actors, the information environment, and the social implications of the phenomenon. This definition served as a notable input for further studies, and especially as a legitimisation of this new warfare domain, allowing the development of a corresponding doctrine and shifting the focus of psychological operations from content to effects.

In understanding cognitive warfare, we must point out the differences with its sibling: information warfare. Information warfare focuses on controlling disinformation and misinformation flows in their various forms, taking advantage of technologies without changing the nature of war. Cognitive warfare, by contrast, aims at eliciting a psychological reaction, leveraging both technologies and neuroscience and involving information and activities that can take place online and offline.1 This different focus prevents the misinterpretation of cognitive warfare as "a rebrand of an old concept" and helps us understand how technology is exploited.

How Cognitive Warfare is conducted

To evoke precise reactions, cognitive warfare triggers pathways that manage cognitive load, such as cognitive biases and heuristics (i.e., predicting outcomes by interpreting data inductively or through analogies), as well as emotional responses2. Taking into account the OODA loop (observe-orient-decide-act), cognitive warfare techniques affect the orientation step: the moment when information is filtered, analysed, and interpreted through prior experiences, analytical and synthetic strategies, and cultural features.

To achieve this, antagonistic actors expose targets to vivid, repetitive, and biased information, distorting heuristic reasoning, especially during uncertain times, causing people to misjudge the likelihood of events based on superficial similarities, neglect objective facts, and make anxious or irrational decisions3. These effects are exacerbated by the anchoring bias: the tendency for the first piece of information encountered to condition all subsequent evaluations.

This bias might appear similar to the priming effect, which, however, has a different outcome. Priming consists of repeatedly exposing an individual to an association between a subject and a certain set of characteristics, which, through association mechanisms, profoundly shapes the perception of that subject. The mechanism is effective in manipulating public opinion: repeated exposure to the association between a characteristic and a subject leads the general public to overreact against the alleged antagonist, even when the subject does not clearly present that attribute4.

The diffusion of false narratives can also exploit confirmation bias: our tendency to privilege information that confirms our initial beliefs. This is particularly useful in radicalisation processes, which elicit an emotive response in the subject and deepen cognitive divides among groups, eroding a social cohesion that malign actors can then turn against institutions5.

The event that marked the beginning of the exploitation of cognitive warfare, and that well exemplifies its functioning, is the 2014 Russian annexation of Crimea. There, Russian forces instrumentalised historical facts and legal ambiguities and exacerbated the political divide through support for the Russian minority, which was leveraged to erode social cohesion, undermine institutions and confuse international public opinion about the interpretation of events6. While Russia's information campaigns are among the most studied examples, cognitive influence operations are conducted by a wide range of state and non-state actors.


Technological enablers of cognitive warfare: AI, ICT infrastructures and cyberattacks to undermine trust

The shift from hybrid to cognitive warfare is enabled by the rising centrality of Information and Communication Technology (ICT) infrastructures in social processes, and by the advent of AI-powered data mining, algorithmic profiling, and deepfakes. In cognitive warfare, ICT infrastructures are deployed as vectors for infodemic campaigns, taking advantage of the rising use of social media as a main source of information.

A fundamental characteristic of ICT infrastructures is the speed and variety of news diffusion. This creates an infodemic environment that gradually weakens cognitive processes through information overload, generating uncertainty and a consequent regression to heuristic reasoning ruled by biases. Fake news proliferates on social media thanks to its algorithm-friendly design, which allows it to be omnipresent and to function as an anchor for distorted facts and narratives, moulding the perception of individuals who are constantly drawn into this cognitively overloading environment7. Algorithms also foster social polarisation and ideological manipulation: a recent study of X has demonstrated that its feed algorithm boosts the engagement of inflammatory posts, expanding the role of social networks from enablers of cognitive warfare to active players8.
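The dynamic described above can be made concrete with a toy sketch. The following is purely illustrative and does not reproduce any platform's real ranking system: the post labels, weights, and scoring function are all invented for the example. It shows only the structural point that a feed ranked purely by predicted engagement will surface whichever content attracts the most reactions, regardless of its quality.

```python
# Toy illustration of engagement-only feed ranking (NOT any platform's
# actual algorithm; weights and data are invented for this sketch).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    replies: int
    reshares: int
    inflammatory: bool  # hypothetical label, used only to make the point visible

def engagement_score(p: Post) -> float:
    # Replies and reshares weighted above likes, a common heuristic in
    # engagement-based ranking; the exact weights here are arbitrary.
    return 1.0 * p.likes + 3.0 * p.replies + 5.0 * p.reshares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement first.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    Post("Measured policy analysis", likes=120, replies=10, reshares=5, inflammatory=False),
    Post("Outrage bait about group X", likes=90, replies=80, reshares=60, inflammatory=True),
    Post("Local community news", likes=60, replies=5, reshares=2, inflammatory=False),
]

top = rank_feed(posts)[0]
print(top.text)  # the outrage post wins despite fewer likes
```

Because the outrage post draws far more replies and reshares, it outscores the calmly argued post even with fewer likes: the objective function, not any editorial intent, is what promotes it.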

Anonymity also adds to cognitive load: verifying facts posted by unknown sources is time-consuming, which raises the cognitive cost of scrutiny and leads users to avoid it9. Another effective instrument is the use of social media influencers. Thanks to their friendliness, relatability, interaction frequency, and capacity to create parasocial relations similar to friendships10, along with their ability to convey emotion-driven yet credible messages, they can become enablers of confirmation bias and tools of cognitive warfare, as in the case of Russian interference in the 2024 Romanian elections.

The reach of malicious influencers and the pervasiveness of bots and troll farms are maximised by the increasing sophistication of AI-based content,11 which is rapidly blurring the distinction between real and AI-generated content and profiles. Bots and troll farms were among the first applications of AI in cognitive warfare: their characteristic inflammatory language, directed straight at users, makes them optimal tools for controlling the narrative. They are often deployed around geopolitical events to influence public support for electoral outcomes, consultative democratic processes, direct democracy, policy decisions, alliances, and traditional media12.

Image by emerson23work on Pixabay

Indeed, AI is a perfect force multiplier of cognitive warfare. Relentless data mining enables the targeting of individuals based on their preferred content13 and personalities at superhuman speed. These data are then operationalised to produce information that elicits each individual's personal biases and, through the cognitive effect of an infodemic environment, impairs the effective processing of external data, pushing us towards instinctive reactions14.
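A minimal sketch of this preference-based targeting logic follows. Everything in it is hypothetical (the topics, the message variants, and the reduction of a user to topic counts); it only illustrates the structural pattern of profiling-then-tailoring that the paragraph describes, stripped of the machine-learning scale that makes the real phenomenon dangerous.

```python
# Minimal, hypothetical sketch of preference-based message targeting:
# a user's engagement history is reduced to topic counts, and the
# narrative variant matching their dominant topic is selected.
from collections import Counter

def build_profile(engagements: list[str]) -> Counter:
    # Each entry is the topic of a post the user interacted with.
    return Counter(engagements)

def pick_message(profile: Counter, variants: dict[str, str]) -> str:
    dominant_topic, _count = profile.most_common(1)[0]
    # Fall back to a generic message when no tailored variant exists.
    return variants.get(dominant_topic, variants["generic"])

history = ["economy", "immigration", "economy", "economy", "sports"]
variants = {
    "economy": "Narrative framed around economic anxiety",
    "immigration": "Narrative framed around border security",
    "generic": "Broad-appeal narrative",
}

print(pick_message(build_profile(history), variants))
```

In real influence operations this selection step is driven by large-scale models rather than a dictionary lookup, but the pipeline shape (harvest behaviour, infer a profile, serve the bias-matching variant) is the same.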

The emergence of the metaverse could be the next enhancer of cognitive warfare: by further blurring the border between digital and physical reality, it allows the collection of biometric data through wearable devices. Malicious actors can use these data to build a more precise profile of a user's reactions to certain stimuli and to modify the scenario in which the user is immersed, creating another domain for psychological operations15. However, despite the attention this prospect has received in research on cognitive warfare, studies suggest that such predictions are not consistent with the current maturity and diffusion of the technology.16


Conclusions

This article has aimed to trace how ICT infrastructures, social media and AI operate on our cognitive functions within the context of cognitive warfare, affecting how information is filtered, analysed, and interpreted through prior experiences, analytical and synthetic strategies, and cultural features. The emergence of this new dimension of conflict has caught the attention of scholars from psychology, international relations, war studies, and numerous other fields, with the NATO ACT definition contributing to the conceptualisation and legitimisation of the phenomenon.

The cognitive domain has increasingly been described as a stand-alone type of warfare that situates itself within the grey-zone spectrum, involving both the military and civil society. It distinguishes itself from information warfare since it aims not only to control information flows but also to manipulate information in order to distort our perception of events. One of the earliest examples is the 2014 Russian annexation of Crimea. In this context, as well as in the 2016 American elections, distorted information was disseminated by exploiting the characteristics of ICT infrastructures: anonymity and rapid diffusion, as well as the algorithmic dynamics of social media and the use of troll farms to influence individuals' perception of reality.

These examples illustrate how digital technologies have enabled the expansion of cognitive warfare, further amplified by data mining and AI-driven personalisation, which are progressively blurring the distinction between authentic and fabricated content.

As digital ecosystems become increasingly central to political and social life, cognitive warfare is likely to become a persistent feature of geopolitical competition. This raises important questions for democratic resilience, including the need for stronger media literacy, improved platform governance, and more effective mechanisms to detect and counter coordinated influence operations.


References:

  1. Hung, Tzu-Chieh, and Tzu-Wei Hung. "How China's Cognitive Warfare Works: A Frontline Perspective of Taiwan's Anti-Disinformation Wars." Journal of Global Security Studies 7, no. 4 (2022): ogac016. https://doi.org/10.1093/jogss/ogac016.
     Marsili, Marco. "Cognitive Warfare in Historical Perspective: From Cold War Psychological Operations to AI-Driven Information Campaigns." Preprint, Social Sciences, December 17, 2025. https://doi.org/10.20944/preprints202512.1596.v1.
  2. Hung and Hung (2022).
  3. Kim, Daeun. Psychological Mechanisms of Cognitive Warfare on Decision-Making. 27, no. 2 (2025): 249–66.
  4. Ibidem.
  5. Ibidem.
     Hung and Hung (2022).
     Deppe, Christoph, and Gary S. Schaal. "Cognitive Warfare: A Conceptual Analysis of the NATO ACT Cognitive Warfare Exploratory Concept." Frontiers in Big Data 7 (November 2024): 1452129. https://doi.org/10.3389/fdata.2024.1452129.
  6. Marsili (2025).
     Danet (2019).
  7. Datta, Pratim, Mark Whitmore, and Joseph K. Nwankpa. "A Perfect Storm: Social Media News, Psychological Biases, and AI." Digital Threats: Research and Practice 2, no. 2 (2021): 1–21. https://doi.org/10.1145/3428157.
     Ferreira, Vinícius Marques Da Silva, Carlos Alberto Nunes Cosenza, Alfredo Nazareno Pereira Boente, et al. "Guerra Cognitiva nas Redes Sociais: Ameaças, Desafios e Implicações para a Sociedade" ["Cognitive Warfare on Social Networks: Threats, Challenges and Implications for Society"]. ARACÊ 7, no. 3 (2025): 14287–303. https://doi.org/10.56238/arev7n3-240.
  8. Gauthier, Germain, Roland Hodler, Philine Widmer, and Ekaterina Zhuravskaya. "The Political Effects of X's Feed Algorithm." Nature, ahead of print, February 18, 2026. https://doi.org/10.1038/s41586-026-10098-2.
  9. Datta et al. (2021).
  10. Kim and Kim (2022).
  11. Fenstermacher, Laurie H., David Uzcha, Kathleen G. Larson, Christine A. Vitiello, and Stephen M. Shellman. "New Perspectives on Cognitive Warfare." In Signal Processing, Sensor/Information Fusion, and Target Recognition XXXII, edited by Lynne L. Grewe, Erik P. Blasch, and Ivan Kadar. SPIE, 2023. https://doi.org/10.1117/12.2666777.
  12. Paziuk, Andrii, Dmytro Lande, Elina Shnurko-Tabakova, and Phillip Kingston. "Decoding Manipulative Narratives in Cognitive Warfare: A Case Study of the Russia-Ukraine Conflict." Frontiers in Artificial Intelligence 8 (September 2025): 1566022. https://doi.org/10.3389/frai.2025.1566022.
     Da Silva et al. (2025).
  13. Marsili (2025).
     Fenstermacher et al. (2023).
  14. Meriläinen, Niina. "Artificial Intelligence as a Tool in Cognitive Warfare on Digital Platforms." International Conference on AI Research 5, no. 1 (2025): 306–12. https://doi.org/10.34190/icair.5.1.4353.
  15. Marsili, Marco. "Guerre à la Carte: Cyber, Information, Cognitive Warfare and the Metaverse." Applied Cybersecurity & Internet Governance 2, no. 1 (2023): 1–11. https://doi.org/10.60097/ACIG/162861.
      Fenstermacher et al. (2023).
  16. Liu, Zhiguo, Yan Huang, Junyu Mai, Wei Li, Zhipeng Cai, and Yingshu Li. "Is the Metaverse Really Coming to Fruition? A Survey of Applied Metaverse and Extended Reality." High-Confidence Computing, December 2025, 100376. https://doi.org/10.1016/j.hcc.2025.100376.

May 31, 2022

Lisa Gaufman on Russia’s Information Warfare in Ukraine

Dr. Lisa Gaufman received her PhD from the University of Tübingen, Germany, in 2016. She then joined the Institute for Intercultural and International Studies at the University of Bremen as a postdoctoral research fellow. She is the author of "Security Threats and Public Perception: Digital Russia and the Ukraine Crisis" (Palgrave, 2017). Her research interests centre around the intersection of political theory, international relations, media and cultural studies.

In this podcast interview, Dr. Lisa Gaufman explores in depth the development of Russia's strategy of information warfare in Ukraine.

Interviewing Team: Fabrizio Napoli and Davide Gobbicchi.

May 9, 2022

Psychological Warfare in a Changing World

By: Danilo delle Fave and Marco Verrocchio.

"Supreme excellence consists in breaking the enemy's resistance without fighting." This sentence of Sun Tzu well summarizes the meaning of psychological warfare. Psychological Operations (PSYOPS) include all the psychological techniques used to influence the behaviour, emotions and reasoning of a target (governments, armies, companies, etc.) in order to affect its actions. Three types of PSYOPS are distinguishable: tactical, operational and strategic. While tactical PSYOPS are used during a conflict, operational PSYOPS are also used before and after a military operation, to assist the commander's plans.

At the same time, strategic PSYOPS comprise all activities a government can pursue to influence foreign attitudes, perceptions and behaviour. In other words, PSYOPS are essentially conceived as support for other operations, trying to maximize effect with the least possible effort. Many communication tools are used in PSYOPS, from leaflets to social networks; efficient communication and practical messages are therefore essential to the outcome of the operations. Although psychological warfare has been used since ancient times, it became more effective once communication technology allowed constant persuasion of the enemy or its civilian population.

Not surprisingly, the first large-scale modern PSYOPS occurred during WWII through radio stations. Axis and Allied forces created specific programs broadcast regularly on FM and AM frequencies to discourage the enemy's population and spread fake news. Some of these operations were effective. For example, Radio London, a BBC program, successfully sent messages to the Italian Resistance in occupied Italy. After WWII, PSYOPS began to be used extensively. In Korea, a special division of the US Command was created to convince soldiers to lay down their weapons and to prevent South Koreans from supporting the enemy. PSYOPS units worked closely with the Marines in the First Gulf War, achieving significant results. For instance, the 9th PSYOP Battalion facilitated the surrender of 1,405 Iraqi troops besieged on an island simply by sending helicopters with aerial loudspeakers.

Image Source: https://upload.wikimedia.org/wikipedia/commons/thumb/0/01/US_Army_soldier_hands_out_a_newspaper_to_a_local_Aug_2004.jpg/1280px-US_Army_soldier_hands_out_a_newspaper_to_a_local_Aug_2004.jpg

Since the end of the Cold War, psychological warfare has changed dramatically, and PSYOPS have become instruments of NATO. According to the Allied Joint Doctrine for Psychological Operations (NATO PSYOPS), they are considered: "Planned psychological activities using methods of communications and other means directed to approved audiences in order to influence perceptions, attitudes and behaviour, affecting the achievement of political and military objectives". This definition fits better with the current status of PSYOPS, and it was developed during "the war on terror", when coordination with US allies was fundamental. Nowadays, the borders among tactical, operational and strategic PSYOPS are blurring due to the global scale and simultaneity of information. Moreover, in current high-intensity warfare scenarios, tactical PSYOPS are more dangerous than strategic and operational ones due to more precise guided weapons. A solution to the drawbacks of tactical PSYOPS is coming from technology: in the war in Ukraine, for example, leaflets have been replaced by SMS text messages and audio messages, with Orlan-10 drones acting like jammers to spread information directly to the cellphones of Ukrainian soldiers.

In the battle to conquer "hearts and minds", both the United States and the Russian Federation have developed proper doctrines for PSYOPS. In the U.S. doctrine, psychological operations are a particular type of Information Operations, which represent the integrated employment of information-related capabilities, in concert with other lines of operation, to influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries while protecting one's own. These information-related capabilities can be carried out as military information support operations, cyberspace operations, electronic warfare, military deception, civil-military operations and even public affairs. In this framework, the U.S. doctrine states that PSYOPS are driven primarily by the effort to influence foreign target audiences through the development of messages and actions devised to change those groups' attitudes and behaviors.

Psychological operations are therefore carried out as military information support operations that can also degrade the enemy's combat power, reduce civilian interference, minimize collateral damage, and increase the population's support for operations. The key aspect of U.S. doctrine is the dissemination of persuasive messages based on true information: false information is considered a double-edged weapon that risks being counter-productive to the long-term credibility and success of psychological operations.

The Russian doctrine instead adopts a different approach: it does not consider the use of false information a liability, and it prescribes the implementation of PSYOPS even in times of peace. Building on the Soviet tradition of psychological operations, the so-called spetspropaganda, the Russian doctrine has developed the so-called "new generation warfare". It is centred around the idea that current threats to the Russian Federation come from the "information sphere" and calls for the strengthening of Russian capabilities in psychological operations. As in Soviet times, the two pillars of information warfare are reflexive control and active measures. Reflexive control is the manipulation of the enemy's decision-making process by altering key factors in the adversary's perception of the world, causing him to choose the actions most advantageous to Russian objectives. Active measures are all kinds of operations that aim to influence, undermine, disrupt and discredit targeted countries, their institutions and non-governmental organisations.

Moreover, the Russian doctrine considers information warfare a crucial element of modern warfare and has therefore promoted different types of PSYOPS. This explains why Russian information warfare and political warfare clearly overlap: despite the prominence of non-military actors such as the SVR and the FSB in political warfare, deceit and propaganda, the GRU, the military secret service, is also a major player in the Kremlin's strategy of hybrid warfare. One of the most paradigmatic ways in which Russian forces organise PSYOPS is the troll farm, through which they exploit social media algorithms and key demographics vulnerable to their propaganda, obtaining a cheap way to carry out these kinds of operations.

(Graphic from National Security Analysis Department, "Little Green Men": A Primer on Modern Russian Unconventional Warfare, Ukraine 2013-2014, Assessing Revolutionary and Insurgent Strategies Study, 18.)