There is currently a great deal of war-related noise, with colossal state budgets poured into new weapons: aircraft, drones, aircraft carriers, missiles and so on. Yet a far subtler, invisible war is being waged at the level of the human mind itself. The mind is like the captain of a ship: it sets the course and makes the decisions. Victory therefore no longer plays out only on the battlefield, but at the very heart of our mental processes. How is this possible? Quite simply because we now live in the information society. We are flooded with information, and that information shapes how we see the world and how we make decisions. We can now speak of cognitive warfare, a little-known term used mainly in military and geopolitical circles. It designates a new dimension of modern conflict, in which the aim is no longer merely to dominate through military, economic or informational power, but to act directly on the perceptions, emotions and cognitive capacities of individuals and societies.
Let us look at what cognitive warfare is, how it can become an issue within psychosocial risks, and how we can strengthen ourselves as warriors in this little-discussed war.
I. What is cognitive warfare?
1. Definition
Cognitive warfare is the set of techniques and strategies aimed at influencing, manipulating, disrupting or controlling human mental processes (perceptions, judgements, memory, emotions, behaviours) in order to gain advantage in a conflict. It is sometimes described as a weapon of the brain, because the battlefield becomes the human psyche itself.
2. Origins and context
It stands in continuity with psychological warfare and information warfare (propaganda, fake news, cyber-influence).
Advances in neuroscience, AI, big data and the behavioural sciences now offer new tools to target individuals and crowds. Institutions such as NATO and certain think tanks now speak of “cognitive warfare” as a sixth domain of warfare (after land, sea, air, space and cyberspace).
3. Means used
– Disinformation and propaganda: massive use of social networks, deepfakes, fake news.
– Psychological micro-targeting: exploiting personal data to send ultra-targeted messages tailored to personality and emotions.
– Over-information and confusion: drowning the public in a contradictory flow to weaken its ability to discern truth from falsehood.
– Exploitation of cognitive biases: storytelling, shocking images, repetition, polarisation.
– Emerging technologies: conversational AIs, bots, augmented/virtual reality, and even military research into the impact of neuro-technological signals.
4. Objectives
– Destabilise trust: in institutions, the media, science, democracy.
– Divide societies: polarise opinion, create internal fractures.
– Influence decisions: electoral, economic, diplomatic.
– Weaken an adversary’s will to resist without firing a single shot.
5. Current examples
– Disinformation campaigns related to the Covid-19 pandemic.
– Use of troll armies and bot farms in conflicts (Ukraine, Syria, Taiwan, etc.).
– Electoral influence through micro-targeting (e.g., the Cambridge Analytica case).
– Proliferation of deepfakes in politics and the media.
6. Ethical and societal issues
– Risk of mass manipulation and loss of individual autonomy.
– Weakening of democracies in the face of authoritarian regimes.
– Moral dilemma: how far can psychology and neuroscience be used as weapons?
– The need to educate for critical thinking and cognitive resilience (media literacy, philosophy, soft skills).
In summary: cognitive warfare is the ultimate evolution of information warfare. It is no longer only information that is targeted, but the human mind itself.
II. Difference between information warfare, psychological warfare and cognitive warfare
1. Psychological warfare
An ancient practice, attested since Antiquity (Sun Tzu already described it).
Aim: to weaken the adversary’s morale, reducing their will to fight.
Classical means:
– Propaganda (posters, radio, leaflets during the world wars).
– Rumours, deliberate deception, threats, the spreading of fear.
– Speeches intended to demoralise or influence.
Target: emotions (fear, hope, anger).
2. Information warfare
A concept that developed especially with the digital era.
Aim: to control or alter available information in order to influence opinions and decisions.
Means:
– Fake news, media manipulation.
– Cyber-attacks on information systems.
– Control or censorship of communication channels.
Target: information flows (what people see, read, hear).
3. Cognitive warfare
A higher level, described by NATO as the “sixth domain of warfare.”
Aim: to act directly on individuals’ or societies’ mental processes.
Means:
– Exploitation of cognitive biases (confirmation, emotions).
– Behavioural micro-targeting (via personal data).
– Deepfakes, virtual realities, generative AIs to manipulate perception.
– Approaches drawn from neuroscience and psychology.
Target: the human brain itself (the capacity to think, reason, decide).
In summary:
– Psychological: act on emotions.
– Informational: act on data and the media environment.
– Cognitive: act on internal mechanisms of thought. We can say that cognitive warfare is the culmination of the other two, integrating technology, neuroscience and artificial intelligence.
III. Psychosocial risk for organisations
Employees are exposed to a constant flow of contradictory information (emails, social networks, news). This over-information can generate stress, mental fatigue and a loss of discernment, making decision-making at work difficult.
Disinformation campaigns can create fear, anger or mistrust towards the organisation, colleagues or hierarchy. Internal trust erodes, which harms team cohesion and cooperation.
If manipulated messages circulate (on social, political or health issues), they can divide employees according to their beliefs. This fosters interpersonal conflict and weakens the social climate.
Continuous exposure to anxiety-inducing or misleading content can provoke anxiety, a sense of insecurity, even emotional exhaustion. This increases the risks of burnout or psychological distress.
If employees doubt the reliability of information (internal or external), this creates a feeling of disorientation and loss of bearings. They may disengage, lose motivation and develop cynicism towards the organisation.
Cognitive-warfare techniques (fake news, deepfakes, trolls) can target not only the organisation but also individuals. An employee attacked online may suffer digital harassment that affects both professional and personal balance. In summary: cognitive warfare amplifies classic psychosocial risks (stress, isolation, overload, loss of meaning) by adding a new dimension: the intentional manipulation of perception and thought. Organisations must therefore integrate this threat into their psychosocial risk prevention policy by developing teams’ cognitive resilience (critical thinking, transparency, clear internal communication).
IV. Individual defence strategies
Develop critical thinking:
– Check sources before sharing information.
– Ask simple questions: Who is speaking? In whose interest? Can it be verified elsewhere?
Limit exposure to information bubbles:
– Vary sources (media, countries, languages, points of view).
– Beware of social-media algorithms that reinforce polarisation.
Recognise your cognitive biases:
– Example: confirmation bias (seeking what confirms our ideas).
– Being aware of them already reduces their manipulative power.
Manage your emotions:
– Cognitive warfare often targets fear, anger or indignation.
– Take a step back before reacting (do not click/share in the heat of the moment).
Digital hygiene:
– Protect your personal data (passwords, privacy).
– Avoid giving away too much exploitable information for micro-targeting.
Cultivate inner resilience:
– Meditation, sport, creativity, quality discussions.
– Anything that strengthens psychological balance makes one less manipulable.
Collective defences
Education and training:
– Integrate media literacy and critical thinking from school age.
– Train adults to identify fake news, deepfakes and manipulations.
Institutional transparency:
– Governments and media that own their mistakes and explain their choices reduce the room for manipulation.
Trusted media and fact-checking:
– Support independent verification initiatives.
– Make corrections visible and accessible.
Regulation of digital platforms:
– Regulate abusive uses of disinformation and deepfakes.
– Impose transparency on recommendation algorithms.
Social solidarity:
– Division is a key weapon of cognitive warfare.
– Encourage inter-generational, intercultural and political dialogue to avoid exploitable fractures.
Development of counter-strategies:
– States and NGOs can conduct their own prevention and awareness campaigns.
– NATO already speaks of “cognitive resilience” as a new defence priority. In summary: Individually, it is about disciplining the mind. Collectively, it is about strengthening social trust. True defence rests on a combination of personal lucidity and collective cohesion.
V. Organisational prevention strategies
1. Strengthen the culture of information:
– Implement regular training in critical thinking, media literacy and detection of fake news/deepfakes.
– Develop an internal information charter: how to verify, share and respond to questionable information.
– Create safe dialogue spaces where employees can ask questions about current events or circulating messages.
2. Ensure transparency and clarity in internal communication:
– Disseminate clear, regular and coherent internal communication to limit the space for disinformation.
– Explain the organisation’s decisions, including their limits, in order to maintain trust.
– Identify trusted points of contact (HR, managers, occupational health) whom employees can consult in case of doubt.
3. Train and support managers:
– Train managers to detect signs of cognitive stress (fatigue, confusion, polarisation of opinions).
– Equip them to handle sensitive debates (health, politics, current affairs) without allowing internal divisions to take root.
– Encourage caring management that supports dialogue and avoids escalation of tensions.
4. Develop collective cognitive resilience:
– Offer workshops on emotion regulation and psychological resilience.
– Encourage practices that foster mental stability: mindfulness, well-being activities, work–life balance.
– Create internal mutual-aid communities where employees can exchange and support one another.
5. Protect employees from cyber-harassment and digital attacks:
– Strengthen organisational cyber-security (data protection, awareness of phishing and manipulations).
– Provide psychological support to employees who are victims of smear campaigns or digital harassment.
– Establish a clear reporting procedure in the event of a targeted cognitive or media attack.
6. Integrate cognitive warfare into the psychosocial-risk policy:
– Include cognitive risk in the organisation’s psychosocial risk assessment.
– Collaborate with occupational physicians, psychologists and unions to create tailored prevention plans.
– Implement monitoring indicators (social climate, internal trust, incidents related to information).
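The monitoring indicators mentioned above can be made concrete with very simple tooling. As a minimal sketch, the snippet below tracks a hypothetical internal-trust index built from quarterly pulse-survey scores; the survey question, the 0–100 rescaling and the alert threshold are illustrative assumptions, not an established standard.

```python
# Minimal sketch of a monitoring indicator: a hypothetical
# internal-trust index derived from quarterly pulse-survey scores.
from statistics import mean

# Each entry: quarter -> list of 1-5 agreement scores for the
# (illustrative) statement "I trust the information my
# organisation communicates."
trust_scores = {
    "2024-Q1": [4, 5, 3, 4, 4],
    "2024-Q2": [3, 4, 3, 3, 2],
}

def trust_index(scores):
    """Average 1-5 score rescaled to 0-100."""
    return round((mean(scores) - 1) / 4 * 100, 1)

indices = {q: trust_index(s) for q, s in trust_scores.items()}
quarters = sorted(indices)
drop = indices[quarters[-2]] - indices[quarters[-1]]

print(indices)
if drop >= 10:  # illustrative alert threshold
    print(f"Alert: internal-trust index fell by {drop} points")
```

A real deployment would of course rest on anonymised surveys and be interpreted alongside the qualitative signals (social climate, information-related incidents) listed above, rather than on the index alone.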
Conclusion
An effective HR strategy against cognitive warfare must combine:
– Clear and reliable information (internal communication).
– Continuous training (critical thinking and manipulation detection).
– Managerial and psychological support (prevent stress and divisions).
– Digital protection (security and support in the event of an attack). The objective is to build a resilient organisation in which employees are able to recognise and resist attempts at cognitive manipulation, while maintaining internal trust and cohesion.