Four years of war and disinformation in Ukraine: from videos taken out of context to the use of AI on the battlefield
Since Russia's invasion of Ukraine on 24 February 2022, hoaxes and successive disinformation campaigns have been a constant, especially in Europe. In #UkraineFacts, the collaborative database promoted by _Maldita.es_, in which more than 100 fact-checking organisations from 88 countries around the world participated, **more than 3,000 examples of disinformation** on this topic were reported. The **first narratives** spread by Russian propaganda tried to **justify the offensive**, while in the final months of 2025 their goal was to **reduce international mobilisation** into Ukrainian ranks.
**Artificial intelligence now plays a key role in disinformation** about the war in Ukraine, attempting to manipulate not only citizens but also future combatants. Before this technology reached its current level of sophistication and the Kremlin applied it to its disinformation campaigns, one of the most commonly used strategies was the **dissemination of decontextualised or manipulated images**, and even images taken from video games, as could be seen in the months following the invasion.
[📲 Click here and follow the Maldita.es WhatsApp channel so you don't get fooled!](https://bit.ly/3EGa2ju)
## Images taken out of context: the first content that fact-checkers faced after the invasion
On 24 February 2022, Europe woke up to the news of the invasion of Ukraine. **Disinformation was quick to arrive**. That same day, videos were already circulating of supposed aircraft flying over the attacked country that had actually been recorded in Russia two years earlier, as well as images from video games presented as if they were bombings. Since then, images taken out of context to confuse the population have been a constant.
**Eight fact-checking organisations from three continents** contacted for this article agree that the dissemination of old or decontextualised videos and images was **one of the strategies most used** by disinformers in the first months after the invasion. "**Images from video games, past conflicts and 3D animations** were shared as if they were from the ongoing war", Ali Osman Arabaci, editor-in-chief of the Turkish fact-checking organisation Teyit, explains to _Maldita.es_. One example is a video of a Ukrainian attack with Molotov cocktails against Russian tanks, which was circulated at the time as if it were current.
One of the pieces of disinformation most frequently debunked by the #UkraineFacts project fact-checkers a year after the war began was **a video of a protest in Vienna recorded 20 days before the invasion of Ukraine began**, which was used to deny the number of deaths in the war. Filipe Pardal, from the Portuguese fact-checking organisation Polígrafo, explains that even years later, "**old images continue to be misused** to represent current events in the war".
## The first disinformation narratives tried to justify the invasion or attacked Ukrainian refugees
The first disinformation narratives that circulated after the Russian offensive tried to justify the invasion. One of the most repeated between May and June 2022 claimed that "if Russia had not launched a special operation, Ukraine would have attacked first". Vladimir Putin, President of Russia, also justified the invasion as a defence against a supposed threat from NATO, a narrative that has been reinforced throughout the conflict. Another (also promoted by Putin) claimed that, with the invasion, Russia aimed to "stop a **supposed genocide in the Donbas region**", explains Filipe Pardal. The invasion as **a pre-emptive strike to destroy supposed biological laboratories** in Ukraine is another narrative that circulated in March 2022.
**Content describing Ukraine as a Nazi and Russophobic state** was also circulated. According to EUvsDisinfo, part of the European External Action Service, "the myth of a Ukraine under Nazi rule has been the keystone of Russian disinformation about the country since the start of the Euromaidan protests in 2013–2014".
Another **wave of disinformation** focused on **Ukrainian refugees**. According to the latest data from the United Nations High Commissioner for Refugees (UNHCR) compiled by RTVE, **nearly 5.9 million people have left Ukraine because of the conflict**. The Ukrainian organisation Detector Media analysed more than 35,000 social media posts about Ukrainian refugees published between February and September 2022. According to this research, some of the most repeated narratives at the time claimed that **they were "destroying Europe", that they were people who "have privileges" and that, moreover, "they do not want to work"**.
Think you've received a hoax? Check it on our WhatsApp chatbot (+34 644 229 319)
## Four years later, disinformation tries to impact international mobilisation in Ukrainian ranks
As we reported in a cross-border investigation led by _Maldita.es_, in which other fact-checking organisations from Europe and Latin America participated, in the last months of 2025 Russian disinformation and propaganda campaigns had one main objective: to **discourage volunteers from enlisting in the Ukrainian army**. To this end, they mainly disseminated content about soldiers: from alleged mass losses on the Ukrainian side, used to **promote the idea that Russia is in a position of superiority in the war**, to supposed forced mobilisations and recruitment campaigns, and narratives claiming that the authorities do not want to pay the families of victims, intended to **discredit the image of Zelensky's government**.
These disinformation campaigns **targeted several countries**, especially in Latin America and, specifically, **Colombia** (where a large number of the foreign volunteers fighting in Ukraine come from). It is an example of how **the Kremlin adapts specific narratives to specific audiences** with the aim of **"satisfying cultural, social, economic and other specificities"**, as determined by a study by the University of Cambridge. Cornell University in the United States analysed 2.4 million online publications by the Russian state-owned media outlet RT between 2006 and 2023 and concluded that **Russian propaganda involves strategic geographical segmentation**.
Fact-checked content about soldiers. Source: StopFake.
From La Silla Vacía in Colombia, Santiago Amaya explains to _Maldita.es_ that he has observed that in recent years, "every international narrative finds **a 'Colombian angle' to gain traction locally**", whether through disinformation about Colombia being "on Russia's hostile list", fabricated statements by politicians, or supposed Colombian mercenaries in Ukraine.
But disinformation **does not always have the same impact in all regions**. Daisuke Furuta, editor-in-chief of the Japan Fact-Check Centre (Japan), explains to _Maldita.es_ that **"Japanese society shows less interest in the situation in Ukraine"** compared to Europe. In Turkey, **"public attention and the flow of disinformation have largely shifted** to the conflict between Israel and Palestine and other emerging issues", says Ali Osman Arabaci (Teyit). In both countries, say the two professionals, **disinformation about Ukraine has decreased considerably** in recent months.
In **Azerbaijan**, "the population's attention has also shifted **from the war between Russia and Ukraine to a closer neighbour: Iran**", explains Sabina Amrahova, editor of Teyit in the country. There, pro-Ukrainian narratives were "much more frequent and viral than Russian disinformation among the general public" during the first months after the invasion, Amrahova says.
## Disinformation against Volodymyr Zelensky and his circle: a constant throughout the conflict
The President of Ukraine has been one of the main targets of disinformation throughout the conflict. Since the beginning of the invasion, **hoaxes and disinformation have been circulating describing Zelensky as a "Nazi", a "drunk", a "cocaine addict" or someone who has "run away"** from the country. In this way, Russian propaganda has tried to **discredit his image and portray him as someone who is unable to run the country**.
Disinformation about the President of Ukraine. Source: _Maldita.es_.
One of the most recurrent disinformation campaigns related to Zelensky focuses on the **Ukrainian president's supposed corruption**. Between June 2023 and December 2025, dozens of disinformation stories were spread **accusing the Ukrainian president of buying luxury properties around the world** with money that the country has received to fight Russian troops. According to the disinformation that has circulated (most of it without evidence), he has spent **millions of euros on luxury apartments, villas and other exclusive properties**. One of the most viral pieces of content accused Zelensky of buying a mansion on the outskirts of Berlin (Germany) known as **Villa Bogensee, linked to Joseph Goebbels**, Nazi propaganda minister during the Third Reich. This disinformation circulated in at least six languages, according to an analysis by the AFP news agency.
Disinformation attributing the purchase of luxury properties to Zelensky. Source: _Maldita.es_.
This type of content is part of a cross-border disinformation campaign claiming that **Zelensky and those around him are wasting the money that the West is sending** to the country for the war. They are falsely accused of spending these resources on luxury goods and properties such as high-end cars, jewellery and yachts.
Several narratives attempted to discredit President Zelensky by spreading false claims, including accusations of corruption, Nazism, or fabricated videos showing him involved in violence or illegal actions.
— Filipe Pardal (Polígrafo)
## New resources to amplify the impact of disinformation: when AI enters the battlefield
“By 2024, Russian information operations had integrated **emerging technologies to improve the credibility and reach of false content** ”, according to an analysis by Kharkiv University in Ukraine on Russian disinformation strategies used in this conflict. Disinformation has become more sophisticated due to the emergence of **artificial intelligence (AI)** as a tool.
"AI is significantly accelerating and professionalising disinformation related to the war in Ukraine by **enabling faster content production and large-scale multilingual translations**", Ani Grigoryan, head of the verification unit at the Armenian media outlet CivilNet, explains to _Maldita.es_. According to Alex Zamkovoi, a fact-checker at StopFake (Ukraine), this technology allows disinformers to **"clone voices, fake videos and imitate real media brands"**.
At the end of 2025, this technology was used to create **content about the war in Ukraine featuring soldiers**: wounded, surrendering to the Russian army, forced to go to the front, or showing regret for having enlisted as volunteers. One of the most viral examples was a video of a supposed soldier with a Ukrainian flag on his arm crying for help to avoid going to the front. It was **an AI-generated video that spread on Twitter (now X) in 13 languages**. This trend has continued in the first months of 2026.
“Over the past year, content created with artificial intelligence has been added to show Ukrainian soldiers crying on the front line or **criticising Ukrainian MPs** ”, says Andrea Zitelli, editor of the Italian fact-checking site Facta. In February 2026, StopFake explained that videos of fake military personnel arguing with supposed politicians created with this technology were being posted on TikTok. They analysed **14 posts that, together, had more than 3.7 million views** on the platform.
Content about soldiers in the context of the war in Ukraine generated using artificial intelligence (AI). Source: _Maldita.es_.
This type of content, according to Santiago Amaya, a journalist at La Silla Vacía (Colombia), “can be produced in unlimited quantities and is **optimised to achieve maximum emotional impact** ” on users. In Amaya's words, we have gone from recycling old content to creating “totally fictional material that has never existed”. However, according to Ani Grigoryan from Armenia, **AI is not being used to invent new narratives** , but to make existing ones “more scalable, adaptable, localised and difficult to detect” with traditional verification tools.
AI facilitates the fast production of large amounts of content and its widespread dissemination, making it much more difficult for people to distinguish between what is true and what is not.
— Filipe Pardal (Polígrafo)
**"[Russia] has systematically adapted its disinformation strategies** to target both domestic and international audiences", according to the study by the University of Kharkiv (Ukraine). This evolution, the document explains, reflects a shift "from traditional Soviet-era propaganda techniques to **sophisticated, multi-layered and technologically integrated operations**" that aim to "shape perceptions, undermine Ukrainian resilience and fracture Western unity".
## State agencies, paid influencers and Russian embassies spread disinformation campaigns and foreign interference
During the first half of 2022, Russian state agencies, such as RT, spread **messages and speeches denying Ukrainian culture and nationhood and labelling Ukrainians as "Nazis", "orcs" or "worms"**. They also referred to the country as "a cancerous growth that must be removed", although before 24 February 2022 they had denied that Russia was going to invade Ukraine.
One of the tactics employed by the Kremlin to avoid the European Union sanctions imposed after the invasion on the state broadcasters RT (Russia Today) and Sputnik, allowing them to continue spreading disinformation, was to launch **an "army" of communicators** on social media. They developed "**micro-influencer operations, which circumvented platform moderation** more effectively than propaganda", explains the study by the University of Kharkiv.
At the end of 2022, the French magazine Marianne reported that a Russian company had contacted several YouTubers in the country "to spread Kremlin propaganda about the war in Ukraine". Just over two years later, in September 2024, YouTube shut down the channels of Tenet Media, a political influencer agency, after the US government accused it of being funded by Russia and spreading "propaganda and disinformation". This model of foreign information manipulation and interference (FIMI) has also been used by other countries, such as China.
**Russian embassies in various countries** have also played an important role in spreading disinformation since the start of the conflict through their activity on social media. For example, in March 2022, the **Russian embassy in Spain** shared three images on its X account of supposed Russian military personnel providing "humanitarian aid to Ukrainian civilians" (one of these images was old); and several images of an injured woman, saying that she was "very similar" to "an employee of the Ukrainian special internal organs unit". In 2024, the **Russian Embassy in South Africa** spread the hoax that Zelensky had bought a house owned by King Charles III of the United Kingdom.
Publications from the accounts of Russian embassies in South America and Spain. Source: X.
More recently, several Russian embassies have spoken out **against the Kiev government's recruitment efforts**. "We regret that the number of Colombians who believe the false promises of Ukrainian recruiters remains quite high", wrote **the Russian Embassy in Colombia** in a tweet posted in October 2025. In February 2026, the **Russian Embassy in Argentina** wrote on its Telegram channel and on X, following the broadcast of a documentary on the TN channel: "We were concerned to learn of the broadcast of a documentary produced by TN, which tells the story of adventurers recruited to fight for the Zelensky regime and, in essence, contains blatant propaganda for mercenaryism. We strongly condemn the publication of this material, which contradicts the traditionally friendly nature of relations between Russia and Argentina". In this way, Russia continues to attempt to manipulate public opinion in other countries about what is happening in Ukraine, four years after that 24th of February 2022.
_Contributors to this article include: Alex Zamkovoi from StopFake (Ukraine); Ali Osman Arabaci from Teyit (Turkey); Andrea Zitelli from Facta (Italy); Ani Grigoryan from CivilNet (Armenia); Daisuke Furuta from Japan Fact-Check Centre (Japan); Filipe Pardal from Polígrafo (Portugal); Sabina Amrahova from Teyit (Azerbaijan); and Santiago Amaya from La Silla Vacía (Colombia)._