A Ukrainian serviceman walks through Mariupol.
Even before Russia launched its assault on Ukraine, false claims and misleading rumours had been circulating online.
But as Russian forces have advanced further into Ukraine, the amount of online misinformation has skyrocketed.
Both Moscow and Kyiv have been guilty of spreading misinformation in the online propaganda war. But individual social media users have also fallen victim to false rumours and amplified unfounded allegations.
In times of conflict and crisis, when people are hungry for details about the war in Ukraine, misinformation can spread just as virally as verified facts.
Here is a selection of false claims that have circulated widely and have since been debunked by fact-checkers.
These videos do not show Ukrainian cities being shelled
Within hours of Russia launching its invasion, misleading videos of unrelated explosions had been seen by thousands of people.
One of the first videos that appeared on Twitter under the hashtag #нетвойне (#NoWar) falsely claimed to show a powerful blast in a Ukrainian city. The video has received more than 112,000 views.
But the footage actually dates from August 2015 and shows a deadly explosion at a storage facility in Tianjin, China.
Another misleading video — shared widely on Facebook, Instagram, and TikTok — shows footage of the fatal explosion at Beirut port in August 2020.
Users had falsely claimed that the video showed “Ukrainian headquarters” being bombed by Russian forces.
Misleading videos from Tianjin (L) and Beirut (R) claimed to show explosions in Ukraine. Euronews via Twitter/Facebook
Videos of the deadly blasts in Tianjin and Beirut have regularly been shared as misinformation in the aftermath of other explosions, and will likely resurface in the future.
Neither are related in any way to the Russian invasion of Ukraine.
Their virality shows that misleading videos about the war in Ukraine are being taken out of context not just from previous conflicts or military exercises, but from unrelated historical events around the world.
Images and videos are frequent sources of misinformation because they catch the eye and draw more attention than a text-only social media post.
The war in Ukraine is not a hoax
The European Parliament on Wednesday called for more measures to curb Russian disinformation — deliberately false claims — about the invasion of Ukraine.
Most propaganda reiterates the Kremlin stance that the invasion of Ukraine is a “special military operation” to supposedly “denazify” a “failed state”.
But other viral posts in Russia have falsely claimed that there is no such conflict at all and that Western sources are creating a “hoax” conflict.
One video of a news reporter standing in front of body bags was widely shared by pro-Kremlin accounts. In the footage, one of the body bags starts moving.
Users falsely claimed that the video showed proof that casualties were being invented by “western propaganda”.
But the video was not filmed in Ukraine, and actually shows a “Fridays for Future” climate change protest in Vienna, over a month ago. Activists at the rally had used body bags to illustrate future potential deaths due to Austria’s CO2 emissions.
Some online users had even manipulated the audio of the clip or added graphics to falsely claim it was filmed in Ukraine, but open-source tools show that the video was posted on YouTube by the Austrian news channel OE24 in February.
A video from a climate protest in Austria was a source of misinformation on the war in Ukraine. Euronews via Twitter
Other examples of misinformation have centred on “crisis actors” — people who are supposedly hired to act out scenes from an attack.
Pro-conspiracy users have shared false claims that Ukrainian civilians were staging their fear or injuries after genuine shelling incidents.
One example was a video of a woman applying blood-style makeup to a man’s face. The footage was not filmed during the war in Ukraine but on the set of a television series called “Contamin” in 2020.
The most high-profile of these “crisis actors” rumours was shared after a deadly attack on a maternity hospital in the city of Mariupol on 9 March.
Users falsely claimed that the hospital was non-operational and that one woman had been “hired” to play the roles of two different pregnant women filmed in the aftermath of the attack.
The claim was amplified on social media by Russia’s embassy in the United Kingdom, but their misleading posts were soon removed by both Facebook and Twitter.
On Monday, one of the pregnant women whom the embassy had labelled a “crisis actor” died, along with her unborn child.
Despite this, Russian ambassadors and embassies are still deliberately sharing the false claim that actors and not victims were pictured after the maternity hospital bombing in Mariupol.
Social media platforms are continuing to try to remove or label misleading content about “crisis actors”.
No evidence of the ‘Ghost of Kyiv’
Since the war in Ukraine began, stories of bravery have circulated, but not all of these folk stories have been verified.
One famous example concerns the so-called “Ghost of Kyiv”. This individual is rumoured to have single-handedly brought down six Russian planes at the very start of the invasion.
The Ukrainian military said on 24 February that five Russian planes and a Russian helicopter were shot down in the Luhansk region, but the claim that a single Ukrainian pilot downed the aircraft remains unsubstantiated.
One widely shared video clip falsely claimed to show the “Ghost of Kyiv” in action, but the footage was actually taken from the video game “Digital Combat Simulator (DCS) World”.
TikTok videos with the hashtag #ghostofkyiv reached 200 million views, while unverified rumours of the “Ghost” have been amplified by senior Ukrainian figures.
The country’s former President Petro Poroshenko supposedly identified the “Ghost of Kyiv” in a tweet, but the photo was instead a 2019 image showing Ukrainian pilots testing new French helmets, as seen below.
The image shared by Petro Poroshenko was first posted on Twitter in April 2019. Euronews via Twitter
The rumour of the “Ghost of Kyiv” is certainly not an isolated case: other unverified claims posited that a local cat — the so-called “Panther of Kharkiv” — was working alongside Ukrainian soldiers to detect Russian snipers. The author of the post has since acknowledged it as a joke.
Although unverified, uplifting stories like these can potentially offer hope to Ukrainian citizens during wartime.
But according to some analysts, fantastical and false claims of Ukrainian success can harm the country if they obscure an accurate picture of the realities of war.
Misleading folk stories may even draw attention away from genuine acts of heroism by Ukraine’s military and population.
From verified videos, it is clear that both the Ukrainian military and ordinary civilians are putting up a fierce fight against the Russian invasion.
Ukrainian President Volodymyr Zelenskyy has used his own social media accounts to share video updates and debunk reports that he had left Kyiv.
But he also handed out honours to 13 border guards who were mistakenly believed to have been killed while defending Snake Island. Ukraine’s Navy later stated that the men were “alive and well” and had been captured by Russia.
Both Ukrainian and Russian officials have been guilty of sharing misinformation during the war — whether intentional or not — most of which can be fact-checked using widely available open-source tools.
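One technique commonly used by open-source verification tools to spot recycled footage is perceptual hashing, which produces near-identical fingerprints for near-identical frames even after re-encoding or small edits. The sketch below is a minimal, hypothetical illustration in pure Python — not the code of any specific tool — and assumes frames have already been downscaled to 8×8 grayscale grids (real tools use an image library for that step):

```python
# Illustrative sketch of perceptual (average) hashing, as used by
# verification tools to match a "new" clip against archived footage.
# Assumption: frames are pre-downscaled to 8x8 grids of 0-255 ints.

def average_hash(grid):
    """Turn an 8x8 grayscale grid into a 64-bit hash: each bit is 1
    if the corresponding pixel is brighter than the grid's mean."""
    pixels = [p for row in grid for p in row]
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Count differing bits between two hashes; a small distance
    suggests both frames come from the same underlying footage."""
    return bin(h1 ^ h2).count("1")

# A frame from archived footage and a slightly brightened re-upload.
archived = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
reupload = [[min(255, v + 3) for v in row] for row in archived]

distance = hamming_distance(average_hash(archived), average_hash(reupload))
print(distance)  # → 0: the re-upload is flagged as a likely match
```

Because the hash only records which pixels sit above the frame's own average brightness, uniform edits such as compression artefacts or brightness shifts barely change it, which is what lets fact-checkers link a viral clip back to footage from Tianjin, Beirut, or a video game.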
As fighting on the ground intensifies, both sides have also redoubled efforts to control the narrative online.