I’ve often wondered how Russian officials can spew complete falsehoods with a seemingly straight face, denying basic facts even in the face of irrefutable evidence, while blaming the West for whatever misdeeds Moscow itself commits.
From Foreign Ministry Spokesperson Maria Zakharova’s continued stream of falsehoods: denying Russia’s culpability for the global food crisis, even though it is Russia that is blocking Ukraine’s Black Sea ports and preventing grain from being shipped; denying that Russia bombed a maternity hospital in Mariupol; dismissing the photos and overwhelming evidence of Russia’s atrocities as “information terrorism”; and other outright snide, rude prevarications…
To Foreign Minister Lavrov’s outright denial that Russia attacked Ukraine…
To the spokesman of Russia’s Defense Ministry, who claimed at a recent press briefing that Russia had carried out a “high precision air attack at hangars where armaments and munitions were stored,” weapons allegedly delivered by the US and European countries, at the Kremenchuk road machinery plant, a few meters north of the Amstor shopping center…
To Russia’s Deputy Permanent Representative to the UN, Dmitry Polyansky, claiming that the shopping mall bombing was a “provocation” by Ukraine “a la Bucha,” which is a remarkable self-own, given the reams of evidence of Russia’s atrocities there…
To Russia’s initial, absurd claim about “denazification” of Ukraine…
To Kremlin spokesman Peskov’s ridiculous denial of Russia’s default on its debt a few days ago…
The world I know seemed to watch these claims agog at the sheer mendacity of Russia’s messaging and incredulous that anyone would actually believe them.
But apparently, people do.
A recent study by Microsoft revealed that Russia is deploying new technologies and tactics to support its cyber influence operations in an effort to increase support for its invasion of Ukraine. The strategies and tactics give Russian foreign influence operations a “broader geographic reach, higher volume, more precise targeting, and far greater speed and agility.”
And according to Microsoft, Russian information operations have been more successful than one might have thought.
As part of a new initiative at Microsoft, we are using AI, new analytics tools, broader data sets, and a growing staff of experts to track and forecast this cyber threat. Using these new capabilities, we estimate that Russian cyber influence operations successfully increased the spread of Russian propaganda after the war began by 216 percent in Ukraine and 82 percent in the United States.
To be sure, Russian propaganda is nothing new. The Soviets were expert at employing “active measures”—anything from disinformation campaigns to outright insurrections.
Aktivnye meropriyatiya, “active measures,” was a term used by the Soviet Union (USSR) from the 1950s onward to describe a gamut of covert and deniable political influence and subversion operations, including (but not limited to) the establishment of front organizations, the backing of friendly political movements, the orchestration of domestic unrest and the spread of disinformation. (Indeed, the Committee for State Security [KGB]’s Service A, its primary active measures department, was originally Service D, meaning disinformation.)
Microsoft has found a sharp increase in traffic to Russian propaganda websites early this year, and despite efforts to reduce traffic from Russian state propaganda sites such as Sputnik and RT, traffic is still higher than it was before Russia’s invasion of Ukraine.
Of course, just because people visit a site does not mean they believe the disinformation posted on it. However, Russia’s disinformation efforts during the Cold War were considered pretty effective, and according to the Center for Strategic and International Studies, current Russian disinformation efforts are even more successful thanks to new technologies such as social media. Russia has learned to grab the narrative early and flood target populations with an enormous amount of propaganda.
A few years ago, the RAND Corporation described Russian disinformation campaigns as a “firehose of falsehood”: a barrage of lies meant to inundate the psyche until fatigue sets in, the audience can no longer distinguish truth from fiction, and the lies embed themselves as the truth.
But again, consumption does not always mean acceptance, and a recent study of whether Kremlin disinformation is effective shows varied results, depending on the topic of the propaganda.
We find that the average respondent is usually able to distinguish between verifiably true news stories and pro-Kremlin disinformation claims. However, for one topic area, economics, subjects are less able to distinguish between true stories and disinformation, and for the other three topics respondents report low levels of certainty about the truth propositions of true stories. We do not find significant differences in respondents’ belief in stories across strategies of disinformation.
In addition, national identity and level of political savvy matter.
In addition to examining the relationship between topic, strategy, and belief, we also examine respondent-specific correlates of belief in disinformation. We find that those who have partisan or contextual connections with Russia, such as supporting a political party that advocates closer relations with Russia, preferring to communicate in Russian instead of Ukrainian, and claiming Russian ethnicity, are more susceptible to believing pro-Kremlin disinformation. In line with previous literature (e.g., Pereira et al., 2018), we also find that those who are more politically sophisticated are less likely to believe Russian disinformation.
So who are the consumers of mass media who believe Russia’s claims, made without any evidence, that Ukraine is planning a chemical attack against Russia?
Who are the audiences who believe and amplify Russian claims that the massacre in Bucha was committed by Ukrainian provocateurs?
Who are the individuals who believe conspiracy theories about the United States concealing dangerous chemical weapons in Ukraine?
Who believes that Ukraine, a nation led by a Jewish president, is full of Nazis who really run the government?
From my personal experience on various social media platforms, those who believe Russian disinformation tend to be people who distrust the US government and the mainstream media and who seek out “alternative sources.” And if a piece of information is posted and is not corroborated by any other source, that gives it more credibility, not less, because to them a lack of corroboration means the mainstream media is hiding something or carrying water for those in power. Time and time again, I see Russian propaganda disseminated by individuals who aim to discredit current government officials.
In addition, it’s Russians themselves who believe and disseminate the Kremlin’s disinformation. Moscow has clamped down on the free press, shut down independent media outlets or designated them as foreign agents, and imprisoned journalists for daring to deviate from the party line.
Yes, Russians still have VPN options for accessing Western media reports, but many of them do not use them. Culturally, many still cling to the news sources they are accustomed to.
And it’s not just in Russia itself. Having come from the Russian-speaking community in the United States, where my parents are still rooted, I see the same trends. The Russian speakers who came here from the “old country” tend to be much more skeptical of mainstream media reports about Russia’s invasion of Ukraine. They tend to respond to reports of Russian atrocities with “Well, I don’t know…” even as they claim to despise Putin. Perhaps it is because the men have all served in the Russian armed forces as conscripts, and with that personal connection, they don’t want to believe that the force in which they served is capable of horrible war crimes. Or perhaps it is because they learned not to trust anything mainstream: in Russia the mainstream media is controlled by the government, so the assumption is that the same is true in the United States.
The challenge is that disinformation is difficult to correct. The Internet is forever, and once a piece of propaganda takes hold, no amount of fact-checking will destroy it, especially if those propagating it have an emotional connection to the claim or if the claim confirms their existing biases.
Research confirms that preempting Russian propaganda works well to put the Kremlin on the defensive, which is part of the reason why the Biden administration has declassified a lot of intelligence, giving the public an unprecedented look into the data that informs policymaking.
Researching online claims and looking for primary sources is another strategy that works well to counter Russian propaganda. I will never share a political meme, regardless of which side it comes from, and the stubborn part of me (read: all of me) will go searching for the truth and for corroborating evidence.
Recognizing the red flags of Russian disinformation is critical. On Twitter, an account name consisting of a noun followed by a string of random numbers often screams BOT! A pile of hashtags unrelated to the subject matter, as if the account is desperate for attention, is also a good indicator that the post may contain propaganda, disinformation, or just plain ole spam. If a post fits too neatly into a preconceived narrative aimed at confirming existing biases, chances are it’s disinformation, especially if it is aimed at a particular group.
I also run avatars and other social media photos through reverse image searches, such as tineye.com, to check whether the author and the organization are legitimate.
Language gives you a clue as well. Words meant to evoke emotion, such as “horrifying,” “bloody,” or “outrageous,” especially when paired with calls for organized action and other red flags, can also indicate Russian disinformation.
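To make those surface-level cues concrete, here is a minimal sketch of how one might encode them as simple checks. The handle pattern, the word list, and the thresholds are hypothetical illustrations for this post, not a validated detection model.

```python
import re

# Illustrative heuristics only: the pattern, the word list, and the thresholds
# below are hypothetical examples, not a real bot- or propaganda-detection model.
BOT_LIKE_HANDLE = re.compile(r"^[A-Za-z]+\d{5,}$")      # e.g. "Patriot83914726"
EMOTIVE_WORDS = {"horrifying", "bloody", "outrageous"}  # emotionally loaded terms

def red_flag_count(handle: str, text: str, hashtags: list) -> int:
    """Count how many of the simple red flags a post trips."""
    flags = 0
    if BOT_LIKE_HANDLE.match(handle):
        flags += 1   # noun-plus-random-numbers account name
    if len(hashtags) >= 5:
        flags += 1   # a pile of hashtags, desperate for attention
    if any(word in text.lower() for word in EMOTIVE_WORDS):
        flags += 1   # emotionally charged language, often with a call to action
    return flags

# A contrived example post that trips all three checks
print(red_flag_count(
    "Patriot83914726",
    "Horrifying! Share this before they delete it!",
    ["#truth", "#cats", "#crypto", "#viral", "#breaking"],
))
```

None of these checks proves anything on its own; as with the human judgment described above, it is the combination of flags that should make you pause before sharing.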
Also look at who shares the information. Is it someone known for sharing propaganda in the past? Can the propaganda be traced to a legitimate source, or is the source owned or controlled by a Russian entity that has connections to government officials or kleptocrats?
I have found that the most effective Russian propaganda is subtle: it employs several of the techniques above, foments outrage, and is aimed at specific target audiences.
We can’t 100 percent avoid it, but we can at least work to recognize it.