A deepfake video of a State Department official falsely claiming a Russian city is a legitimate target for Ukrainian strikes using U.S. weapons.
Pro-Russia social media accounts amplifying stories about divisive political topics such as immigration and campus protests over the war in Gaza.
Sham news sites spoofing real publications or posing as legitimate-sounding outlets with names like D.C. Weekly, the Boston Times and Election Watch.
Russian propaganda is ramping up in a busy global election year, targeting American voters as well as elections in Europe and the Paris Olympics, according to intelligence officials, internet researchers and tech companies.
“Russia remains the most active foreign threat to our elections,” Director of National Intelligence Avril Haines told senators last month at a briefing about election risks.
Influence operations linked to Russia take aim at a disparate range of targets and subjects around the world. But their hallmarks are consistent: eroding support for Ukraine, discrediting democratic institutions and officials, seizing on existing political divides and harnessing new artificial intelligence tools.
"They're often producing narratives that feel like they're throwing spaghetti at a wall," said Andy Carvin, managing editor at the Atlantic Council's Digital Forensic Research Lab, which tracks online information operations. "If they can get more people on the internet arguing with each other or trusting each other less, then in some ways their job is done."
Some efforts have been linked directly to the Kremlin, including a network of fake accounts and phony news websites given the name Doppelganger, whose operators have been sanctioned by both the U.S. and the European Union.
The origins of others are still unknown, such as the fabricated video of State Department spokesman Matthew Miller, in which reporters' questions and Miller's response about U.S. policy in the Ukraine war were faked, likely with the help of artificial intelligence. The video circulated on Russian Telegram channels and was picked up by Russian state media and government officials, according to The New York Times.
In its 2016 and 2020 election-meddling efforts, Russia employed tactics such as exacerbating existing divisive issues and creating fake accounts posing as Americans, researchers say. Since Russia invaded Ukraine in early 2022, a dominant theme of the Kremlin's efforts has been discrediting Ukraine and amplifying voices in the U.S. and other countries that oppose aid to Kyiv and support for NATO.
"What you can see is they are referencing politics in a certain country, and they are generally tying that to what is going on in Ukraine. The underlying message is, 'Here's why people should not support Ukraine,'" said Ben Nimmo, principal investigator on OpenAI’s intelligence and investigations team, who previously led global threat intelligence at Facebook's owner, Meta.
Fake accounts, phony websites
The Kremlin relies on what Haines called "a vast multimedia influence apparatus, which consists of its intelligence services, cyber-actors, state media proxies and social media trolls" to pump out propaganda, launder fake and misleading news articles and circulate conspiracy theories.
Since the invasion of Ukraine, the European Union has banned Russian media outlets including RT, Sputnik, Voice of Europe and RIA Novosti from publishing or broadcasting within the bloc. That hasn’t stopped RT articles from proliferating across hundreds of other websites widely available in Europe, according to a recent report from the German Marshall Fund of the United States, the University of Amsterdam and the Institute for Strategic Dialogue.
"We discovered RT articles reposted to third-party websites targeting audiences from Iraq to Ethiopia to New Zealand, often without any indication that the content was sourced from a Russian propaganda outlet," the researchers wrote.
Perhaps the most persistent and prevalent Russian online operation is Doppelganger. First identified by researchers at the EU DisinfoLab in 2022, the campaign has impersonated news outlets including the U.K.'s The Guardian, Germany's Der Spiegel, The Washington Post and Fox News, and it has posed as NATO, the Polish and Ukrainian governments, the German police and the French Foreign Ministry.
In addition to operating fake accounts and phony websites, the operation purchased Facebook ads targeting French and German audiences with messages about aid to Ukraine, farmers' protests and the war in Gaza, according to the European nonprofit AI Forensics.
Doppelganger has also set its sights on the Paris Olympics, Microsoft said in a report this week. It used fake French-language news sites to push claims of corruption in the Games' organizing body and to warn of potential violence.
In March, the U.S. Treasury sanctioned two Russian companies identified as being behind Doppelganger — Social Design Agency and Structura — as well as their founders, saying they carried out the campaign "at the direction of the Russian Presidential Administration."
The misinformation-tracking company NewsGuard has connected a separate network of 167 websites "masquerading as independent local news publishers in the U.S." to a former deputy sheriff from Florida who now lives in Moscow.
Using AI tools to create propaganda
The volume of posts, articles and websites that Russian-linked operations produce is being boosted by artificial intelligence, a new factor that sets 2024 apart from previous election cycles.
Covert influence campaigns based in Russia, as well as in China, Iran and Israel, have begun using AI in their attempts to manipulate public opinion and shape politics, according to recent reports from OpenAI, Meta and Microsoft.
A Russian operation that Microsoft calls Storm-1679 used AI to fake actor Tom Cruise's voice narrating a phony Netflix documentary disparaging the International Olympic Committee.
According to OpenAI, Doppelganger has used its AI tools, which include ChatGPT, to translate articles into other languages and generate social media posts and comments. Another Russian operation, dubbed Bad Grammar, used AI to debug code for a program that automatically posted on Telegram.
The question remains: How effective are Russia's attempts to influence public opinion and democratic elections?
Many online operations that have been publicly identified haven't reached large audiences of real people, researchers say, and AI hasn't made them any more convincing — at least not yet.
"It's absolutely true that when you look at an individual campaign, it's just as likely as not that it hasn't had a huge amount of influence, which is why Russia just does it again and again, or in a different form, or targeting a different group," the Digital Forensic Research Lab's Carvin said. "It's almost like producing cheaply manufactured goods and just getting it out there in the world, hoping that maybe one particular gadget ends up becoming the popular toy of the season, even if the others completely fail."
Many researchers who study disinformation warn against seeing the hand of Russia as an all-powerful puppeteer, especially since so much of what its mouthpieces amplify is homegrown.
"Any potential narrative that's being argued in a given political environment is fodder for Russian operations — which in itself can sound a little crazy and conspiracy-ish," Carvin said. "And in some ways you risk creating a … situation where absolutely everything that's happening online is all Russia's fault."
But, he added, "at the same time, Russia has a lot of resources at its disposal and it's willing to experiment in different ways to see which things stick. … Why not try all of the above and see where it takes you?"
Transcript
JUANA SUMMERS, HOST:
It is a busy election year around the globe. And as candidates are making their cases, Russian propaganda has been ramping up, from deepfake videos to phony websites and fake social media accounts. NPR's Shannon Bond covers propaganda and influence campaigns and is here in studio with me now. Hi, Shannon.
SHANNON BOND, BYLINE: Hi, Juana.
SUMMERS: So Shannon, Russia's election-meddling efforts in the 2016 and 2020 elections here in the United States - they've been very well documented. So tell me what's new this year.
BOND: Yeah. I mean, really, since 2016, Russia has not stopped. The director of National Intelligence recently said Russia remains the most active foreign threat to U.S. elections. And so many of these tactics we see, you know, are pretty consistent going back for years, like using networks of fake accounts posing as Americans and seizing on divisive social issues. You know, these are the same sort of tactics we're seeing around the upcoming elections in Europe as well.
But two things do stand out this year. So first is eroding support for Ukraine as a sort of overall goal and, more broadly, undermining democratic institutions. And the second is artificial intelligence. Russia and other countries are now using AI tools like ChatGPT and voice manipulation to aid in these efforts.
SUMMERS: OK, Shannon, I want to talk about both of these. Let's start out with the Ukraine propaganda. Give me an example of just what that looks like.
BOND: Yeah. So the most persistent and prolific operation tied to Russia is known as Doppelganger. It spoofs news and government websites in the U.S. and Europe. I spoke with Andy Carvin at the Atlantic Council's Digital Forensic Research Lab, and he says Doppelganger takes this very scattershot approach that's really a hallmark of Russian operations.
ANDY CARVIN: They're often producing narratives that feel like they're throwing spaghetti at a wall - and that it may not directly mention the U.S. election or the EU election or even reference what's going on in the war, but it's intended to stir the pot.
BOND: So Doppelganger has pushed fake stories accusing the Ukrainian government of corruption, but it's also grabbed onto, like, more local issues, like farmers' protests in Germany, the Gaza war, even the Paris Olympics.
SUMMERS: Interesting. And you also mentioned earlier that AI is playing a role here. How are these propaganda campaigns using AI?
BOND: You know, people have been worried that we're going to see some kind of flood of AI deepfakes. That has not yet manifested, but we have seen some apparently AI-manipulated videos. A recent one purported to show a State Department official claiming a Russian city is a legitimate target for Ukrainian strikes. He never said that. That's just not true.
But what Russian-linked campaigns seem to be using AI for more frequently is things like churning out a lot of social media posts and comments in multiple languages, even debugging their software code. ChatGPT maker OpenAI and Meta, which owns Facebook, have recently banned some of these operations. But I should note, both companies said, you know, it doesn't seem that using this kind of AI technology is actually helping these operations be any more effective.
SUMMERS: Well, I mean, that's the big question, isn't it? And is any of this actually working and changing people's minds?
BOND: Yeah. I mean, that is exactly it. And operations like Doppelganger, they're really prolific, right? They pump out a lot of content, but they also tend to be pretty clumsy. These accounts are getting caught and removed. The links to their fake websites are being blocked before they can reach a lot of real people. So it's hard to say that these ones are gaining a lot of traction. But, you know, this is the kind of stuff that companies and researchers are catching, and we don't know what we don't know.
And it's also important to remember that social media campaigns are not the only tool Russia has at its disposal. You know, it has this whole apparatus. It includes state media outlets. It includes Russian intelligence agencies. The Washington Post recently reported Russia has been funneling money to far-right politicians in Europe. And so when we're thinking about the scope and impact of these operations, you know, we shouldn't overstate the threat, 'cause as we've seen, some of these don't work so well, but we shouldn't understate it either.
SUMMERS: That's NPR's Shannon Bond. Shannon, thank you.
BOND: Thanks for having me.

Transcript provided by NPR, Copyright NPR.