Last week, a video began circulating on X, formerly Twitter, purporting to show a person in Pennsylvania ripping up ballots marked for former President Donald Trump and leaving alone those marked for Vice President Harris.
The person curses the former president multiple times and at one point says, “Vote Harris.”
The video is a fake. The envelopes and ballots shown don’t match those actually used in Bucks County, where the video was purportedly shot. U.S. officials said it was created and spread by Russia to sow doubt in the election.
But the incident showed what has been clear for some time now: Online in 2024, the deck is stacked against voting officials, maybe even more so than in 2020. The phony video was viewed hundreds of thousands of times shortly after it was posted. A statement from Bucks County debunking it three hours later was shared on X fewer than 100 times.
“They're fighting an uphill battle,” said Darren Linvill, co-director of Clemson University’s Media Forensics Hub, which tracks election influence campaigns. “I'm sure that they often feel like they're trying to put their finger in the dike before it bursts.”
Linvill traced the video back to a Russian propaganda operation, first identified by Clemson, that has also spread faked videos targeting Harris and her running mate Tim Walz in recent weeks.
With less than a week left of voting, the election cycle has entered a fraught stage in which rumors, misleading claims and conspiracy theories are surging. And election administrators, intelligence officials and researchers don’t expect that to end when polls close. They are bracing for what is expected to be a contentious period of counting and certifying votes, in which discord fueled by foreign and domestic sources could be corrosive to democracy.
Perhaps the biggest factor is Trump himself, who continues to falsely assert he won the 2020 election, despite courts and investigations finding no evidence of fraud. He has already set the stage to reject the results should he lose again this year.
“If I lose — I’ll tell you what, it’s possible. Because they cheat. That’s the only way we’re gonna lose, because they cheat,” Trump said at a September rally in Michigan.
Despite the lack of evidence, his claims have been embraced by many Republicans and eroded confidence in voting among a wide swath of Americans.
“We’ve already set the standard that you are allowed to doubt the results on Election Day,” said Linvill. “And that just doesn’t bode well.”
America’s geopolitical adversaries — particularly Russia, Iran and China — are expected to seize on election fraud claims, however unfounded, and generate their own material undermining the results, as part of their larger goals to sow chaos and discredit democracy.
Officials charged with safeguarding voting say they’ve learned and are applying many lessons from 2016 and 2020 — but are also confronting a new set of challenges this year, including advances in artificial intelligence that make it easier and cheaper to generate realistic but fake images, video, audio and text, and the emergence of X owner Elon Musk as a leading Trump surrogate, donor and amplifier of election fraud conspiracy theories.
“Going into the 2024 election cycle, we are arguably facing the most complex threat environment compared to a prior cycle,” said Cait Conley, who oversees election security efforts within the Department of Homeland Security’s cyber agency, in an interview with NPR.
Foreign powers seize on divisive issues
To meet that heightened risk, government officials are counting on transparency and warnings to help Americans gird themselves against manipulation.
Federal intelligence and law enforcement officials are taking a more aggressive approach this year in calling out foreign meddling. It’s a stark difference from 2016, when the Obama administration was reluctant to make public the full scope of Russia’s efforts favoring Trump until after the election.
This year, Russia is angling to boost Trump, as it did in the previous two presidential elections, while Iran is trying to undermine the former president, the intelligence community and private sector researchers say. China is targeting down-ballot races but does not appear to have a preference in the presidential race. All three regularly seize on divisive issues, from immigration to abortion to Israel’s war in Gaza, to exacerbate discord among Americans. And they’ve all experimented with using AI to churn out more misleading content.
“Our adversaries know that misinformation and disinformation is cheap and effective,” said Sen. Mark Warner, D-Va., who chairs the Senate intelligence committee, in an interview with NPR.
The federal government moved quickly to publicly attribute the fake Pennsylvania ballot video to Russia the day after the video first appeared on X — a notably rapid turnaround for intelligence and law enforcement officials. And they warned they expect more such fakes in the coming days and weeks.
In September, the Justice Department seized web domains it says Russian operatives used to spoof American news outlets and spread fake stories, indicted employees of Kremlin-backed broadcaster RT in a scheme to fund right-wing pro-Trump American influencers, and brought criminal charges against Iranian hackers accused of targeting the Trump campaign.
DHS’s Cybersecurity and Infrastructure Security Agency (CISA) and the FBI have issued joint public service announcements alerting Americans to tactics foreign actors might use to discredit the election, including ransomware attacks or falsely claiming to have hacked voter registration systems.
Officials say, so far, there’s no sign that foreign adversaries have breached any election or voting systems. But attackers don’t have to succeed in order to undermine confidence, Conley, the election security expert, said.
“While these things like a [distributed denial of service] attack or ransomware could be disruptive to the elections process, it's not going to undermine the security or integrity of the vote casting or counting process,” Conley said. “But our adversaries may try to convince the American people otherwise.”
At all levels of government, the message is consistent: Turn to the people running local elections for authoritative voting information.
“What we are really trying to encourage them to do is to know that it is your state and local election official that is the signal through that noise,” Conley said.
And local election officials are making more of an effort than ever before to seek out media attention and educate the public on their processes. State election officials in a number of swing states have started holding multiple press conferences per week leading up to Nov. 5.
“It's really important for us to get the message out there first and be as proactive as possible,” said Isaac Cramer, who runs elections in Charleston County, S.C.
Partisan and splintered realities
The subject of election “disinformation” has itself been turned into a partisan fight. A coordinated Republican legal and political campaign to cast efforts to mitigate or track the spread of falsehoods online as “censorship” has undercut the work of government agencies, online platforms and researchers, and driven one institution out of the field.
Last month, Warner wrote an open letter urging CISA to do more to help state and local governments identify and respond to election misinformation and disinformation campaigns, and to coordinate communications between the government, tech companies and researchers.
“The government needs to get this information out as quickly as possible, because literally the stakes are nothing less than our democracy,” Warner told NPR.
Cramer, the election official in South Carolina, said one challenge for local governments dealing with false information online is how splintered the internet is. He’s recently started seeing a lot more wrong voting information on Nextdoor, for instance.
“We can't possibly have eyes on every platform and see everything that is being posted,” Cramer said.
Increasingly, election officials are thinking outside the box to reach voters, because trying to fight fire with fire on social media has felt like a losing battle for years now, said Carolina Lopez, a former election official from Miami-Dade County, Fla.
“Election officers around the country spend a whole lot of time producing content for social media, and it always kills me when I see, like, three likes and it's usually themselves, their [spokesperson] and their mom,” said Lopez, who now runs the Partnership for Large Election Jurisdictions. “Election officials are trying to figure out, ‘Well, what else can I do to be heard?’”
In Montgomery County, Pa., Neil Makhija fashioned a voting ice cream truck to travel his county and help people vote. Cramer, in Charleston County, co-wrote a children’s book. Derek Bowens, in Durham County, N.C., created an app that could deliver accurate election information directly to people there.
The world’s richest man’s megaphone
One of the loudest voices elevating unverified rumors and outright falsehoods about the 2024 election also controls a major communications platform. Musk, the world’s richest man, took control of Twitter two years ago and has remade the site, now called X, into a pro-Trump megaphone.
Musk has become a major vector for baseless claims that Democrats are bringing in immigrants to illegally vote for them — a conspiracy theory Trump and other Republicans have embraced and are using to lay the groundwork to claim the election was stolen should he lose.
When election officials try to correct Musk’s false claims, he has lashed out. Michigan Secretary of State Jocelyn Benson told CBS News that she and her staff received threats and harassing messages after Musk called her a liar when she fact-checked his claim that the state has more registered voters than eligible citizens.
Musk has also shared AI-generated content on X without disclosure, including images and videos of Harris doing and saying things she didn’t.
Musk’s America PAC is inviting users to share “potential incidents of voter fraud or irregularities” on an “Election Integrity Community” on X that has more than 40,000 members. The feed is filled with allegations that voting machines are switching votes from Trump to Harris and posts casting doubt on the security of mail-in ballots — both frequently debunked narratives in 2020 — as well as copies of the fake Bucks County video.
Danielle Lee Tomson, research manager at the University of Washington’s Center for an Informed Public, said such “evidence generation infrastructure” is more robust this year. Even when these efforts identify real issues with voting, they tend to ignore the checks in the system that catch problems, she said.
“If you see something seemingly suspicious, and then you take a picture of it and post it online, that can be decontextualized so quickly and not take into account all of the various remedies or the fact that there's nothing suspicious there at all,” she said.
Local election officials are making their processes more transparent than ever this year, including livestreaming counting facilities, and welcoming record numbers of election monitors. But such openness is a double-edged sword: Video feeds provide more material for content creators who may use it to push their own narratives of malfeasance — such as the false claims against Georgia election workers amplified by Trump in 2020. That leaves officials operating with the knowledge that their every move will be scrutinized.
“We try to not commit unforced errors,” said Stephen Richer, the Republican recorder in Maricopa County, Ariz., who has been an outspoken debunker of election lies. “But at the end of the day, if somebody really wants to make something look weird, I think they can do it, unfortunately.”
Social media steps back
In 2020, major social media platforms proactively boosted election officials as authoritative sources of information, made misleading posts about voting less visible, and added warning labels to false claims. Now, Musk has cut most of X’s team policing the platform and removed many guardrails against false and misleading content.
X is the most glaring example, but other platforms have also backed away from the more aggressive stance they took in 2020, cut back on the number of people working on trust and safety, and are generally quieter about the work they are doing. Meta now lets Facebook and Instagram users opt out of some features of its fact-checking program, while its text-based social network, Threads, has deemphasized news and politics.
Warner told NPR he’s concerned social media platforms have stepped back at a time when threats, including from AI-generated content, are more urgent.
“Think about the devastating effect it’d have if somebody uses an AI image of what looks like an election official somehow destroying ballots or, you know, breaking into a drop box,” he said. “That kind of imagery could literally spark violence in a close election after the fact and again undermine Americans’ confidence in our system.”
In the face of that landscape, election officials say they are controlling what they can control. They have spent countless hours reaching out to skeptical voters over the past four years, and they’re now holding onto hope that that work will make a difference in people’s willingness to accept election results.
Michael Adams, the Republican secretary of state of Kentucky, is hoping the novelty of election denial will start wearing off as well.
“For a while there, every six months, they'd come up with a new conspiracy theory. It would be debunked. They'd have egg on their face. They go back in their hole for six months and then come back,” Adams said. “You only get so many bites at that apple.”