The Federal Communications Commission ruled on Thursday that robocalls using voices generated by artificial intelligence are illegal, amid concerns over how the cutting-edge technology is being used to scam people and deceive voters.
"Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters," FCC chairwoman Jessica Rosenworcel said in a statement. "We're putting the fraudsters behind these robocalls on notice."
Last month in New Hampshire, a robocall using an apparently AI-generated voice imitating President Biden and discouraging Democrats from voting reached thousands of voters just days before the state's primary.
New Hampshire's attorney general said this week that a Texas telemarketer was behind the call and that another Texas-based company transmitted it. He has opened an investigation into illegal voter suppression.
AI has also been used to extort money from families by mimicking the voice of a loved one in danger. Last year the Federal Trade Commission warned consumers those scams are on the rise.
Rapidly advancing technology has led to the wide proliferation of tools that can easily generate realistic audio, video, and images. That's raised fears over how the technology can be abused to dupe people and create plausible-seeming evidence of events that never happened.
The FCC's ruling deemed calls made with AI-generated voices "artificial" under the Telephone Consumer Protection Act, a 1991 federal law aimed at curbing junk calls.
It means the FCC can fine violators and block the telephone companies that carry the calls. In addition, the ruling lets victims sue robocallers that use AI, and gives state attorneys general additional tools to prosecute bad actors.