AI-Generated Biden Voice Used in Deceptive Robocalls Leads to $1 Million Fine

The telecom company that transmitted deceptive robocalls using an AI-generated imitation of President Biden’s voice to New Hampshire voters has agreed to pay a $1 million fine. The Federal Communications Commission (FCC) took the action against Lingo Telecom, the voice service provider that carried the calls, highlighting concerns about the misuse of AI to influence elections. The case underscores the growing threat deepfakes pose to democracy and the importance of verifying information online.

Political Consultant Indicted for Fake Robocall Impersonating Biden in Presidential Primary

Steven Kramer, a Democratic political consultant, has been indicted on 13 felony charges of voter suppression and 13 misdemeanor charges of impersonating a candidate for allegedly sending thousands of robocalls that imitated President Biden and urged New Hampshire residents not to vote in the Democratic primary. The FCC has also proposed fining Kramer $6 million for using an AI-generated deepfake recording of Biden’s voice, and has proposed a separate $2 million fine against Lingo Telecom for allegedly transmitting the robocalls.
