With a Biden voice deepfake, AI wades into NH voter suppression
A robocall impersonating President Biden urged voters to skip the New Hampshire primary and "save your votes for November." Experts say AI deepfakes such as this one can blur the concept of truth, posing a deeper challenge to democracy.
The New Hampshire attorney general's office on Jan. 22 said it was investigating reports of an apparent robocall that used artificial intelligence to mimic President Joe Biden's voice and discourage voters in the state from coming to the polls during the Jan. 23 primary election.
Attorney General John Formella said the recorded message, which was sent to multiple voters on Sunday, appears to be an illegal attempt to disrupt and suppress voting. He said voters "should disregard the contents of this message entirely."
A recording of the call reviewed by The Associated Press features a voice similar to Mr. Biden's and employs his often-used phrase, "What a bunch of malarkey." It then tells the listener to "save your vote for the November election."
"Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again," the voice mimicking Mr. Biden says. "Your vote makes a difference in November, not this Tuesday."
It is not true that voting in the Jan. 23 primary precludes voters from casting a ballot in November's general election. Mr. Biden is not campaigning in New Hampshire, and his name will not appear on Tuesday's primary ballot after he elevated South Carolina to the lead-off position for the Democratic primaries, but his allies are running a write-in campaign for him in the state.
Gail Huntley, a Democrat in Hancock, New Hampshire, who plans to write in Mr. Biden's name on Jan. 23, said she received the call at about 6:25 p.m. on Jan. 21.
She instantly recognized the voice as belonging to Mr. Biden but quickly realized it was a scam because what he was saying didn't make sense. Initially, she figured his words were taken out of context.
"I didn't think about it at the time that it wasn't his real voice. That's how convincing it was," she said, adding that she is appalled but not surprised that AI-generated fakes like this are spreading in her state.
White House press secretary Karine Jean-Pierre confirmed Jan. 22 that the call "was indeed fake and not recorded by the president." Mr. Biden's campaign manager, Julie Chavez Rodriguez, said in a statement that the campaign is "actively discussing additional actions to take immediately."
Mr. Biden and Vice President Kamala Harris will share the stage on Jan. 23 in Virginia as they campaign for abortion rights, a top issue for Democrats in an election expected to feature a rematch with Donald Trump, the former Republican president.
The apparent attempt at voter suppression using rapidly advancing generative AI technology is one example of what experts warn will make 2024 a year of unprecedented election disinformation around the world.
Generative AI deepfakes have already appeared in campaign ads in the 2024 presidential race, and the technology has been misused to spread misinformation in multiple elections around the world over the past year, from Slovakia to Indonesia and Taiwan.
"We have been concerned that generative AI would be weaponized in the upcoming election and we are seeing what is surely a sign of things to come," said Hany Farid, an expert in digital forensics at the University of California, Berkeley, who reviewed the call recording and confirmed it is a relatively low-quality AI fake.
As AI technology improves, the federal government is still scrambling to address it. Congress has yet to pass legislation seeking to regulate the industry鈥檚 role in politics despite some bipartisan support. The Federal Election Commission is weighing public comments on a petition for it to regulate AI deepfakes in campaign ads.
Though the use of generative AI to influence elections is relatively new, "robocalls and dirty tricks go back a long ways," said David Becker, a former U.S. Department of Justice attorney and election law expert who now leads the Center for Election Innovation and Research.
He said it's hard to determine whether the main intent of the New Hampshire calls was to suppress voting or simply to "continue the process of getting Americans to untether themselves from fact and truth regarding our democracy."
"They don't need to convince us that what they're saying, the lies they're telling, are true," he said. "They just need to convince us that there is no truth, that you can't believe anything you're told."
Katie Dolan, a spokeswoman for the campaign of Rep. Dean Phillips of Minnesota, who is challenging Mr. Biden in the Democratic primary, said Mr. Phillips' team was not involved and only found out about the deepfake attempt when a reporter called seeking comment.
"Any effort to discourage voters is disgraceful and an unacceptable affront to democracy," Ms. Dolan said in a statement. "The potential use of AI to manipulate voters is deeply disturbing."
This story was reported for The Associated Press.