
Voters got a call from Joe Biden telling them to skip the New Hampshire primary. It was fake.


MANCHESTER, N.H. – The New Hampshire Attorney General’s office is investigating a fake robocall that uses President Joe Biden’s voice to try to dissuade Democratic voters from participating in the state’s primary on Tuesday.   

Multiple voters received the artificially generated message on Sunday night, telling them: “Your vote makes a difference in November, not this Tuesday.” The call also included the personal phone number of Kathy Sullivan, a key leader in the effort encouraging Democrats to write in Biden’s name on the ballot.

"We know there are anti-democratic forces out there who are terrified by the energy of this grassroots movement to stop Donald Trump, but New Hampshire voters will not stand for any efforts to undermine our right to vote,” Sullivan said in a statement about the deepfake.  

A former New Hampshire Democratic Party chair, Sullivan is now running a super PAC supporting the broader write-in effort to help reelect Biden. She was one of the first Democratic leaders in the state to rally around the president after he announced last fall that he would not appear on the primary ballot.

In a phone call with USA TODAY, Sullivan said she felt “disgust” and “disbelief” over the fake robocall and her personal information being leaked.


"Anyone who did this, if they think they're a patriot, you're not," Sullivan said. "You're not a good American, because you're trying to interfere with the fundamental processes of what makes our country different and better." 

The source of the fake call remains unclear. Sullivan said the culprit appears to have contacted a "very random" list of phone numbers, including the Manchester hockey arena where former President Donald Trump held a rally Saturday night.

Sullivan has been in contact with the state attorney general’s office, and the Biden campaign said it is “actively discussing additional actions” it can take to help voters.

But the incident could point to a broader problem. Miles Taylor, a former senior Department of Homeland Security official, said he and other cybersecurity experts have been bracing for the malicious use of deepfakes in the 2024 presidential election.

Deepfakes are videos or images that have been digitally created or changed with artificial intelligence or other technology.

“We’ve been working with US officials on the expected surge in deepfakes,” Taylor said in a post on X, formerly known as Twitter. “This is just the beginning.” 


The threat of deepfakes

The misuse of deepfakes during an election has long been a concern of U.S. government and private sector security officials, even before they started showing up during campaign seasons. 

Last June, Florida Gov. Ron DeSantis’ campaign reportedly used images of Trump embracing Dr. Anthony Fauci in a campaign video that forensic experts said were almost certainly realistic-looking deepfakes generated by artificial intelligence, the USA TODAY Network reported at the time. 

A month before that, Sen. Richard Blumenthal, D-Conn., launched a Senate Judiciary Committee hearing into the potential pitfalls of deepfakes by playing an AI-generated recording that mimicked his voice and read a ChatGPT-generated script. 

“If you were listening from home, you might have thought that voice was mine and the words from me,” the real Blumenthal said in revealing the deepfake, warning that the technology could be game-changing in terms of “the proliferation of disinformation, and the deepening of societal inequalities.”

One former Department of Homeland Security cyber official warned that the fake Biden call in New Hampshire could become the new normal given the rapid advances in technology and the lack of comprehensive government and private sector oversight.

“Obviously, there are risks with AI, particularly in the political sphere where there is a winner-takes-all issue like an election,” said the former official, who spoke on the condition of anonymity because of their current role with a social media company involved in protecting against deepfakes.

“That’s where you are going to see the most targeted and arguably most insecure uses of AI because the incentives for the players are absolutely to sort of kill the other guy. So you see them innovating quickly and incorporating new technologies,” the former Homeland Security official said. “The problem is right now, we don't have all the answers about how to secure AI, or how to use AI for security. I think everyone is trying to get their legs under them.”

New Hampshire state authorities investigating

The New Hampshire Attorney General’s Office in a statement confirmed that it has received complaints regarding the recorded message and launched an investigation.

“Although the voice in the robocall sounds like the voice of President Biden, this message appears to be artificially generated based on initial indications,” the statement from Attorney General John M. Formella said.

“These messages appear to be an unlawful attempt to disrupt the New Hampshire Presidential Primary Election and to suppress New Hampshire voters,” Formella added. “New Hampshire voters should disregard the content of this message entirely. Voting in the New Hampshire Presidential Primary Election does not preclude a voter from additionally voting in the November General Election.”

The Granite State official also said people who received the call were encouraged to send an email to the state Department of Justice Election Law Unit with details about the date and time they received the call or message, its origin and its content. The investigation remains ongoing.