Steve Kramer, a veteran political consultant working for a rival candidate, acknowledged Sunday that he commissioned the robocall impersonating President Joe Biden with artificial intelligence, confirming an NBC News report that he was behind the call.
Kramer expressed no remorse for creating the deepfake, in which an imitation of the president’s voice discouraged participation in New Hampshire’s Democratic presidential primary. The call prompted several law enforcement investigations and provoked an outcry from election officials and watchdogs.
“The evening of Sunday, January 20th, 2 days before the New Hampshire primary, I sent out an automated call to 5,000 most likely to vote Democrats. Using easy to use online technology, an automated version of President Joe Biden’s voice was created,” Kramer said in a statement shared first with NBC News.
Kramer said more enforcement is necessary to stop people like him from doing what he did.
“With a mere $500 investment, anyone could replicate my intentional call,” Kramer said. “Immediate action is needed across all regulatory bodies and platforms.”
Kramer did not say he was directed to make the call by his then-client, the campaign of Biden’s longshot primary challenger, Rep. Dean Phillips, D-Minn. Phillips’ campaign had paid Kramer over $250,000 around the time the robocall went out in January, according to its campaign finance reports.
Phillips and his campaign have denounced the robocall, saying they had no knowledge of Kramer’s involvement and would have immediately terminated him if they had known.
Phillips’ press secretary Katie Dolan said in response to Kramer’s statement Sunday, “Our campaign repeats its condemnation of these calls and any efforts to suppress the vote.”
Phillips’ team hired Kramer in December and January for ballot access work in New York and Pennsylvania, which involves collecting thousands of signatures from voters so a candidate can qualify for the ballot.
Kramer was first linked to the fake Biden robocall Friday by an NBC News report.
Paul Carpenter, a New Orleans street magician, said Kramer hired him to use AI software to create an audio file replicating Biden’s voice reading a script Kramer prepared and provided.
Carpenter was paid $150 for the work, according to Venmo transactions and text messages he shared with NBC News.
Until Carpenter came forward, it was unclear whether the world would ever know who was behind the first-known use of an AI deepfake in a presidential campaign.
Kramer, a get-out-the-vote specialist and president of his own small firm, has worked on dozens of federal, state and local campaigns over the past 20 years. He has worked mostly for Democrats, and his career grew out of his involvement with Young Democrats of America, though his most prominent client was likely Ye, the rapper formerly known as Kanye West, who hired Kramer for his brief 2020 independent presidential campaign.
Carpenter, a nomadic performer, said he met Kramer through a mutual acquaintance last year and was under the impression that Kramer was working for the Biden campaign when he asked Carpenter to create the audio file that eventually became the robocall.
Authorities in New Hampshire are investigating the robocall for potentially violating state laws against voter suppression. A multistate task force of state attorneys general focused on robocalls is looking to crack down on the people involved in the Biden robocall in order to set an early example as the technology becomes more widespread. And the Federal Communications Commission sped up plans to outlaw AI-generated voices in robocalls in response to the Biden robocall.
In his statement, Kramer confirmed that to distribute the calls, he hired the Texas telemarketing company Life Corp., which has been named by investigators as the originator of the calls.
“They had no knowledge of the content of this call prior to delivery,” Kramer wrote. “I’d use them again, but they are done with my business.”
He also confirmed Carpenter’s account that he directed the content of the fake Biden robocall, saying the call was created “using a script of my specific choosing.”
Kramer appears to be trying to spin the situation, arguing his actions will benefit society by provoking stricter guardrails before the technology becomes widespread in campaigns. “Self-policing won’t work,” he wrote.
“Even individuals acting alone can quickly and easily use A.I. for misleading and disruptive purposes,” Kramer added.