By Rhianwen Watkins, Granite State News Collaborative
Ever since a robocall impersonating the voice of President Biden was transmitted around New Hampshire in January, experts in artificial intelligence and others involved in politics and government have expressed worries about the technology’s impact on upcoming elections.
The New Hampshire robocalls, transmitted on Jan. 21, just two days before the state’s first-in-the-nation presidential primary, used an AI-generated imitation of Biden’s voice to urge voters not to show up at the polls to vote for Biden in the primary, falsely suggesting that voting in the primary would prevent them from casting a ballot in November’s general election.
At the end of the call, a phone number was provided claiming people could dial it to “opt out” of receiving more political calls. The number, however, belongs to Kathy Sullivan, former chair of the state Democratic Party and a well-known party activist who had nothing to do with the call.
“It was upsetting and just infuriating, because obviously, somebody was trying to suppress the vote,” Sullivan said. “Plus it’s a little creepy knowing that somebody was so malicious as to spoof my phone number. It just seemed kind of personal.”
The apparent attempt to suppress voting was traced to a political consultant named Steve Kramer, who commissioned a New Orleans street magician with experience in audio recording to create the robocall and hired two Texas companies to transmit it.
Kramer insisted in interviews with multiple news outlets that his only intention in making the robocall was to inform the public of the dangers of AI. But Sullivan has her doubts.
“If that was true, if he was trying to do something so civic-minded, then why the hell did he spoof my phone number?” Sullivan said. She added that she believes her number was included to disrupt the Biden write-in effort, which she had urged voters to take part in while remaining vocal about her support for the president.
At the time the robocall was made, Kramer was working on the campaign of Dean Phillips, a Democratic congressman from Minnesota who was challenging Biden in the primary. The Phillips campaign paid Kramer over $250,000 for his services, according to Federal Election Commission reports. Phillips’s team has denied any involvement in the robocall.
The state Attorney General’s Office is investigating Kramer and the robocall.
“I'm hoping that soon we'll hear something about them taking legal action against this fellow Kramer who was behind it,” said Sullivan. “The whole thing was such a bizarre experience and it’s not over yet.”
Fast-evolving
According to David Scanlan — who, as New Hampshire’s secretary of state, oversees elections in the state — one of the most difficult aspects of AI’s use in political campaigns is that people are still trying to understand the scope of it and have to play “catch-up” while it continues to quickly evolve.
“I think with what we saw here a few months ago, in New Hampshire, it's pretty clear that Pandora's box is open at this point,” said Jim Merrill, a Republican political consultant with Bernstein Shur who has worked on the presidential campaigns of Mitt Romney and Marco Rubio. “AI is a tool that we're seeing applications for in a variety of industries. So it's no surprise it's beginning to show up in political campaigns, and I think that's only going to increase in the months ahead.”
Merrill said significant responsibility will fall on campaigns and their communications teams to respond quickly to misinformation about the candidates they represent. News outlets will also have to be on alert, he emphasized, and be ready to report quickly on falsehoods as they appear.
Scanlan added that voters also have a responsibility to conduct their own research and discern for themselves what is and isn’t true as the elections draw closer.
“I think for everyone, there is some responsibility here to hopefully minimize the really negative and concerning impacts that AI can have on elections,” Merrill emphasized.
However, Jeremiah Johnson, associate professor of data science at the University of New Hampshire and an AI researcher, warned that it is becoming increasingly difficult to tell what is AI-generated and what isn’t.
“There have been efforts to develop digital watermarks that would identify AI-generated content, but they're not very good, and they can be removed. So it's unclear whether that will provide any sort of solution,” Johnson said. “It just creates an environment where it becomes very difficult to trust anything aside from a flesh-and-blood human standing in front of you.”
Johnson added that AI-generated photos can sometimes be identified by spotting irregularities in the hands or teeth of the people who appear in them.
One example was a fabricated photo that recently circulated depicting former president and Republican nominee Donald Trump surrounded by groups of Black people.
The image, which had no connection to the Trump campaign, appeared to show people with peculiar-looking hands in strange lighting and made the former president’s face look ever so slightly different than usual.
However, said Johnson, a lot of those telling details no longer exist in newer AI images. “The models are already past those issues,” he said. “So it's really a challenging problem.”
Merrill agreed.
“It's likely that we're going to see lifelike and realistic and compelling storylines that are completely fabricated for the benefit or to the detriment of someone or other. And I think we need to be really guarded and thoughtful about letting that influence us,” he said. “It's going to be really important for everybody with a stake in this to just tread very carefully.”
Legislative action
The robocalls set in motion a number of bills regarding AI use that have been moving through the New Hampshire Legislature.
Gov. Chris Sununu signed House Bill 1596 into law Aug. 2. The bill requires “a disclosure of deceptive artificial intelligence usage in political advertising.”
The bill states that any audio or video recording that uses AI must include a spoken statement at the end saying it was made using the technology. For visual media, the disclosure must be in text that is “easily readable by the average viewer and no smaller than the largest font size of other text appearing in the visual media.”
Beyond AI’s use in campaigns, another bill, Senate Bill 564, signed May 31, expanded the ban on child sexual abuse images to include those generated by AI.
Lawmakers also took up SB 464 and HB 1319, bills that would expand the current law prohibiting the spread of nonconsensual sexual images to include those that are synthetic and AI-generated.
SB 464 was referred for interim study on Apr. 11, while HB 1319 received the governor’s signature July 22.
HB 1688 was also signed into law in July. It prohibits state agencies from using AI to “manipulate, discriminate, or surveil members of the public.”
Beyond state legislation, the New Hampshire robocall and the furor it caused prompted the Federal Communications Commission to ban robocalls using AI-generated voices in mid-February.
Responding and adapting
Despite AI’s demonstrated capacity for harm, Scanlan and Merrill said they felt it could hold some benefits as well if used without malicious intent.
“A campaign functions on vast amounts of data,” said Merrill. “It's not difficult for me to see how AI may function to allow campaigns to more carefully craft messages and advertising.”
Scanlan added that it can be a helpful tool in informing voters.
“If voters have questions, you can use AI as a search engine that can give a very quick and accurate response on how the election process works,” he said.
The key, Merrill said, is “making sure that those beneficial applications are promoted.”
However, Johnson said he felt more “skeptical” that such positives would come from AI.
In any case, Scanlan said AI is an inevitable part of the future, and the key is learning how to navigate it.
“Candidates and supporters of candidates are always looking for a way to gain some leverage or get a leg up on their opponent, and sometimes they push the limits. And that has always been true,” said Scanlan. “Now AI is just another opportunity for that to happen.”
Merrill echoed this.
“It's probably the tip of the iceberg of what we might expect in the months and years ahead,” Merrill said. “A smart campaign is going to figure out that this is part of the new reality. We can't stick our head in the sand over it. We need to be ready to address it if and when something happens.”
These articles are being shared by partners in The Granite State News Collaborative as part of its What to Expect When You’re Electing Series. For more information visit collaborativenh.org.