
ElevenLabs reportedly banned the account that deepfaked Biden's voice with its AI tools

A robocall impersonating the president went out to voters in New Hampshire.


ElevenLabs, an AI startup that offers voice cloning tools, has banned the user who created an audio deepfake of Joe Biden used in an attempt to disrupt the upcoming elections, according to Bloomberg. The audio impersonating the president was used in a robocall that went out to some voters in New Hampshire last week, telling them not to vote in their state's primary. It initially wasn't clear what technology had been used to copy Biden's voice, but a thorough analysis by security company Pindrop showed that the perpetrators used ElevenLabs' tools.

The security firm removed the background noise and cleaned up the robocall's audio before comparing it to samples from more than 120 voice synthesis technologies used to generate deepfakes. Pindrop CEO Vijay Balasubramaniyan told Wired that it "came back well north of 99 percent that it was ElevenLabs." Bloomberg says ElevenLabs was notified of Pindrop's findings and is still investigating, but it has already identified and suspended the account that made the fake audio. The company told the news organization that it can't comment on the issue itself, but that it's "dedicated to preventing the misuse of audio AI tools and [that it takes] any incidents of misuse extremely seriously."

The deepfaked Biden robocall shows how technologies that can mimic somebody else's likeness and voice could be used to manipulate voters ahead of this year's presidential election in the US. "This is kind of just the tip of the iceberg in what could be done with respect to voter suppression or attacks on election workers," Kathleen Carley, a professor at Carnegie Mellon University, told The Hill. "It was almost a harbinger of what all kinds of things we should be expecting over the next few months."

It only took a few days after ElevenLabs launched the beta version of its platform for people to start using it to create audio clips that sound like celebrities reading or saying something questionable. The startup allows customers to use its technology to clone voices for "artistic and political speech contributing to public debates." Its safety page does warn users that they "cannot clone a voice for abusive purposes such as fraud, discrimination, hate speech or for any form of online abuse without infringing the law." But clearly, it needs to put more safeguards in place to prevent bad actors from using its tools to influence voters and manipulate elections around the world.