Generative AI has many well-documented abuses, from fabricating academic papers to copying artists. And now it appears to be turning up in state-backed influence operations.
According to a recent report by Massachusetts-based threat intelligence firm Recorded Future, one recent campaign was powered by commercial AI voice generation products, including technology from high-profile startup ElevenLabs.
The report describes a Russia-aligned campaign called “Operation Undercut,” aimed at undermining support for Ukraine in Europe, that made prominent use of AI-generated voiceovers in fake or misleading “news” videos.
These videos were aimed at European audiences and pushed themes such as attacking Ukrainian politicians as corrupt and questioning the usefulness of military aid to Ukraine. For example, one video, referring to the devices U.S. tanks use to deflect incoming missiles, claims that “not even jamming devices can save America's Abrams tanks” — suggesting that sending high-tech armor to Ukraine is pointless.
The report states that the video creators “very likely” used audio-generating AI, including ElevenLabs' technology, to make their content appear more legitimate. To test this, researchers at Recorded Future submitted clips to ElevenLabs' own AI audio classifier, a publicly available tool that lets anyone “detect whether an audio clip was created using ElevenLabs,” and got a match.
ElevenLabs did not respond to a request for comment. Recorded Future noted that several commercial AI voice generation tools may have been used, but did not name any besides ElevenLabs.
The usefulness of AI voice generation was inadvertently showcased by the campaign's orchestrators themselves, who rather sloppily published several videos featuring real human narration with “obvious Russian accents.” In contrast, the AI-generated narration was delivered in multiple European languages, including English, French, German, and Polish, without any foreign-sounding accent.
According to Recorded Future, AI also allowed the misleading clips to be released quickly in multiple languages spoken across Europe, including English, German, French, Polish, and Turkish (all of which, incidentally, ElevenLabs supports).
Recorded Future attributes the operation to the Social Design Agency, a Russia-based organization that the U.S. government sanctioned in March of this year for operating “a network of over 60 websites masquerading as genuine news organizations in Europe,” which then used fake social media accounts to amplify the spoofed websites' misleading content. At the time, the U.S. State Department said all of this was done “on behalf of the Government of the Russian Federation.”
Recorded Future concluded that the campaign's overall impact on European public opinion was minimal.
This is not the first time ElevenLabs' products have been accused of misuse. A voice fraud detection company concluded that its technology was behind robocalls impersonating President Joe Biden that urged voters not to turn out during a January 2024 primary, Bloomberg reported. In response, ElevenLabs said it released new safety features, including automatically blocking the voices of politicians.
ElevenLabs prohibits “unauthorized, harmful, or deceptive impersonation” and says it uses a variety of tools, including both automated and human moderation, to enforce this.
ElevenLabs has experienced explosive growth since its founding in 2022. TechCrunch previously reported that the company's ARR recently grew to $80 million from $25 million a year earlier, and that it could soon be valued at $3 billion. Its investors include Andreessen Horowitz and former GitHub CEO Nat Friedman.