
How Russian Hackers Leverage ElevenLabs for Multilingual AI Disinformation Campaigns

Investigators have uncovered how ElevenLabs’ AI voice technology has been utilized in Operation Undercut, a Russian campaign aimed at destabilizing Western alliances.


A disinformation campaign known as “Operation Undercut,” attributed to Russia’s Social Design Agency (SDA), has drawn significant attention for its use of advanced AI tools, including ElevenLabs’ voice generation technology.

The operation, analyzed by threat intelligence company Recorded Future, highlights the increasing role of AI in creating credible, multilingual content aimed at influencing public opinion and destabilizing Western unity.

ElevenLabs at the Center of Disinformation Tactics

At the core of Operation Undercut’s strategy is its use of AI-generated voiceovers, with tools like ElevenLabs playing a key role.

ElevenLabs is an AI-driven platform specializing in ultra-realistic text-to-speech and voice cloning technologies. Its services enable users to convert text into lifelike speech across 32 languages, offering a diverse library of voices and the capability to create custom voice clones that capture unique vocal characteristics.

These tools are already widely utilized in various sectors, including audiobooks, video voiceovers, gaming, and accessibility solutions, providing high-quality, human-like audio content for a global audience.
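That multilingual capability is also what makes the technology attractive for influence operations. As a rough illustration of how little effort is involved, the sketch below generates the same short narration in several languages through a commercial text-to-speech REST endpoint of the kind ElevenLabs documents publicly. The API key, voice ID, and model name are placeholders and may not match the current API; treat this as a hypothetical example, not a reproduction of the campaign's actual tooling.

    # Hypothetical sketch: producing one narration in several languages via a
    # commercial text-to-speech REST API (modeled on ElevenLabs' documented
    # /v1/text-to-speech endpoint; credentials, voice ID, and model name below
    # are placeholders and may differ from the live API).
    import requests

    API_KEY = "YOUR_API_KEY"        # placeholder credential
    VOICE_ID = "EXAMPLE_VOICE_ID"   # placeholder voice identifier

    def synthesize(text: str, out_path: str) -> None:
        """Request lifelike speech for `text` and save the returned audio."""
        resp = requests.post(
            f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
            headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
            json={
                "text": text,
                # multilingual model: one voice can speak many languages
                "model_id": "eleven_multilingual_v2",
            },
            timeout=60,
        )
        resp.raise_for_status()
        with open(out_path, "wb") as f:
            f.write(resp.content)   # API returns raw audio bytes (MP3)

    # One short script, three languages: the kind of low-effort multilingual
    # voiceover output the Recorded Future report describes.
    for lang, sentence in {
        "en": "This is a short narrated news summary.",
        "de": "Dies ist eine kurz gesprochene Nachrichtenzusammenfassung.",
        "fr": "Ceci est un bref résumé d'actualité narré.",
    }.items():
        synthesize(sentence, f"voiceover_{lang}.mp3")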

The AI voiceovers, produced in multiple languages such as English, German, and French, allow the campaign to craft content that resonates with diverse audiences while maintaining a semblance of authenticity.

According to Recorded Future, “Operation Undercut uses video content with AI-generated voiceovers and images that reference real articles by Western media outlets but take fragments of information out of context and deceptively repackage the information to support its influence objectives.”

ElevenLabs’ technology, which specializes in creating high-quality, human-like voice synthesis, has been instrumental in enhancing the credibility of Operation Undercut’s content. Recorded Future identified over 1,190 videos with 233 unique transcripts, many of which were likely created using commercial AI tools such as ElevenLabs.

The ability to produce voiceovers indistinguishable from human speech has made these narratives more convincing, bolstering their potential impact.

A Sophisticated Campaign with Unique Characteristics

Unlike Doppelgänger, another SDA-linked operation, Operation Undercut relies on human-operated accounts rather than automated amplification. This shift in tactics reflects an evolution in disinformation strategies, prioritizing direct engagement and the use of original, AI-enhanced content.

Recorded Future notes, “Operation Undercut’s main objective was almost certainly to weaken Western support for Ukraine and exacerbate domestic and international anxieties around the 2024 US elections.”

SDA’s resources, including video editors and graphic designers, have enabled the creation of highly tailored content targeting specific global narratives.

By exploiting socio-political issues such as the Israel-Gaza conflict, the European Union’s internal divisions, and even the Paris Olympics, the campaign amplifies existing tensions to destabilize its targets.

Exploiting Global Narratives with AI

Ukraine is a central theme of Operation Undercut. The campaign seeks to discredit Ukrainian leadership, framing military aid as ineffective and alleging corruption among officials. For example, videos accuse Ukrainian leaders of using foreign aid to purchase luxury properties abroad, fueling mistrust among Western audiences.

Operation Undercut has targeted Ukrainian military and political figures with content designed to undermine their credibility and erode Western support for Ukraine.

Ukrainian military and political figures targeted by Operation Undercut content (Image: Recorded Future)

The campaign frequently accuses officials of corruption, alleging that they have misused foreign military aid and government funds for personal luxuries, including purchasing luxury villas in Europe and the Caribbean.

These narratives are strategically tailored for specific audiences, with many of these accusations presented in French and German to influence public opinion in countries critical to sustaining military aid to Ukraine.

By framing Ukrainian leaders as unreliable and corrupt, the operation seeks to weaken trust in Ukraine’s ability to effectively utilize Western support and foster doubts about continued international backing.

The campaign also connects global conflict resolution to the outcome of the 2024 US presidential election, suggesting that the future of Western military aid hinges on its results. This approach amplifies anxieties both domestically and internationally, creating uncertainty around geopolitical stability.

(Image: Recorded Future)

The Israel-Gaza conflict is another area of focus. The operation disseminates claims of Western governments’ complicity in controversial actions, including accusations that intelligence agencies from the US, UK, and Turkey have supported Hamas.

Additionally, Operation Undercut amplifies narratives surrounding protests in Western countries, such as portraying pro-Palestinian demonstrators in New York as carrying Hamas flags and exaggerating demands in London for an end to arms sales to Israel.

By spotlighting these events, the campaign aims to deepen societal divisions, inflame tensions between governments and their citizens, and cast doubt on the West’s ability to maintain a balanced approach to the conflict.

(Image: Recorded Future)

Simultaneously, it targets the European Union by highlighting fractures among member states, supporting far-right movements, and criticizing Brussels’ sanctioning policies.

Operation Undercut targeted Ursula von der Leyen by labeling her as a “puppet” of the German government and accusing her of influencing Polish policies to the detriment of Poland’s economy.

(Image: Recorded Future)

Even cultural events, such as the 2024 Paris Olympics, become tools for disinformation. Narratives surrounding the Olympics emphasize event security concerns, logistical failures, and broader social controversies, resonating with European audiences.

(Image: Recorded Future)

Tactical Sophistication

The reliance on ElevenLabs and other advanced AI tools underscores the campaign’s sophistication. These technologies enable the creation of content that mimics legitimate media, recontextualizing articles from outlets like The New York Times and Politico to lend false credibility.

Recorded Future observes, “The volume of unique, original content posted by Operation Undercut accounts shows that the network’s operators very likely have access to in-house content creation capabilities, including graphic designers, video editors, and illustrators.”

Localized hashtags and engagement with niche platforms like 9gag, a popular meme-sharing site, extend the campaign’s reach. These efforts are complemented by overlaps with other Russia-linked influence networks, such as CopyCop, which uses generative AI to rewrite and repurpose content from legitimate news outlets, adding layers of complexity and adaptability to the operation.

Despite its sophisticated design, Operation Undercut’s organic engagement remains limited. Recorded Future classifies its activity as Category 2 on the Brookings Institution’s Breakout Scale, reflecting cross-platform activity without significant audience engagement.

However, the use of AI tools like ElevenLabs raises concerns about the scalability of disinformation campaigns, signaling a need for vigilance.

Addressing the Threat

To combat the rising influence of AI-enhanced disinformation, Recorded Future emphasizes the importance of proactive measures. Governments, tech companies, and civil society must collaborate to monitor and counter emerging threats.

By analyzing patterns of impersonation and AI-driven content creation, stakeholders can protect public discourse and mitigate risks associated with advanced disinformation campaigns.

Markus Kasanmascheff
Markus has been covering the tech industry for more than 15 years. He holds a Master's degree in International Economics and is the founder and managing editor of Winbuzzer.com.
