The U.S. Department of Justice (DOJ) seized 32 Russian web domains it alleges have been used in a “Russian government-directed” influence campaign to interfere with American and other foreign elections.
According to a statement released by the DOJ on Wednesday (Sept. 4), Russian companies Social Design Agency (SDA), Structura National Technology (Structura), and ANO Dialog have been operating under the direction of the Russian government to “covertly spread Russian government propaganda with the aim of reducing international support for Ukraine, bolstering pro-Russian policies and interests, and influencing voters in U.S. and foreign elections, including the U.S. 2024 Presidential Election.”
Attorney General Merrick B. Garland said, “The sites we are seizing today were filled with Russian government propaganda that had been created by the Kremlin,” adding that “Our actions today make clear that the Justice Department will be aggressive in countering and disrupting attempts by the Russian government, or any other malign actor, to interfere in our elections and undermine our democracy.”
Today, the Justice Department announced the ongoing seizure of 32 internet domains used in Russian government-directed foreign malign influence campaigns colloquially referred to as “Doppelganger,” in violation of U.S. money laundering & criminal trademark laws. pic.twitter.com/zSaxVqrLyS
— U.S. Department of Justice (@TheJusticeDept) September 4, 2024
DOJ claims other Russian actors were working with influencers
Also on Wednesday, the DOJ indicted two Russian nationals employed by RT (formerly Russia Today, a Russian state-backed media outlet) for their part in “a $10 million scheme to create and distribute content to U.S. audiences with hidden Russian government messaging.”
The two allegedly financed a Tennessee-based content-creation company that has posted nearly 2,000 videos since late 2023 to platforms such as YouTube, TikTok, Instagram, and X, garnering over 16 million views. The videos focused on U.S. domestic issues such as immigration and inflation, and expressed views designed to “amplify domestic divisions in the United States.”
The indictment claims a media company associated with six conservative influencers, including prominent figures like Tim Pool, Dave Rubin, and Benny Johnson, was funded to produce videos on topics supporting the Kremlin’s interests, reported the Associated Press. Pool and Johnson were quick to release statements denying any wrongdoing.
My statement regarding allegations and the DOJ Indictment
Should these allegations prove true, I as well as the other personalities and commentators were deceived and are victims. I cannot speak for anyone else at the company as to what they do or to what they are instructed…
— Tim Pool (@Timcast) September 4, 2024
“The Justice Department will not tolerate attempts by an authoritarian regime to exploit our country’s free exchange of ideas in order to covertly further its own propaganda efforts, and our investigation into this matter remains ongoing,” said Attorney General Merrick B. Garland.
“Covert attempts to sow division and trick Americans into unwittingly consuming foreign propaganda represents attacks on our democracy,” said FBI Director Christopher A. Wray.
Worldwide concerns over election interference
Earlier this year, a report from Microsoft alleged that China was using AI tools to interfere in foreign elections, specifically elections in Taiwan.
Deepfake videos have become an increasing concern in political conversations. UK politician James Cleverly said earlier this year, “The era of deepfake and AI-generated content to mislead and disrupt is already in play, increasingly today the battle of ideas and policies takes place in the ever-changing and expanding digital sphere.”
U.S. lawmakers have begun to enact laws to limit the impact deepfakes can have on U.S. elections, though many of the new regulations will only come into play in the 90 days before an election due to concerns about violating First Amendment rights.
This has led some AI companies to develop safeguards limiting how their models engage with election- and politics-related queries. These include Anthropic, which has introduced “Prompt Shield” to redirect Claude AI users to a site that provides nonpartisan political information, and Google’s Bard, which will restrict the election-related prompts it responds to.
Featured image credit: generated by Ideogram