FBI, cyber-cops zap ~1K Russian AI disinfo Twitter bots

The FBI and cybersecurity agencies in Canada and the Netherlands say they have taken down an almost 1,000-strong Twitter bot farm set up by Russian state-run RT News that used generative AI to spread disinformation to Americans and others.

The international crime busters seized two web domains and 968 X accounts that were integral parts of the bot farm, the US Department of Justice said in a statement Tuesday.

We're told the X accounts were used to "sow discord in the United States and elsewhere," in hope of causing people to squabble among themselves and distrust one another, which would suit the Kremlin.

In taking down the operation, the FBI worked with America's Cyber National Mission Force (CNMF), the Canadian Centre for Cyber Security (CCCS), the Netherlands General Intelligence and Security Service (AIVD), the Netherlands Military Intelligence and Security Service (MIVD), and the Netherlands Police.

Additionally, X worked with the Feds to suspend accounts that were said to be part of the misinformation ring.

US prosecutors say the Twitter bot farm was formed by the RT News Network – aka Russia Today, a state-run media network and propaganda outlet for the Kremlin – and then operated by Russian counter-intelligence. In particular, the Feds say "Individual A," described by the Americans as "the deputy editor-in-chief at RT," was allegedly responsible for setting up the bot farm in early 2022.

Individual A is accused of overseeing efforts to develop software that could spread disinformation "on a wide-scale basis" via the bots. To this end, the journo allegedly recruited a Russian developer, referred to as "Individual B," who also has not yet been named. An affidavit [PDF] authored by an FBI agent, which lists all the now-taken-down bot accounts, claimed the accused programmer went to great lengths to conceal their identity, creating at least three fictional personas.

The affidavit claims Individual B began building out the bot farm by purchasing an American domain name from Namecheap with Bitcoin in April 2022.

In early 2023, a Federal Security Service (FSB) agent started using the bot farm with approval from the Russian government, which also offered financial support, it is claimed. The operation then expanded to involve multiple FSB agents.

Bot farm assisted by generative AI, according to Feds

It's not stated exactly when the bot farm began operating in earnest, but according to the affidavit covering the seizure of the 968 X accounts, the oldest of them were created in June 2022.

According to an advisory [PDF] written by the agencies that took the bot farm down, RT's operation allegedly used software termed Meliorator.

"Meliorator was designed to be used on social media networks to create 'authentic' appearing personas en masse," the advisory claims, "allowing for the propagation of disinformation, which could assist Russia in exacerbating discord and trying to alter public opinion as part of information operations."

At the time the bot farm was shut down, the software only worked with X, though the Feds suspect there were plans to make it compatible with other platforms in the future.

Meliorator interfaced with another piece of software, called Taras, which created and automatically operated the fake accounts by using what are termed "souls" and "thoughts," it's claimed. An account would be created with a randomly generated soul that defined the personality of the bot's output, and then assigned thoughts, which directed the bot to perform actions and interact with other accounts. This is all supposedly powered by generative AI models.
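The advisory doesn't publish any of this code, but the split it describes is a familiar pattern: a persona object that seeds prompts to a language model, plus a stream of directives telling the bot what to do with them. Below is a minimal, hypothetical Python sketch of that soul-and-thought structure; every name and field here is our assumption for illustration, not Meliorator's or Taras's actual internals.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Soul:
    """Hypothetical persona record: the fixed traits that shape a bot's output."""
    handle: str
    location: str
    bio: str
    leaning: str  # the political slant the persona is meant to push

@dataclass
class Thought:
    """Hypothetical directive: a single action the bot is told to carry out."""
    action: str                   # e.g. "post", "like", "reply"
    topic: str
    target: Optional[str] = None  # another account to interact with, if any

def build_prompt(soul: Soul, thought: Thought) -> str:
    """Assemble the text that would be handed to a generative model.
    A real pipeline would send this to an LLM; here we only build the string."""
    return (
        f"You are {soul.handle} from {soul.location}. Bio: {soul.bio}. "
        f"You lean {soul.leaning}. "
        f"Write a short post that carries out this instruction: "
        f"{thought.action} about {thought.topic}."
    )

if __name__ == "__main__":
    soul = Soul(handle="@plainsfolk_dale", location="Minneapolis",
                bio="Retired trucker, dog dad", leaning="anti-establishment")
    print(build_prompt(soul, Thought(action="post", topic="distrust of the mainstream media")))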

The Feds said they found three general forms of soul: one featuring a completely filled-out profile, with a name, personal background info, and photos generated by an open source AI tool that fabricates fake data, used to vocally spread political opinions; another with very little info at all, used mainly to like other bots' posts; and a third designed to look very real, amass followers, and signal-boost content posted by both bots and genuine accounts.
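Those three archetypes boil down to a small taxonomy. Here's a hypothetical labelling of them, purely for illustration; the names are ours, not the software's.

from enum import Enum, auto

class SoulArchetype(Enum):
    """Illustrative labels for the three persona types described in the advisory."""
    OPINIONATED = auto()  # fully fleshed-out profile with AI-generated photo; loudly posts political takes
    AMPLIFIER = auto()    # near-empty profile; exists mainly to like other bots' posts
    INFLUENCER = auto()   # built to look genuine, collect followers, and boost bot and human content alike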

The bot farm implemented various strategies to avoid getting caught: having the bots follow big accounts with more than 100,000 followers as well as accounts whose opinions matched the corresponding soul, assigning each soul its own proxy IP address, and bypassing X's bot-detection measures.
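The per-soul proxy detail is the one evasion step concrete enough to sketch: give every persona its own exit address so no two accounts share traffic. Here's a minimal illustration of that idea in Python, assuming the third-party requests library and a made-up proxy pool; none of this reflects the actual tooling.

import requests  # third-party HTTP library: pip install requests

# Hypothetical mapping of persona handles to dedicated proxy endpoints, so no two
# accounts ever appear to come from the same IP address.
PROXY_POOL = {
    "@plainsfolk_dale": "http://203.0.113.10:8080",
    "@midwest_marge":   "http://203.0.113.11:8080",
}

def session_for(handle: str) -> requests.Session:
    """Return an HTTP session pinned to the proxy assigned to one persona."""
    proxy = PROXY_POOL[handle]
    session = requests.Session()
    session.proxies.update({"http": proxy, "https": proxy})
    return session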

However, the bot farm made what was probably a big mistake by signing up its 968 X accounts through the two domains it owned, which likely made it fairly easy for the Feds and X to find them all, it's said.

"The Justice Department and our partners will not tolerate Russian government actors and their agents deploying AI to sow disinformation and fuel division among Americans," said US Deputy Attorney General Lisa Monaco.

"As malign actors accelerate their criminal misuse of AI, the Justice Department will respond and we will prioritize disruptive actions with our international partners and the private sector. We will not hesitate to shut down bot farms, seize illegally obtained internet domains, and take the fight to our adversaries."

The Register reached out to RT News to ask whether the allegations were true, and the Russian broadcaster merely told us: "Farming is a beloved pastime for millions of Russians." ®
