on useful idiots
In 2016, the Russian Federation famously spent roughly $100,000 on Facebook advertisements in an attempt to influence American voters. The outcomes were mixed; some ads were so poorly conceived and riddled with grammatical errors that even supercharging their reach with an ad buy couldn’t get them off the ground. Others hit their targets, generating engagement and new page followers and helping the Russian government build community around hot-button issues Americans were already discussing, including race and immigration.
Some ads promoted content from Williams and Kalvin, a pair of Black men who, in heavy Nigerian accents, claimed to be from Atlanta, shared content denigrating prominent Democratic politicians, and encouraged Black voters to stay home on Election Day. Despite the ad boosts, Williams and Kalvin never really took off, and in 2017, amid the broader revelations of Russian interference, they were revealed to be contractors for the Internet Research Agency, hired to launder Russian narratives to unwitting American audiences.
Eight years later, Russia can no longer buy ads in rubles, and social media platforms, voters, and news organizations have a better awareness of foreign influence campaigns and of disinformation more broadly. But the Kremlin hasn’t stopped information laundering; instead, it has refined it.
This week, the Department of Justice unsealed an indictment that shows how sophisticated these operations have become. The indictment alleges that rather than hiring unknowns with no following and building an audience from scratch, RT, the Russian government’s foreign propaganda network, set up a clever scheme: it identified a US-based company willing to act as a conduit between itself and authentic American influencers with built-in, highly engaged audiences. RT then paid and gave editorial guidance to the company, which asked the influencers to make videos on hot-button topics that amplified political polarization; the commentators seemingly agreed.
The Russians chose wisely; the company they contracted, TENET Media, based in Tennessee and set up for the express purpose of this contract, was allegedly able to establish relationships with two influencers with YouTube audiences of more than 2.4 million and 1.3 million subscribers, respectively: conservative commentators Dave Rubin and Tim Pool. Over the course of the relationship, RT paid TENET nearly $10 million to facilitate these and four other partnerships, with the influencers producing hundreds of videos over the past year-plus. The indictment alleges they earned hundreds of thousands of dollars per month under this arrangement.
Rubin and Pool released statements lamenting that they had been “deceived” and victimized by the Russians, but they failed to perform basic due diligence on the funding for the TENET project. When one of the influencers requested more information about the false persona RT claimed was funding the work, he was provided with a laughable “profile” featuring a glamour shot of a man on a private plane, errant capital letters, strange phrasing, missing spaces, and a number of conservative buzzwords. “Eduard Grigoriann,” the profile alleged, was focused on “championing free speech,” had observed “multiple instances of misrepresentations and bias in mainstream media,” and had acquired an “alternative perspective on world events” that he hoped TENET Media would support. Of course, Google searches for “Eduard Grigoriann” and his alleged work history yield no results, but no matter: that was all these influencers needed to sign on the dotted line.
Since it started operating, TENET and its influencers have produced content on many hot-button issues in American society, from abortion rights to trans healthcare to Ukraine to immigration. It even weighed in on me, with TENET commentator Matt Christensen falsely claiming that my job leading a disinformation policy group for the Department of Homeland Security was tantamount to censorship, and that the harassment and death threats I have received as a result of that false narrative being spread “[we]ren't nearly mean enough because if you take our money to try to control our views, yeah, you should be called a [long string of gendered, sexualized expletives].” He told me, in this likely Russian propaganda video for which he was allegedly paid handsomely, that I should “go monitor [my] baby instead of trying to monitor the entire internet,” and signed off by claiming that the non-profit I run was “a way to get paid to whine about it.” Then he implored his audience to “check out all of TENET's social media presence for more great original content.”
Clearly, these influencers, who enjoy a combined audience of millions of engaged internet users, need a lesson in “know your customer” and a healthy dose of information literacy. But what about the rest of us? In a world where the Russian government is contracting useful idiots to drive polarization in democratic societies, what hope is there for average citizens, particularly as social media platforms roll back their efforts to address online harms?
I’ve been discussing the pernicious threat of information laundering for years. In my 2020 book How to Lose the Information War, I wrote: “Russia’s influence campaigns in the United States manipulate local actors to deliver a divisive message, increasing its viability, believability, and making the problem much more challenging to solve. These homegrown actors...amplify discord and emphasize that simply deleting fake accounts and posts” (or, as I like to call it, playing “whack-a-troll”) cannot win the information war. Bolstering individual information literacy, however, has a large role to play.
The single most important rule for any consumer of online information to remember between now and November, as the United States navigates an increasingly complex information environment ahead of a crucial election, is not a trick for telling AI-generated content from organic content, or a Russian shill from an American influencer. It is this: the more enraging the content, the more engaging the content. The commentators RT paid recognized this, trafficking in rage bait and weaponizing pre-existing fissures in our society to drive clicks and revenue.
A recent study confirmed that learning about emotional manipulation helps individuals recognize disinformation: researchers from the Universities of Cambridge and Copenhagen found that study participants who were primed with the knowledge that disinformers use emotional manipulation to spread falsehoods were better able to judge the veracity of content.
So with the most recent revelations of Russian electoral interference co-opting American influencers, the correct response isn’t to distrust everything, expecting a Russian bear behind every post. Instead, consume online content more deliberately, and when you feel yourself getting emotional, stop and consider who might be manipulating you and why. The person behind the post might be funded by a foreign government seeking to sow discord in our society, or they might simply be an influencer seeking to elicit a response and boost their subscriber base and income. Neither is looking out for your interests; neither is looking out for the truth.