Disinformation in the age of social media: Renée DiResta’s Black Hat 2020 keynote
Black Hat USA is a major event in the world of information security. This year, due to the ongoing pandemic, the conference went virtual — and the organizers offered open access to many of the most important talks and sessions.
Among these were two timely keynote addresses that deal with issues around election security, one of which was delivered by Renée DiResta. DiResta is the Research Manager at Stanford Internet Observatory, an interdisciplinary program that studies the misuse of information technologies. Her work focuses on the way in which narratives spread through social media and media networks. Her keynote, entitled “Hacking Public Opinion”, explored the ways in which nation-state actors seek to influence public opinion through social media.
Hacking public opinion
DiResta’s talk began by exploring how state-sponsored information operations have evolved over time. She notes that in today’s social media-dominated landscape, disinformation has a far greater chance of “going viral” than in the past. In addition, the very nature of the major social media platforms — whose feeds are algorithmically curated to maximize engagement — is something that can now be exploited by state actors who know how to “game” the algorithms. All of this, she says, amounts to a fundamental shift from the old Cold War model of information operations: from merely shaping public opinion to hacking public opinion.
But while the methods may have changed, the goals are largely the same: distract the public, persuade them to think or feel a certain way, encourage the creation of social groups with deeply entrenched identities, and cause division by exploiting these social divides.
To accomplish their aims in an age of social media, nation states have adopted a full-spectrum approach. They produce content designed to go viral and distribute it through a variety of channels. This may be done through state-sponsored news agencies like RT and China Daily, or through content farms that echo the state narrative but can’t be directly tied back to a foreign government. In addition, there are also completely unattributable content producers such as fake “journalists”, conspiracy theory websites, and fraudulent social media pages.
Once content is seeded to social media networks, state actors begin a campaign of coordinated amplification, using fake personas on the social media platform to boost engagement — and to increase the chances of getting organic likes and shares from actual users. Nation-state information operatives may even be able to succeed in getting major news organizations to pick up their narratives.
China vs Russia
Many countries engage in information operations, but their aims are not always the same — nor are their capabilities.
DiResta first examined the case of China, noting that its information operations are primarily focused on persuasion: on improving public perception of the country abroad, or “telling China’s story well”.
One prominent example of this is China’s so-called “50 Cent Army”, a network of fake Internet commenters, sponsored by Beijing, who are tasked with pushing the government’s preferred narratives on social media platforms. A recent example of their operations comes from their Twitter activity during the 2019 Hong Kong protests. Journalists reporting on the protests, and even ordinary users who had expressed sympathy with the protesters, would frequently find themselves on the receiving end of multiple comments from Chinese troll accounts telling them that they’d been misinformed, or that they were misreporting the story. These accounts claimed to be ordinary Hong Kong locals but, of course, were anything but that.
More recently, in response to criticism over their initial handling of COVID-19, Chinese media properties have been running stories praising China’s response to the pandemic, and downplaying any missteps. DiResta notes that many of these media outlets have large Facebook pages, even though Facebook is banned in mainland China! This indicates that the stories they run are not intended for domestic consumption at all, but are instead designed to shape public opinion abroad. Once published to Facebook, narrative-friendly news stories are boosted using the platform’s native ad tools, in the hopes of producing organic engagement. In addition to its state-controlled media outlets and legions of social media trolls, China also uses a number of other channels to drive their narrative, including the social media accounts of prominent government officials as well as less-than-legitimate outlets like conspiracy theory sites.
Nevertheless, China’s information operations have met with mixed success. An analysis of the dataset from Twitter’s June 2020 takedown of 23,800 fake accounts found that 92% of these accounts had fewer than 10 followers. And despite tweeting a collective 350,000 times, the accounts averaged only 0.81 engagements per tweet — meaning that most tweets didn’t receive any interaction at all. Part of the reason for this may be that the social media fakery was fairly low-effort: bot accounts were created on the same day and at the same time, using similar names, stock profile photos, and extremely basic bios.
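As a back-of-the-envelope check on figures like these, average engagement is simply total engagements divided by total tweets. A minimal sketch (the helper name is illustrative, and the total-engagement figure is inferred from the per-tweet average cited above, not taken from the dataset itself):

```python
def avg_engagement(total_engagements: int, total_tweets: int) -> float:
    """Average engagements (likes, retweets, replies) per tweet."""
    if total_tweets == 0:
        return 0.0
    return total_engagements / total_tweets

# ~350,000 tweets at ~0.81 engagements per tweet implies roughly
# 283,500 total engagements across the entire takedown dataset.
print(round(avg_engagement(283_500, 350_000), 2))  # 0.81
```

An average below 1 is consistent with DiResta’s point: when most accounts have almost no followers, the bulk of their tweets land on no one’s feed and draw zero interactions.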
In light of these findings, DiResta thinks that it is important to take a balanced, objective look at the actual impact of foreign information actors, and not to overstate their effectiveness. She warns that sensationalized reporting of the threat posed by certain state actors may even help them, because it makes them appear more powerful than they really are.
Russia, however, is a far more sophisticated state actor, and is generally much more successful than China at driving social media engagement. They also employ distinct tactics, including direct hacks on public officials and institutions — hacks which are then used in their social media information operations. A prime example of this comes from the 2016 U.S. presidential election, when Clinton campaign chair John Podesta’s emails were hacked in a spear phishing attack, and then subsequently leaked to the media, providing fodder for a number of divisive discourses and conspiracy theories. In addition, Russian operatives seem far more interested than their Chinese counterparts in recruiting local people to serve their ends: in finding ways to turn their audiences into active participants in their information operations.
Perhaps even more worrying is the ultimate goal of the information operations sponsored by Russian military intelligence: to intentionally deepen divisions within U.S. society in an attempt to destabilize it. DiResta notes that Russia’s Internet Research Agency (a shadowy organization with ties to Russian intelligence agencies) creates divisive viral content targeted at both the political Left and the political Right in the United States — indicating that their true aim has nothing to do with partisan politics: They’re simply attempting to sow discord between Americans and create disruption in American society.
The 2020 U.S. elections
DiResta closed by reflecting on how the information operations of a foreign adversary could pose a danger to the upcoming U.S. elections.
First, she says, there is the possibility of new “hack and leak” operations, in which U.S. politicians or organizations are breached, and the contents of the breach are leaked to the media. Such actions, says DiResta, have the power to significantly affect press cycles during campaigns.
Secondly, there is a direct risk to electronic voting machines (there are already indications that these machines were the target of attempted incursions during the 2016 elections). DiResta points out that state-sponsored hackers wouldn’t even need to successfully hack a voting machine to cause disruption: If they simply claimed that they’d infiltrated the election system, they could produce chaos.
Finally, sophisticated foreign actors may be able to use Americans to further their aims — either by paying unwitting locals to boost their own content, or by amplifying social media posts produced by domestic fringe elements.
The goal of such actions would be to exploit underlying divisions in the U.S., as well as the American public’s lack of trust in their own government, in order to undermine confidence in the 2020 elections.
DiResta ended her talk with a call for professional collaboration: “We need to increase communication between infosec professionals and information operations researchers, with a goal of developing a better understanding of how social network manipulation intersects with network infiltration — in service to predicting and mitigating these attacks”.
Renée DiResta’s keynote is an important reminder that foreign adversaries are actively engaged in attempts to divide U.S. society — and that many of the news stories, videos, and memes shared on social networks are not what they appear to be. Before “liking and sharing” a divisive piece of content, or taking at face value a provocative post in our Facebook feed, we would do well to remember this.