
Stop Playing Whack-a-Troll: Building Resilience to Disinformation

“People need to be at the heart of the response to disinformation,” said Nina Jankowicz, Disinformation Fellow at the Wilson Center's Science and Technology Innovation Program. During the recent book launch event hosted virtually by the Wilson Center, Jankowicz delved into a strategy for United States resilience to disinformation with Asha Rangappa, Senior Lecturer at Yale University's Jackson Institute for Global Affairs, and Matthew Rojansky, Director of the Wilson Center's Kennan Institute.

The disinformation threat to the upcoming 2020 election and the infodemic surrounding COVID-19 give urgency to securing the integrity of the U.S. digital information space. But as the panelists showed, immunizing against disinformation is not as simple as playing “whack-a-troll” with emerging threats. Instead, the panelists urged a national self-reflection, to examine why the populace is susceptible to disinformation, where regional understanding can bolster foresight, and what role platform policies play. The path to resiliency against disinformation is to build a sustainable information ecosystem, and fronting this defense, the panelists propose, is an educated populace, prepared for the digital space and supported by a strong media. 

Look to Our Own Vulnerabilities

Disinformation is a nuanced form of hybrid warfare that acts with a complexity outside Americans’ perception, the panel agreed. “It’s not necessarily about changing votes; it’s about distracting us and creating that discord,” Jankowicz cautioned. "Often Americans talk about fake news as if it is stuff that is just purely cut-and-dry fake….The best disinformation is grounded in real, visceral feelings, and the most successful operations use these homegrown actors in order to get them out there."

Weaponizing culture is a key component of disinformation. The panel drew on examples -- from pineapple pizza to historical case studies -- of how key points of cultural dissent have been used as part of strategic disinformation campaigns. “Understanding your enemy at the cultural level is the poor man’s warfare,” said Rangappa, and Russia is adept at “knowing its enemy [and] knowing which cleavages to tweak, especially with the US.”

“This is not America’s first rodeo with disinformation coming from Russia. This was the KGB’s MO," Rangappa argued. Highlighting the analysis from Jankowicz's book, she noted, "methods have been practiced, refined, and incorporated into new technologies. Basically, as we stood by, they’ve been practicing."

Complicating defense against disinformation is the homegrown element. These tactics, Jankowicz cautioned, are not limited to foreign actors: "We cannot fight disinformation coming from abroad -- and now it’s coming not only from Russia, but China, Iran, and Venezuela -- if we are creating it and using it on our own people. That is the biggest warning for me as we head into this election cycle."

Role of Platforms

The effectiveness of disinformation is amplified when our own cultural vulnerabilities are compounded with the design of digital platforms. Rangappa noted, “I think Americans are very naive about the idea of information as a weapon…. We have been conditioned as Americans to think of speech and information as a net positive. [With] the marketplace of ideas, the way that you combat bad speech is with good speech. We haven’t fully understood how the marketplace of ideas doesn’t necessarily translate into the digital space.”

Jankowicz has recently been analyzing the impact of platform company policies on the flow of disinformation, examining how even features implemented for privacy and community can intensify disinformation. She remains doubtful of the leadership and incentives in this space, a point echoed by Rangappa: "The economic model that these media platforms are built on is not incentivized to encourage digital literacy. They want you to be addicted. They want you to get the most extreme content because that creates more clicks and it makes them more money… I think they're basically the equivalent of tobacco.”

Path to Resilience

The speakers agreed that the way forward is through understanding this hybrid, asymmetric warfare and building resilience. “What's changed about [fighting disinformation] today is the tools and tactics and the speed at which the info spreads. Part of this is not only building resilience, but we have to get the regulatory framework in place so that we can respond more effectively," Jankowicz urged. "We are discussing how to bring about positive, democratic-based social media regulation, and it's an area in which the United States is abdicating its leadership right now.”

“The way that you neutralize disinformation is through exposure," said Rangappa. “It’s about equipping the population.” Tackling disinformation is a pressing challenge, and achieving resilience may require a new, people-centered approach. “Tech platforms, governments, journalists -- none of them can fact check their way out of the crisis of truth and trust that we face,” said Jankowicz. “But if we educate our citizens, and we repair the cracks in our democracies that allowed troll farms to influence them in the first place, we might have a shot at averting disaster.”

Successful approaches address not only the human element but the holistic information ecosystem, as Jankowicz’s research shows. “All the countries that have a somewhat successful response [to disinformation]...address the people’s participation in this equation: they address education, they address journalism and the media as a public good, and they’re investing in long term generational solutions to help people navigate the information environment, the information ecosystem that is rapidly degrading, rather than playing whack-a-troll and trying to eliminate fake accounts and bad actors online.”

About the Author

Morgan Livingston

Research Assistant, Science and Technology Innovation Program
Science and Technology Innovation Program

The Science and Technology Innovation Program (STIP) brings foresight to the frontier. Our experts explore emerging technologies through vital conversations, making science policy accessible to everyone.