The Democratic primary is in full swing, and with memories of the 2016 campaign still fresh, the United States must remain vigilant against disinformation, misinformation and election interference to prevent malicious domestic and foreign actors from hijacking our electoral process. But how do we address these problems without taking steps that play right into the hands of the trolls? To start, we must understand the problem from different angles, with the help of experts in a variety of disciplines.

To that end, the R Street Institute and Distinguished Senior Fellow Mike Godwin convened a panel on the Hill featuring academics from across the country to explore these issues and share relevant insights with Congress. Held over lunch in a Senate office building just steps from the Capitol, the panel brought together scholars from top universities to discuss disinformation heading into the 2020 election.

In his introductory remarks, Godwin underscored that each of the speakers would address both the scope of the problem of disinformation in 2020 and the range of possible solutions. Among other questions, the panel would take up “how good we are at detecting fake news—it turns out we may not be totally great at it.” In addition, Godwin said, the panel would discuss the larger public policy question of whether advertising, which in recent decades has been seen as deserving at least some First Amendment protection, should remain protected speech to the extent that U.S. legal culture currently allows. In previewing Professor Kathleen Carley’s work on disinformation at Carnegie Mellon, Godwin noted that the sources of disinformation in 2020 and beyond may include both “domestic and foreign actors,” and should be distinguished from mere misinformation, which may be attributable to simple human error. Disinformation, in contrast, is “intentionally using aspects of the information ecosystem to mislead people,” he said, adding that “disinformation may actually include true statements.”

After introducing each of the panelists and their topics, Godwin yielded the mic to Zeve Sanderson, executive director at the Center for Social Media and Politics, who began by raising a fundamental question: How well do we understand the nature of fake news? As Sanderson explained, many assume that because a significant amount of disinformation exists online, it must affect society writ large. Yet this assumption ignores the heterogeneity of the online world: communities differ, and individuals do not all react the same way, leaving certain groups more vulnerable than others to low-quality information online.

While it is undoubtedly true that disinformation efforts inject a large aggregate amount of disinformation into the media ecosystem, social science research has called into question whether this has led to widespread changes in political beliefs or political outcomes. That is why Mr. Sanderson calls for researcher access to the data that online platforms collect, much of which is controlled by the private platforms where it is generated. The results may indicate that the challenges we face are severe, or there may be little cause for concern. But without the relevant information, we are left crafting solutions to the wrong problem. To be sure, researcher access to social media data must be handled with caution—Cambridge Analytica, after all, worked through a researcher. But done correctly, the insights Mr. Sanderson and others could glean from the right data could be instrumental in preventing an election crisis.

A second issue on many people’s minds is online advertising, which was the topic for Ramsi Woodcock, assistant professor of law at the University of Kentucky. Discussing his recent article in the Yale Law Journal, Prof. Woodcock argued that while advertisements have traditionally provided informative value to consumers, that value has waned as the internet has made it easier for consumers to find relevant information on their own. The only remaining function of most advertising, he argued, is to manipulate: to lead people to make decisions or form beliefs they otherwise wouldn’t. This may be appropriate in political advertising, because political discourse is at least in part about moving voters, but it is particularly troubling in commercial advertising, because it undermines the consumer sovereignty that ensures markets produce what consumers want. Prof. Woodcock called on the Federal Trade Commission to look to previous enforcement efforts that identified manipulative advertisements with no informative value and to crack down on commercial advertising today. Because most of the tech giants’ advertising business comes from commercial advertisers, such a crackdown would force them to reorient their business models away from advertising, rendering the problem of online political advertising moot.

Prof. Woodcock’s theory of anticompetitive advertising is provocative. Although he argues that First Amendment protections do not apply to commercial speech that undermines consumer sovereignty, one wonders whether the Supreme Court would agree. But at a time when 72 percent of the public is concerned about microtargeted campaign ads, provocative thinking is necessary to stimulate conversations that lead to solutions.

Turning to a different facet of the mechanisms of persuasion, Professor Kathleen Carley of Carnegie Mellon University explained that the main goal of malicious actors is often not to spread outright falsehoods for the sake of “tricking” the American public, but to cause division and defeatism among relevant communities. As research from CMU’s IDeaS group indicates, these actors do so by identifying issues with opposing pro and anti camps, propping up more extreme individuals within the communities on each side, building interaction within the two communities, and then spreading messages that create excitement in one group and despair in the other. Groups targeted in this way often shift from making decisions rationally to making them emotionally. In short, these efforts seek to exploit existing divisions by making everything us versus them.

But that doesn’t mean there’s nothing we can do. To address these issues, Prof. Carley explained, it is critical that we continue to call out the fake media we encounter online. Research has shown that identifying and drawing attention to disinformation and misinformation can have a positive effect on the online ecosystem. In this way (and somewhat counterintuitively), satirizing deliberate attempts at misinformation may be a powerful tool for limiting its spread.

Alongside the individual solutions proposed by the panelists, a unifying theme emerged: We need to promote digital literacy. Elderly users, who are seven times more likely than younger users to share disinformation, are especially vulnerable to low-quality news online, which suggests, as Mr. Sanderson explained, that age can often serve as a proxy for digital literacy. Helping these communities better understand the internet and social media can help prevent direct attacks against them, make them more resistant to manipulative advertising and allow them to push back against the flood of disinformation distributed online.

Digital literacy, of course, is not just for social media consumers. Policymakers, including members of Congress and their staffs, must understand the complexities of the online sphere, the mechanisms that make malicious disinformation effective, and the science behind strategies for stopping it. Convenings such as our panel serve as a launchpad for discussions about the extent of the challenges we face and the possible responses.

As we move toward the 2020 presidential election, we must remain calm but vigilant. Overreacting and pushing ourselves further to the emotional extremes of issues will only worsen these problems, but ignoring them will allow for continued chicanery. November is fast approaching, and we need to be ready.

Image credit: Kaspars Grinvalds
