Economists have a method to reduce fake news on social networks

Controlling the spread of misinformation on social media platforms has sparked important conversations about censorship and free speech.

“A tacit assumption has been that censorship, fact-checking, and education are the only tools to fight misinformation,” says David McAdams, an economist at Duke University. In new research published in the journal Proceedings of the National Academy of Sciences, McAdams and his collaborators explore ways to improve the quality of information shared on networks without making any entity responsible for monitoring content and deciding what is true and false.

The model suggests that to reduce the spread of misinformation, a network can limit how widely messages are shared, and can do so in a way that isn’t overly restrictive for users.

“We show that caps on the number of times messages can be forwarded (network depth) or the number of other people to whom messages can be forwarded (network breadth) increase the relative number of true vs. false messages circulating in a network, regardless of whether the messages are accidentally or deliberately distorted,” says McAdams.
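The intuition can be sketched with a toy branching model (a simplification for illustration, not the paper’s actual model): suppose each forward of a true message has some fixed chance of distorting it into a false one, each copy is forwarded to a fixed number of people (breadth), and chains stop after a fixed number of hops (depth). Deeper and wider cascades put more weight on heavily forwarded, and therefore more often distorted, copies, so tightening either cap raises the expected share of circulating copies that are still true. The function name and parameter values below are illustrative assumptions, not taken from the paper.

```python
# Toy illustration (not the paper's model): expected share of true copies in a
# cascade where each forward distorts a true message with probability
# `distort_prob`, every copy is forwarded to `breadth_cap` people, and
# forwarding chains stop after `depth_cap` hops.

def true_fraction(depth_cap: int, breadth_cap: int, distort_prob: float = 0.1) -> float:
    keep = 1.0 - distort_prob  # chance a single forward stays accurate
    # total copies created at each hop: breadth_cap ** d
    copies = [breadth_cap ** d for d in range(1, depth_cap + 1)]
    # expected number of still-true copies at each hop: (breadth_cap * keep) ** d
    true_copies = [(breadth_cap * keep) ** d for d in range(1, depth_cap + 1)]
    return sum(true_copies) / sum(copies)

if __name__ == "__main__":
    for depth, breadth in [(6, 5), (3, 5), (6, 2), (3, 2)]:
        share = true_fraction(depth, breadth)
        print(f"depth cap {depth}, breadth cap {breadth}: "
              f"{share:.0%} of circulating copies remain true")
```

In this sketch, tightening either cap shifts weight toward early, less-distorted copies, which matches the qualitative effect the researchers describe; their analysis also covers deliberate distortion and the trade-off against suppressing true messages.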

“For example, Twitter could limit the scope of sharing on its site by limiting the number of people who see a given retweet in their Twitter feeds,” he says.

Facebook and WhatsApp, two messaging apps owned by parent company Meta, have used approaches similar to the researchers’ model to limit the spread of misinformation.

In 2020, Facebook announced limits on the number of people or groups users could forward messages to, capping it at five, in part to combat misinformation about COVID-19 and voting. Earlier that year, WhatsApp introduced similar limits, prohibiting its more than two billion users from forwarding messages to more than five people at a time, in part because of more than a dozen deaths that public officials in India linked to false information that was being spread on the app, the researchers noted.

This approach does not eliminate misinformation, but in the absence of other methods it can reduce the problem’s severity until solutions that address its root causes are developed, says McAdams.

“When misinformation spreads through a social network, it can cause harm,” says McAdams, who holds professorships in the economics department and the Fuqua School of Business. “Some people might start believing things that are wrong and that can harm them or others.”

Misinformation can also cause some people to lose faith in the platform, making them less likely to believe or act on accurate information that could actually help them or others, he says.

“If you limit sharing, you might also limit the spread of good information, so you might throw the baby out with the bathwater and that doesn’t really help you,” McAdams warns. “Our analysis explores how to strike that balance.”

(Matthew Jackson, an economist at Stanford University, and Suraj Malladi, an economist at Cornell University, co-authored the research with McAdams.)
