Fact-Checking Can Reduce the Spread of Unreliable News. It Can Also Do the Opposite.
February 1, 2017 12:43 PM

Moderators of r/worldnews on reddit worked with me to test an idea: what are the effects of encouraging fact-checking on the response and spread of unreliable news? On average, messages encouraging fact-checking caused a 2x reduction in the reddit score of tabloid submissions, which likely influenced reddit's rankings.
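To see why a score change would plausibly move rankings, here is a minimal Python sketch of the "hot" ranking function from reddit's formerly open-sourced r2 codebase (the production algorithm at the time of the study may have differed); the constants `1134028003` and `45000` come from that code. It shows that a 2x reduction in score lowers a post's hot rank by log10(2) ≈ 0.30, the same penalty the formula applies to roughly 3.8 extra hours of age.

```python
from datetime import datetime, timezone
from math import log10

def hot(score, posted_at):
    """Hot rank as in reddit's open-sourced r2 code:
    log-scaled score plus an age term (newer posts rank higher)."""
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    # Seconds since the epoch offset used in reddit's implementation.
    seconds = posted_at.timestamp() - 1134028003
    return round(sign * order + seconds / 45000, 7)

posted = datetime(2017, 2, 1, tzinfo=timezone.utc)
# Halving a post's score drops its hot rank by log10(2) ~= 0.30,
# equivalent to the post being ~3.8 hours older (0.30 * 45000 s).
delta = hot(200, posted) - hot(100, posted)
```

Because the score enters only through a log10, the absolute size of the drop matters less than the ratio: any 2x reduction costs the same ~0.30 in rank, whether it is 200 → 100 or 2,000 → 1,000.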

Our questions about tabloid news added an algorithmic dimension to a classic governance question: how can people with power work toward the common good while minimizing constraints on individual liberty?

This study is the latest in my PhD work to create CivilServant, software that supports communities in testing their own efforts to address social problems online, independently of online platforms.
Role: Designer, Researcher
posted by honest knave

The headline claims "It can also do the opposite", but I had difficulty finding where in the body this was addressed. Can you point me at it?
posted by JauntyFedora at 2:29 PM on February 1

Hi JauntyFedora, if you scroll down, you'll see the chart of score over time, which shows the effect of our second intervention. When we encouraged people to fact-check *and* vote, the score of a post increased at a greater rate than it would have, had we done nothing at all.
posted by honest knave at 3:25 PM on February 2

Interesting work, Nate—especially in light of the times we are living in. I'm not sure what causes people to *want* to fact-check. Does seeing the admin notice change who visits a subreddit and participates in the comment section (whether by voting on the comments or adding their own)? Are these people already prone to this behavior, and do they reward others like them?

The effect of the 'downvote' call to action seems bizarre to me.

A case you might find relevant is /r/worldpolitics and its sidebar "rules." The subreddit has long marketed itself as a "free speech" bastion, without heavy moderation (as compared to /r/worldnews or /r/politics). It started off:

No spam

At some point in 2013 that was removed and the sidebar read:

I won't remove anything that isn't against the rules of reddit, so don't bother messaging me.

Mid-2014 the mods changed the sidebar to:

Explicitly allowed:
* Editorialized titles
* Feature stories
* Editorials, opinion, analysis
* Raw images, videos or audio clips
* Offensive content

In 2015, they added:

* Satire

And in late 2016 they added again:

* Fake News
* Climate change denial
* Holocaust denial

These aren't really rules, per se. The initial 'no spam' didn't bar any of these other things, just as anyone in /r/worldnews was free to fact-check from the get-go. But these messages do seem to prime certain types of participants and participation.

In the earliest days (up to 2012), /r/worldpolitics was decent. Now it's a mix of /r/conspiracy, /pol/, and the racist subreddits Reddit banned in 2015. It's a "free speech" sub (without constraints on individual liberty) in that the moderators don't do anything and you can post what you please, but you'll be downvoted to oblivion if you don't evidence the "right" ideology. So, basically, mob rule.

Do suggestions for conduct just change who participates or do they actually engender change in the participants themselves? Are we just creating new bubbles?

I think bubbles are OK, personally—underrated even!—but I wonder about changing people's minds and behaviors to be more collaborative, kind, community-oriented, etc. when the whole notion of "truth" seems increasingly relative.

Well, I'm eager to see more of what comes of your work and the Coral Project. Thanks for sharing!
posted by waninggibbon at 9:28 PM on February 2
