Yesterday Facebook quietly announced that it is updating the News Feed to show fewer hoax stories, a subject that has become a somewhat regular theme on this blog, partly due to Facebook's ever-growing capacity to propel false science news into millions of people's news feeds. Now, if you hide an item from your news feed, you will have the option to report it as a false news story:
If lots of people report the story as false, it will begin to appear less often in other people's news feeds, and when it does appear it will be flagged as potentially containing false information:
According to Facebook, early tests indicate that the update is unlikely to affect genuine satire, which I've previously defended against a similar Facebook update that attempted to label posts containing satire. If the update leaves a bad taste in your mouth, you are right to have concerns about censorship; there is certainly potential for abuse. There is, however, a strong case for the update: hoax news stories have a tendency to massively outperform the articles that debunk them. This is a continual source of frustration to me. Only yesterday I debunked the outlandish claim that transplant patients take on their donors' personalities; the Facebook shares for my article remain in double digits, while the article I'm debunking has ten thousand shares and continues to be shared three years after publication. Last week I debunked the false claim that half of all children will be autistic by 2025; my piece has been shared just over a thousand times, a drop in the ocean compared to the hundred thousand shares that each of the two articles it debunks has acquired.
I'm hoping Facebook's new update will allow me to finally stop being distracted by the continual stream of misleading viral science news stories (which, believe it or not, I really do find excruciatingly boring) and return my attention to what really gets me excited: genuinely interesting scientific research. Another good argument for the update is that debunking hoax news stories, as I do, can actually be counterproductive. A growing body of research shows that unless you are very careful, debunking a false claim with evidence can leave believers of that claim more likely to believe it than if you had done nothing at all. If there is a good way to halt the spread of misinformation at its source, that option may be far superior to the cat-and-mouse game of scrambling to debunk it after the event. I sincerely hope appropriate safeguards have been put in place so that views that are simply controversial don't get caught in the crossfire. It sounds like Facebook is indeed taking this responsibility seriously and acting to ensure that flagged posts aren't merely unpopular, by looking for indicators that give away bona fide hoaxes:
"People often share these hoaxes and later decide to delete their original posts after they realize they have been tricked. These types of posts also tend to receive lots of comments from friends letting people know this is a hoax, and comments containing links to hoax-busting websites. In fact, our testing found people are two times more likely to delete these types of posts after receiving such a comment from a friend."
Footnote: Recently some Facebook pages have been hit by a scam that fooled page owners into informing their followers that their pages were being shut down by Facebook after supposedly being reported for abuse. It later emerged that this threat was itself the result of a hoax, so watch out: this particular hoax is only going to become more virulent now that its narrative has a plausible basis in reality.
Image Credit: Shutterstock, Facebook