Facebook’s “Fake News” Solution

Written by Christopher Agius


There’s been an abundance of talk lately about Facebook and what, if anything, it will do about its “fake news” problem.

Yesterday, Mark Zuckerberg announced Facebook’s solution, and it’s something of a relief to see how measured it is. Essentially, Facebook will allow users to report posts that may contain misinformation in much the same way they can report spam. If enough people flag a story as a hoax, it will be sent to a group of fact checkers to determine whether that is indeed the case. If they conclude that it is, the story will be presented to users with a warning of its likely inaccuracy. Users will still be able to read and share it.
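To make the mechanics concrete, here is a minimal sketch of how such a report-threshold-then-review pipeline could work. All of the names, the threshold value, and the verdict states below are hypothetical illustrations of the announced flow, not Facebook’s actual implementation.

    from dataclasses import dataclass
    from enum import Enum

    class Verdict(Enum):
        UNREVIEWED = "unreviewed"
        DISPUTED = "disputed"    # fact checkers judged the story likely false
        CLEARED = "cleared"

    REPORT_THRESHOLD = 100  # hypothetical number of reports that triggers review

    @dataclass
    class Story:
        url: str
        reports: int = 0
        verdict: Verdict = Verdict.UNREVIEWED

    review_queue: list[Story] = []  # stories awaiting independent fact checkers

    def report_hoax(story: Story) -> None:
        # A user flags the story, much as they would report spam.
        story.reports += 1
        if story.reports >= REPORT_THRESHOLD and story.verdict is Verdict.UNREVIEWED:
            review_queue.append(story)

    def render(story: Story) -> str:
        # The story always stays readable and shareable; a disputed
        # verdict only attaches a warning, it never blocks the post.
        if story.verdict is Verdict.DISPUTED:
            return "[Disputed by independent fact checkers] " + story.url
        return story.url

Note that nothing in this sketch removes a story: the warning is purely informational, and the read-and-share decision stays with the user.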

From the beginning, Zuckerberg has rightly warned about the danger of Facebook becoming the arbiter of truth.

One of the most celebrated changes brought about by the World Wide Web is the democratization of information flow. One no longer needs to go through a ‘gatekeeper’ such as an established media company to put one’s thoughts and opinions in front of the masses. We now have personal websites, blogs, and social media.

This development, like most, has two sides: the good news is that everyone with a computer or mobile device has a platform from which to be heard; the bad news is that everyone with a computer or mobile device has a platform from which to be heard.

With traditional broadcast media, news is curated. This creates far less noise overall, but it also makes it much easier to control the narrative. I have no intention of smearing the journalism profession, as is fashionable nowadays; the “mainstream media” is a huge umbrella that covers countless individual journalists, and I find it difficult to believe that not a single one of them cares about a journalistic code of ethics, about bringing you the raw and gritty truth. However, news organizations are still just that: organizations, with structure, hierarchy, and ultimately a degree of editorial control that means information can still be filtered.

The Web model means that one can potentially be exposed to a much greater breadth of information. The problem is that to find the diamond in the rough, one must first wade waist-deep through excrement while being assaulted by swarms of armchair experts on a wide variety of topics. There is simply too much noise and nothing in the way of quality checks. Does the person you’re reading know how to present facts clearly? Do they care about the details? Do they have any idea how to do proper research? Are they aware of their own biases? Do they possess a modicum of critical thinking skill? While “fake news” is a problem, we should be at least as concerned about the proliferation of low-quality news.

With all of that said, I don’t think the solution is to revert to the broadcast model. No one person or organization holds a monopoly on truth. That is not to say, as I have sometimes heard argued, that we shouldn’t bother to discriminate between fact and fiction because information is too murky. That is absurd. The world as we know it would not continue to function if we did not attempt to find the truth and make judgments based on it.

There are also those who find it sad that Facebook should have to do anything at all, who insist we should simply rely on critical thinking to filter out the noise. In principle I agree, but I cannot help feeling that these people are imagining an ideal world rather than the one we actually live in. Firstly, even the most discerning critical thinker must admit that the sheer volume of noise is exhausting to filter actively. Secondly, let’s face it: the critical thinking skills of the general populace do not exactly inspire confidence.

This is precisely why Facebook’s solution makes sense. If a particular article is heavily disputed and judged questionable by a panel of independent fact checkers, people should absolutely be made aware of that. This is somewhat similar to the model Wikipedia already uses, and it works pretty well. Anything that gets people to pause before sharing is a welcome change. Crucially, though, the final decision on what to read and share is left up to the user, as it should be.

Now, Zuck, we just need to talk about those filter bubbles. But that’s a conversation for another day.
