Facebook must confront responsibilities of being a media company
SINCE last week’s election, Facebook’s role in policing fake news on its site has become a very hot topic.
And it should be. Throughout the election, Facebook’s behaviour has exposed what seems to be a great contradiction at its heart. As the social network has pushed hard to dominate new forms of media, it has also bent over backward to deny that it is a media company — and to deny the responsibility that comes with that label.
The truth is that Facebook has already taken on one of the functions of a media company: to act as a gatekeeper. It has labelled satire. It takes down “clickbait” articles which, in its own words, have headlines that “intentionally leave out crucial information, or mislead people”. Its algorithms clearly have some standards for content quality.
But Facebook won’t apply those standards to its fake news problem. In fact, Facebook chief executive Mark Zuckerberg took to his own profile to explain why and to reject the much-discussed idea that false news articles on the network could have affected the election. (Zuckerberg said the same thing at a conference last week. But the idea has persisted so strongly that he decided to address it again.)
I don’t know if we can lay credit or blame for this election’s outcome at social media’s doorstep. Finding that out would take a lot of research, an army of sociologists and access to a lot of Facebook data I don’t have.
But what is troubling about Zuckerberg’s post is his explanation for why Facebook isn’t tagging or penalising false news:
“Identifying the ‘truth’ is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.”
I agree it’s not easy for Facebook to tackle this problem. Worries of a politicised Facebook have dogged its steps before, and those accusations still haunt the network. In a statement on Monday, Facebook denied a Gizmodo article claiming the company had developed a fix for its fake news problem but shelved it, fearing backlash from the right.
It’s true that Facebook alone shouldn’t define what is true. But its prominence as a source of news gives it the responsibility to flag what is false.
In his post, Zuckerberg essentially falls back on an old excuse when the fake news issue comes up: that Facebook is just a technology firm and a platform . . . an aggregator not up to the task of policing its users.
That excuse started out thin, and is only getting thinner.
Sure, social media companies may want to deny that they are in the content business. But they are in it — and getting in deeper. Look at where these companies are investing: more photo-sharing, video, virtual reality. Those are all new media products that they say they need to survive.
Yet Zuckerberg is sticking to his tech company script, denying responsibility for what is posted on Facebook — even when it doesn’t make sense.