Facebook has a solution for its pesky fake news problem, but it wants its 2 billion users to do most of the work. The social media giant unveiled another major change to the News Feed, announcing it will rank news organisations by trustworthiness based on user feedback, diminishing its own role in influencing what news people see.
The move comes after the company faced severe criticism over the last two years for allowing misinformation and propaganda to spread on its platform and for supposedly favouring liberal news platforms over conservative ones.
This comes two weeks after Facebook announced a major overhaul of users' News Feeds to emphasise posts, videos and photos shared by friends and family over those from publishers and brands. The fresh initiative will not change that, but it will have implications for what news is consumed on Facebook, potentially favouring established media publishers while hurting smaller independent news organisations. This was evidenced in the nearly 9% rise in the New York Times' stock price following the announcement.
In a post on Facebook, the company's CEO Mark Zuckerberg explained the rationale behind the decision: "The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with."
“There’s too much sensationalism, misinformation and polarization in the world today,” Zuckerberg added. “We decided that having the community determine which sources are broadly trusted would be most objective.”
Publishers wary of the move
Several publishers have commented on the move, concerned that crowdsourcing users’ opinions on trustworthiness might be open to manipulation.
“Consumer reviews of products like toasters work because we have direct experience using them. Consumer reviews of news sources don’t work because we can’t personally verify the facts from direct experience,” Alan Dennis, Antino Kim and Tricia Moravec, three independent researchers who have studied the fake news problem, wrote on BuzzFeed News. “Our opinions of news are driven by strong emotional attachments to underlying sociopolitical issues. Put simply, our research shows that we’ll trust anyone to be objective about their kitchen appliances, but when it comes to news, we want experts who can verify the facts.”
“It is absolutely a positive move to start to try to separate the wheat from the chaff in terms of reputation and use brands as proxies for trust,” Jason Kint, the chief executive of Digital Content Next, a trade group that represents entertainment and news organizations, told The New York Times. “But the devil’s in the details on how they’re going to actually execute on that.” He added, “How does that get hacked or gamed? How do we trust the ranking system? There’s a slew of questions at this point.”
Gizmodo’s Bryan Menegus was more scathing: “Surveys that, if they’re like prior surveys regarding publication trust, are almost certain to conclude that partisan coverage is trusted far more than brands like the Associated Press or Reuters that attempt objectivity. We look forward to the flood of unverified information and bloodbath of insolvent publications that are sure to follow this brilliant pivot.”
Facebook’s News Feed overhaul
Two weeks ago Facebook said it would prioritise content shared by friends and family while de-emphasising content from publishers and businesses. According to the social media giant, the move was designed to encourage people to interact more with what they actually see. The idea was that users are more likely to comment on and engage with a post shared by family members or friends than with one shared by a business or publisher.
“Recently we’ve gotten feedback from our community that public content — posts from businesses, brands and media — is crowding out the personal moments that lead us to connect more with each other,” company founder and CEO Mark Zuckerberg had explained in a public post.