by Surabhi Nijhawan
Facebook has decided to get rid of its red flags – the flags under which users could highlight disputed articles – and instead will be rolling out ‘related articles’ as it continues its fight against fake news. The company said, ‘putting a strong image, like a red flag, next to an article may entrench deeply held beliefs – the opposite effect of what we intended’.
The company said that it is adding Related Articles to give more context to existing stories, and argues that its research has shown this to be a more effective way to provide factual information. ‘We’ve found that when we show Related Articles next to a false news story, it leads to fewer shares than when the Disputed Flag is shown,’ the blog post read.
Facebook launched the disputed flags feature over a year ago. The reasons it is being scrapped are as follows:
- Disputed flags did not help users understand what was wrong with a piece of information. The feature was not user-friendly and required extra clicks before people could see what the fact-checkers had said about a story.
- The function sometimes backfired. Simply marking something as ‘false’ did not necessarily change opinions about its accuracy.
- The disputed flags tool required two fact-checkers. A red flag was applied only when at least two fact-checking organisations had rated the information false, which was problematic in countries with very few fact-checkers.
- Disputed flags worked only for outright false ratings. Fact-checkers could rate a story ‘false’, ‘partly false’ and so on, but users still wanted more context on the information; a rating alone was not satisfactory.
In December 2016, Facebook was criticised for spreading misinformation on its platform during the 2016 US Presidential election. As a result, the social media giant launched several tools to fix the problem, including:
1. It created a feature allowing people to report stories that are untrue.
2. It collaborated with fact-checking organisations to identify content that might be false.
3. It reduced the spread of articles considered disputed by fact-checkers.
4. It built features to alert users when they were sharing material that could be factually incorrect.
Facebook users will now see badges on their News Feed marking articles reviewed by fact-checkers. Facebook argues that it is improving its methods to fight fake news, and that this new feature will help reduce spammers’ traffic by 80%.
Will related articles work?
Critics, on the other hand, believe that simply providing related articles is not enough. Tim Luckhurst, professor of Journalism at the University of Kent, told the BBC: ‘Simply offering people a menu of related articles is not enough. Facebook must identify itself as a publisher and not a platform, and be regulated for spreading misinformation.’
Additionally, it is not clear how effectively related articles would curb the spread of misinformation. They might simply add to the information clutter that gave rise to the fake news problem in the first place.
Previously, The Guardian had observed that fact-checking is a time-consuming process, and that a fact-check arriving too late may backfire. Now that Facebook plans to focus on related articles, reviewing articles will become even more difficult.
Fake news on WhatsApp
While the social media giant has addressed the existence of false news stories on Facebook, curbing the same on WhatsApp is more complicated. In India, WhatsApp is a big source of rumours and misinformation, and because of end-to-end encryption it is cumbersome for the company to spot fake news.
Meanwhile, in a new beta version that is being rolled out, WhatsApp is giving more power to group administrators, which might stem the spread of fake news and rumours. An admin can restrict all members (except the admins) from sending messages to the group. The group will then be in “announcement mode”, with only admins able to post. In a restricted group, a user will have to message an admin, who will then have the power to approve messages; that is, messages will be moderated.