Facebook said it is rolling out a feature that provides more information on both publishers and articles that users see in their News Feeds. The feature, which saw limited testing last year, is now getting a wider rollout, though it is only available to users in the US at the moment.

The new update, first tested last year, gives users more context for news posts. By tapping an “i” icon next to a story’s headline, you’ll be shown the source’s Wikipedia article, related articles, and a map showing where the story has been shared before, and by whom. This means that if you come across a story which you think is inappropriate or straight-up false, you will be able to see which of your friends have already shared it.

This latest effort to combat fake news on the platform comes after previous measures have either failed or backfired. TechCrunch reports that Facebook’s partnerships with outside fact-checkers, which saw red Disputed flags added to debunked articles, ended up backfiring: those sympathetic to the false narrative apparently saw the red flag as a badge of honour, clicking and sharing anyway rather than letting someone else tell them they were wrong. The new feature never confronts users directly about whether an article, publisher or author is propagating fake news. Instead, Facebook seems to be trying to build a wall of evidence as to whether a source is reputable or not.

Alongside these new features, Facebook is kicking off another “small” test in the US: how much does a writer’s byline impact the article’s credibility? “People in this test will be able to tap an author’s name in Instant Articles to see additional information, including a description from the author’s Wikipedia entry, a button to follow their Page or Profile, and other recent articles they’ve published.”

Medianama’s take: While Facebook’s attempts at dealing with fake news are admirable, and it does seem to be doing more than other social platforms like Twitter and YouTube to tackle this menace, there are some flaws with this method. Firstly, Wikipedia entries are not sacrosanct: they can be vandalised, subtly edited, or simply present false information, and Wikipedia itself should never be treated as a citable source. It is also unclear whether Wikipedia has been made aware of this design change. When YouTube announced last month that it was planning to integrate Wikipedia articles to offer context to its videos, the Wikimedia Foundation responded saying it had not been consulted, and YouTube’s announcement drew criticism from commentators and Wikimedia volunteers. Wikimedia community member Phoebe Ayers had tweeted, “Does linking result in increased traffic? Increased vandalism? It’s not polite to treat Wikipedia like an endlessly renewable resource with infinite free labor; what’s the impact?”

Facebook’s News Feed overhaul

In January, Facebook had said that it would prioritise content shared by friends and family while de-emphasising content from publishers and businesses. According to the social media giant, the move was designed to encourage people to interact more with the things they actually see in their feeds. The idea was that users are more likely to comment on and engage with a post shared by family members or friends than with one shared by a business or publisher.

“Recently we’ve gotten feedback from our community that public content — posts from businesses, brands and media — is crowding out the personal moments that lead us to connect more with each other,” company founder and CEO Mark Zuckerberg had explained in a public post.

Zuckerberg conceded that the change would mean the time users spend on the platform will go down. He added, “but I also expect the time you do spend on Facebook will be more valuable. And if we do the right thing, I believe that will be good for our community and our business over the long term too.”