Facebook yesterday announced a slew of measures to address fake news and misinformation, hold group admins more responsible, and redesign the News Feed to curb the spread of misinformation. It's worth noting that the measures were announced immediately after Facebook faced the US Congress over its role in the spread of the livestream of the Christchurch mass murder in New Zealand, and a day before India went to the polls. Facebook, however, has made no mention of WhatsApp in these changes. Here are some of the changes Facebook has introduced:

Groups

  1. Accounting for group admin behaviour: Taking a more punitive approach toward admins, Facebook will now hold group admins more responsible for community standards violations. It will start considering admin and moderator content violations, and member posts that admins have approved, when deciding whether or not to remove a group.
  2. Group quality: A feature Facebook will introduce to give admins an overview of the actions it has taken against their group. It shows content that was removed or flagged for violations in the past, and includes a section on fake news, to give admins a clearer idea of Facebook’s community standards.
  3. Facebook will reduce the News Feed distribution of Groups that repeatedly share misinformation which has been rated false by fact-checkers.
  4. To give people more control over their content/posts, Facebook will now allow people to delete their posts and comments from a group after they’ve left it.

Fact-checking and misinformation

  • Dealing with misinformation at scale: Acknowledging that manual fact-checking is tedious and cannot scale, Facebook said it will consult academics, fact-checking experts, journalists, survey researchers and civil society organizations to figure out how to address the scale of the misinformation problem.
  • Facebook said it has been exploring since 2017 how to involve users in fact-checking; users could point to journalistic sources to corroborate or contradict claims made in potentially false content. Facebook said it also needs to find solutions that support original reporting, promote trusted information, and complement its existing fact-checking programs.
  • Fact-checking videos: The Associated Press began debunking fake news videos and Spanish-language content appearing on Facebook in the US yesterday.
  • Facebook will work this year on combating misinformation in areas where people are vulnerable, such as health and finance.
  • Borderline content: It’s also working on understanding content that does not violate its standards but is sensationalist, provocative, or problematic in some other way.

News Feed

  • The Click-Gap signal: Facebook is introducing a new News Feed signal called Click-Gap, which helps determine where to rank a given post. Click-Gap checks whether domains shared on Facebook have an equivalent reach on the broader web; it is meant to filter out domains designed to appeal specifically to Facebook’s News Feed algorithms. If the tool finds that a large number of links to a certain website are appearing on Facebook, but few sites on the broader web are linking to it, Facebook will use that signal to limit the website’s reach (see the illustrative sketch after this list).
  • Context on images: Facebook’s ‘Context Button’ provides people with more background information about the publishers and articles they see in posts. Last month, Facebook expanded this feature to images that have been reviewed by third-party fact-checkers, currently only for English and Spanish content. The button includes details of the publication’s fact-checking practices, ethics, ownership, funding, and so on.
  • Reducing content that is broadly disliked: Facebook said it will act against low-quality content, such as pages that have broken links, load slowly, or are otherwise difficult to use. To identify such issues, it will use signals such as whether the domain’s Facebook traffic is much higher than its traffic on the broader web.
  • Facebook said it is surveying some people on whether the posts they see “are worth their time” so it can understand who wants to see more or less of which content.
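Facebook has not published how Click-Gap is actually computed. Purely to illustrate the ratio-based idea described above, here is a minimal sketch in Python; the function names, threshold, and formula are assumptions for illustration, not Facebook’s implementation.

```python
# Hypothetical illustration of a "click gap" style signal: compare how often a
# domain is shared on Facebook with how often the rest of the web links to it.
# The names, thresholds, and formulas below are assumptions, not Facebook's.

def click_gap_score(facebook_shares: int, web_inlinks: int) -> float:
    """Ratio of Facebook popularity to broader-web popularity.

    A high score means the domain is shared heavily on Facebook but rarely
    linked to elsewhere, the pattern the Click-Gap signal is meant to catch.
    """
    # Add 1 to both counts so small or brand-new domains don't divide by zero.
    return (facebook_shares + 1) / (web_inlinks + 1)


def demotion_factor(score: float, threshold: float = 50.0) -> float:
    """Map a click-gap score to a News Feed distribution multiplier in (0, 1]."""
    if score <= threshold:
        return 1.0  # no demotion when the gap is small
    # Scale distribution down as the gap grows past the (made-up) threshold.
    return threshold / score


if __name__ == "__main__":
    # Example: a domain shared 100,000 times on Facebook but with only 40
    # inbound links from the broader web gets a strong demotion.
    score = click_gap_score(facebook_shares=100_000, web_inlinks=40)
    print(round(score, 1), round(demotion_factor(score), 4))
```

In this sketch, the demotion kicks in only when Facebook shares far outnumber links from the broader web, mirroring the behaviour the announcement describes; in practice, such a signal would presumably feed into the ranking model rather than act as a simple multiplier.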

Publishers

  • Facebook said it is continuing to act against pages that repurpose content from other sources. It will demonetize content from such publishers and also reduce the distribution of video content compiled from third-party creators and reposted.
  • Publishers that are repeat offenders will be penalized with quality demotions.