Facebook institutes one-strike policy for ‘Live’ violations; also extends more benefits to American contracted content moderators

A couple of quick policy updates from Facebook:

Facebook will block access to ‘Live’ at single violation

In the wake of the Christchurch attack being streamed on Facebook Live, the company will now block access to the Facebook Live tool for people who violate its serious policies relating to Live. Access will be restricted after a single violation for set periods of time, such as 30 days. For example, users who share a link to a militant group’s statement “with no context” will be immediately blocked from using Live for a fixed period. This change is rolling out in the next few weeks; violators of Live policies will also be restricted from running ads.

Outside of this, Facebook is awarding research grants totalling $7.5 million to three American universities to study and help detect manipulated media. According to Facebook, this was prompted by its inability to detect slight variants of the original Christchurch shooting video, which circulated on the platform for hours after the shooting. “People — not always intentionally — shared edited versions of the video, which made it hard for our systems to detect,” the company said in its blog post. The research aims to:

  • Detect manipulated media across images, video and audio, and
  • Distinguish between unwitting posters and adversaries who intentionally manipulate videos and photographs

Facebook to give more pay and benefits to contracted content moderators – in the US for now; what about India?

Facebook will increase the hourly pay rate for thousands of contract workers – including content moderators – across the United States. The base rate will increase from its minimum of $15 an hour to $18 an hour, with higher raises in American cities with higher costs of living; the hikes will be in place by mid-2020. The company said it will explore developing “similar standards” for other countries. The changes come after increased press scrutiny of the well-being and wages of contract workers, especially content moderators in the US. The Verge reported in February that some American contract moderators experienced post-traumatic stress disorder and came to believe conspiracy theories.

Some changes Facebook is bringing in for contracted content moderators:

  • Individual and group counselors onsite during all hours of work (not just during certain hours of each shift), along with “comprehensive health care benefits”.
  • Moderators can now choose to temporarily blur graphic images before reviewing them so they can control how they want to see disturbing content.
  • Facebook says it is working on tools developed based on feedback from psychologists and content reviewers.
  • The company is rolling out resiliency surveys, biannual audits, and compliance programs, which will include unannounced onsite checks and vendor partner self-reporting.
  • A whistleblower hotline through which any contracted worker, including content reviewers, can raise concerns directly with Facebook.

Facebook said it is working to make contracts across global vendors consistent, to include “quality-focused incentives, no sub-contracting, overtime and premiums for nightshifts and weekends, and healthcare that meets the standards of the Affordable Care Act in the US and appropriate healthcare standards internationally.”

Contracted content moderators in India paid Rs 100,000 a year; complain of trauma

Reuters reported in February that content moderators in India, contracted through Genpact, were paid $6 a day, or about Rs 100,000 annually. Seven content reviewers at Genpact said that their work was underpaid, stressful and sometimes traumatic. This, the report said, was in contrast with the picture Facebook executives had painted of a carefully selected, skilled workforce that is paid well and has the tools to handle a difficult job.

  • The Genpact unit in Hyderabad reviews posts in Indian languages, Arabic, English and some Afghan and Asian tribal dialects, according to Facebook.
  • Moderators review nudity and explicit pornography, while the counter-terrorism team watches videos including beheadings, car bombings and electric-shock torture sessions.
  • The “self-harm” unit regularly watches live videos of suicide attempts – and does not always succeed in alerting authorities in time.


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ