Facebook settles class action suit by content moderators for $52 million

[Image: The Beheading of Saint John the Baptist, painting by Caravaggio]

In a significant acknowledgement of the emotional toll that content moderation takes on moderators, Facebook has agreed to pay its current and former moderators $52 million as compensation for job-related mental health issues, the Verge reported on May 12. In addition to the compensation, Facebook will also provide them counselling services at work, as per a preliminary settlement filed in San Mateo Superior Court on May 8.

How much is the compensation? Each moderator will be paid at least $1,000 (~₹75,500) that can be spent however they like. They could be eligible for an additional $1,500 (~₹1,13,270) if they are diagnosed with a mental health condition. If moderators receive more than one diagnosis, such as PTSD (post-traumatic stress disorder) and depression, they could be eligible for up to $6,000 (~₹4,50,000). Moderators with a qualifying diagnosis could also submit evidence of other injuries suffered due to their time at Facebook and receive up to $50,000 (~₹37,76,000) in damages. However, the exact payout could shrink significantly if a majority of the class is found to be eligible for benefits, the Verge pointed out.

Who is covered? 11,250 moderators in California, Arizona, Texas and Florida from 2015 until now. According to the Verge, as many as half of them could be eligible for extra compensation related to mental health issues, including depression and addiction, that are associated with their work for Facebook. A Facebook spokesperson told MediaNama that this settlement currently covers only content moderators in the US. This is the statement Facebook emailed us:

“We are grateful to the people who do this important work to make Facebook a safe environment for everyone. This settlement relates to the U.S. only however, we are committed to providing support for everyone who reviews content for Facebook, as we recognize that reviewing certain types of content can sometimes be difficult. We require all of our partners globally to provide access to extensive support from licensed mental health counselors to ensure their wellbeing, including 24/7 on-site support with trained clinicians, an on-call service, and access to private healthcare from the first day of employment. We are also globally employing new technology to limit their exposure to graphic material as much as possible.” — Facebook spokesperson (emphasis ours)

When will the moderators be paid? The Verge reported that members of this suit will have time to comment on the proposed settlement and request changes. A judge will then have to approve it, which is expected to happen by the end of the year.

What other changes will Facebook have to implement? According to the Verge,

  • Facebook will make changes to its content moderation tools — including muting audio by default and changing videos to black and white — to reduce the impact of viewing harmful images and videos. These changes will be rolled out to 80% of the moderators by the end of 2020, and 100% of moderators by 2021.
  • Mental health support: Weekly, one-on-one coaching sessions with a licensed mental health professional for moderators who view graphic/disturbing content daily. Moderators who experience a mental health crisis will get access to a licensed counselor within 24 hours. The company will also make monthly group therapy sessions available to moderators.

Does the settlement impose obligations on Facebook’s vendors such as Cognizant, Genpact, Accenture, etc.? As per the Verge, Facebook will need to ensure that its vendors:

  • Screen applicants for “emotional resilience” as part of the recruitment process
  • Post information about psychological support at each moderator’s workstation
  • Inform moderators how they can report violations of Facebook’s workplace standards by the vendors

How did this begin? In September 2018, former Facebook moderator Selena Scola sued Facebook, alleging that she had developed PTSD after nine months on the job, as her role required her to regularly view photos and videos of rape, murder, beheadings and suicide. Other former Facebook content moderators later joined the case and turned it into a class action suit, which alleged that Facebook had failed to provide them with a safe workplace.

In March 2019, the Verge had reported on the dire conditions in which content moderators employed by Cognizant in Phoenix and Tampa worked, and the lack of mental health support provided to them. A few months later, in October 2019, Cognizant shut down its content moderation business altogether.

Cognizant is not alone. In February 2019, Reuters had reported that Genpact employs 1,600 people as content moderators in Hyderabad, and many had complained of the stressful and traumatic nature of their work, and of being underpaid. As a result of increased attention on content moderation, Facebook increased the pay of content moderators in the US in May 2019, and Genpact doubled the minimum salaries for them in India in August 2019.

Mark Zuckerberg, in February 2020, revealed that Facebook was considering an external audit of its content moderation systems. The status of that audit remains unknown.

Also read: Reliance on automated content takedowns needs to be reconsidered: MediaNama’s take

Update (May 14, 2020 5:27 pm): Updated with response from Facebook. Originally published on May 14 at 4:50 pm.


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2018 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
