
Facebook settles class action suit by content moderators for $52 million

The Beheading of Saint John the Baptist, painting by Caravaggio

In a significant recognition of the emotional toll content moderation takes on moderators, Facebook has agreed to pay its current and former moderators $52 million as compensation for job-related mental health issues, The Verge reported on May 12. In addition to the compensation, Facebook will also provide counselling services to them at work, as per a preliminary settlement filed in San Mateo Superior Court on May 8.

How much is the compensation? Each moderator will be paid at least $1,000 (~₹75,500) that can be spent however they like. They could be eligible for an additional $1,500 (~₹1,13,270) if they are diagnosed with a mental health condition. If moderators receive more than one diagnosis, such as PTSD (post-traumatic stress disorder) and depression, they could be eligible for up to $6,000 (~₹4,50,000). Moderators with a qualifying diagnosis could also submit evidence of other injuries suffered due to their time at Facebook and receive up to $50,000 (~₹37,76,000) in damages. However, the exact payout could shrink significantly if a majority of the class is found to be eligible for benefits, The Verge pointed out.

Who is covered? 11,250 moderators in California, Arizona, Texas and Florida who have worked for Facebook from 2015 until now. According to The Verge, as many as half of them could be eligible for extra compensation related to mental health issues, including depression and addiction, associated with their work for Facebook. A Facebook spokesperson told MediaNama that this settlement currently covers only content moderators in the US. This is the statement Facebook emailed us:

“We are grateful to the people who do this important work to make Facebook a safe environment for everyone. This settlement relates to the U.S. only however, we are committed to providing support for everyone who reviews content for Facebook, as we recognize that reviewing certain types of content can sometimes be difficult. We require all of our partners globally to provide access to extensive support from licensed mental health counselors to ensure their wellbeing, including 24/7 on-site support with trained clinicians, an on-call service, and access to private healthcare from the first day of employment. We are also globally employing new technology to limit their exposure to graphic material as much as possible.” — Facebook spokesperson (emphasis ours)

When will the moderators be paid? The Verge reported that members in this suit will have time to comment on the proposed settlement and request changes. A judge will then have to approve it, which is expected to happen by the end of the year.

What other changes will Facebook have to implement? According to The Verge:

  • Facebook will make changes to its content moderation tools — including muting audio by default and changing videos to black and white — to reduce the impact of viewing harmful images and videos. These changes will be rolled out to 80% of the moderators by the end of 2020, and 100% of moderators by 2021.
  • Mental health support: Weekly, one-on-one coaching sessions with licensed mental health professionals for moderators who view graphic/disturbing content daily. Moderators who experience a mental health crisis will get access to a licensed counselor within 24 hours. The company will also make monthly group therapy sessions available to moderators.

Does the settlement impose obligations on Facebook’s vendors such as Cognizant, Genpact, and Accenture? As per The Verge, Facebook will need to ensure that its vendors:

  • Screen applicants for “emotional resilience” as part of the recruitment process
  • Post information about psychological support at each moderator’s workstation
  • Inform moderators how they can report violations of Facebook’s workplace standards by the vendors

How did this begin? In September 2018, former Facebook moderator Selena Scola sued Facebook, alleging that she had developed PTSD after nine months on the job, as her role required her to regularly view photos and videos of rape, murder, beheadings and suicide. Other former Facebook content moderators later joined the case, turning it into a class action suit alleging that Facebook had failed to provide them with a safe workplace.

In March 2019, The Verge had reported on the dire conditions in which content moderators employed by Cognizant in Phoenix and Tampa worked, and the lack of mental health support provided to them. A few months later, in October 2019, Cognizant shut down its content moderation business altogether.

Cognizant is not alone. In February 2019, Reuters had reported that Genpact employs 1,600 people as content moderators in Hyderabad, and many had complained of the stressful and traumatic nature of their work, and of being underpaid. As a result of increased attention on content moderation, Facebook increased the pay of content moderators in the US in May 2019, and Genpact doubled the minimum salaries for them in India in August 2019.

Mark Zuckerberg revealed in February 2020 that Facebook was considering an external audit of its content moderation systems. The status of that audit remains unknown.

Also read: Reliance on automated content takedowns needs to be reconsidered: MediaNama’s take

Update (May 14, 2020 5:27 pm): Updated with response from Facebook. Originally published on May 14 at 4:50 pm.

Written By

Send me tips at aditi@medianama.com. Email for Signal/WhatsApp.


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
