Lawsuit by Rohingya refugees alleges Facebook algorithms led to Myanmar genocide

By pointing the finger at Facebook’s algorithms, this complaint goes further than any other of its kind.

“Facebook was willing to trade the lives of the Rohingya people for better market penetration in a small country in Southeast Asia,” reads a class-action complaint filed on behalf of Rohingya refugees that holds Facebook responsible for the genocide of the Rohingya in Myanmar.

The complainants have sued Facebook for US$150 billion in California. In the class-action complaint, seen by MediaNama, they alleged that Facebook refused to stop hateful anti-Rohingya content on its platform, instead profiting from the increased user engagement and growth that such content generated.

By pointing the finger at Facebook’s algorithms, this complaint goes further than any other in ascribing responsibility to the platform for amplifying hateful content. If upheld, it could pave the way for similar action against the company over violent events across the world, including in India.

Part 1 – The Defective Design of the Facebook Algorithm

The complaint, filed by a Rohingya woman based in the United States on behalf of more than 10,000 refugees settled in the U.S., outlines in great detail how Facebook’s algorithms are geared towards amplifying hateful content:

  • Facebook runs on a logic of engagement: The complaint quoted a study published in Nature, which claimed that engagement is the core logic of Facebook’s news feed: “[T]he Feed’s … logics can be understood through a design decision to elevate and amplify ‘engaging’ content…. [T]he core logic of engagement remains baked into the design of the Feed at a deep level.” (A toy sketch of such an engagement-driven ranker follows this list.)
  • Hateful content is most engaging, and hence prioritised: Facebook is aware that hateful content is most engaging and hence prioritises it, the complaint alleged. To substantiate this point, the complaint quoted Tim Kendall, Facebook’s first director of monetisation:

    Tobacco companies then added ammonia to cigarettes to increase the speed with which nicotine traveled to the brain. Facebook’s ability to deliver this incendiary content to the right person, at the right time, in the exact right way—through their algorithms—that is their ammonia. And we now know it fosters tribalism and division. – Tim Kendall

  • Facebook promotes extremist groups: Facebook’s algorithms curate and promote content that attracts new members to extremist groups, the complaint said. It cites a presentation by a Facebook researcher, which found that 64% of all extremist group-joins in Germany were a result of Facebook’s own recommendations.
  • Manipulation by autocrats: Despite knowing that autocratic politicians used Facebook to generate fake engagement, the company failed to take action, the complaint highlighted, citing whistleblower Sophie Zhang, who has made several allegations about Facebook’s unwillingness to take down such fake networks.
  • Facebook incites users to violence: The complaint goes as far as to allege that Facebook ‘radicalizes users and incites them to violence.’ As an example, it cited the gunman in New Zealand who killed 51 people in two mosques while live-streaming the attack on Facebook. The shooter was found to have been active in an extremist Facebook group for two years prior to the attack.
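
The amplification mechanism described above can be pictured with a toy ranking function. The sketch below is purely illustrative: the Post fields, the weights, and the function names are our own hypothetical choices, not anything drawn from Facebook’s actual systems. It shows how a feed ranked solely on predicted engagement will, by construction, surface whatever content draws the strongest reactions, harmful or not.

    # A minimal, hypothetical sketch of engagement-based feed ranking.
    # Nothing here is Facebook's actual code: the fields, weights, and
    # names are invented to illustrate the mechanism the complaint describes.
    from dataclasses import dataclass

    @dataclass
    class Post:
        post_id: str
        predicted_reactions: float   # model's estimate of likes/shares/comments
        predicted_dwell_time: float  # model's estimate of seconds spent viewing

    def engagement_score(post: Post) -> float:
        # A feed built on a "logic of engagement" reduces each post to a
        # single engagement estimate; whether the content is hateful or
        # harmful never enters the score.
        return 0.7 * post.predicted_reactions + 0.3 * post.predicted_dwell_time

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Highest predicted engagement first: if inflammatory posts reliably
        # draw more reactions, this ordering amplifies them by construction.
        return sorted(posts, key=engagement_score, reverse=True)

In a design like this, any systematic correlation between outrage and engagement translates directly into distribution, which is the defect the complaint alleges.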

Part 2 – A crisis of digital literacy in Burma

The complaint placed Facebook’s role in the context of low digital literacy, where most residents of Burma had little understanding of how to evaluate online content:

  • Rapid growth in Burma: Facebook started operating in Burma in 2010 and, to drive adoption, allowed Burmese users to use the app without incurring any data charges. For a majority of Burma’s 20 million internet users, ‘Facebook is the internet,’ the complaint noted, quoting a BBC article.
  • No digital literacy: The transition to an internet-connected society in Burma caused ‘a crisis of digital literacy’ because most users did not understand how to ‘make judgements on online content’, the complaint highlighted, adding a quote from the New York Times on Facebook’s role in this context:

    “[A]s Facebook’s presence in Myanmar grew …, the company did not address what the BSR report calls ‘a crisis of digital literacy’ in a country that was just emerging from a military dictatorship and where the internet was still new.”

Part 3 – Fear, Hate, and the Myanmar Military

Building on the previous two points, the complaint alleged that Facebook spread hate against the Rohingya in Myanmar, furthering the Myanmar military’s agenda, which eventually led to genocide:

  • Violence-inciting content against Rohingyas: Over the five-year period from 2012 to 2017, content encouraging violence against Rohingyas flourished on Facebook, the complaint said. To establish this, the lawsuit cited multiple reports from the UN, the New York Times, and Reuters. A New York Times story mentioned in the complaint reported that the Myanmar military had operated fake accounts to post anti-Rohingya content.
  • Facebook’s role in the ‘clearance operations’: Some 725,000 Rohingya refugees were displaced as a result of the ‘clearance operations,’ the Myanmar military’s ethnic cleansing campaign. Facebook was used to further a hate campaign against the Rohingya, creating a ‘conducive environment’ for state-led violence in Myanmar, the complaint said, quoting a UN Human Rights Council report. It also mentioned that the military’s commander-in-chief used Facebook to justify the clearance operations in a post that drew almost 10,000 reactions before it was taken down.
  • Participation of radicalised civilians: After earlier arguing that Facebook radicalises users into violence, the complaint alleged that these radicalised users participated in the ethnic cleansing:

    “The radicalization of the Burmese population, to which Facebook materially contributed, did not merely ensure tolerance of and support for the military’s campaign of genocide against the Rohingya, it also allowed the military to recruit, equip, and train “civilian death squads” that would actively participate in the atrocities.” – Class-action complaint (emphasis ours)

  • Ignoring hate speech complaints: The complaint also alleged that Facebook dropped the ball on moderating hate speech in Burmese, citing various stakeholders who claimed the company was entirely unprepared to do so. A Reuters journalist recounted reporting on the issue:

    “When I sent it to Facebook, I put a warning on the email saying I just want you to know these are very disturbing things…. What was so remarkable was that [some of] this had been on Facebook for five years and it wasn’t until we notified them in August [of 2018] that it was removed.” – Steve Stecklow, Reuters journalist

  • Lack of content reviewers: The complainants also highlighted that when Facebook launched in Myanmar in 2010, it had only one Burmese-speaking content reviewer, and did not hire a second until 2015. Similar concerns about Facebook’s lack of investment in moderating hate speech in languages other than English were recently raised by whistleblower Frances Haugen.

In response to MediaNama’s queries, a Meta spokesperson highlighted the company’s efforts in Myanmar:

“We’re appalled by the crimes committed against the Rohingya people in Myanmar. We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw, disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content. This work is guided by feedback from experts, civil society organizations and independent reports, including the UN Fact-Finding Mission on Myanmar’s findings and the independent Human Rights Impact Assessment we commissioned and released in 2018.” – Meta Spokesperson

What does the complaint ask for?

  • Scope of the class: The class action complaint seeks to include “all Rohingya who left Burma (Myanmar) on or after June 1, 2012, and arrived in the United States under refugee status, or who sought asylum protection, and now reside in the United States.”
  • Declaration: The complaint has asked the court to declare that Facebook is strictly liable for defects in its algorithms and systems, and that it acted negligently in this instance.
  • Compensation: The complainant has demanded compensatory damages of at least US$150 billion for the class for “wrongful death, personal injury, pain and suffering, emotional distress, and loss of property,” as well as litigation costs incurred by the plaintiff.

Facebook’s content moderation failures in India

Facebook’s problems in Myanmar are indicative of a larger failure in ensuring safety on its platform in countries outside the U.S. In India, Facebook has repeatedly failed to curb hate speech and misinformation:

  • RSS, Bajrang Dal: Documents leaked in October this year showed that Facebook’s internal researchers had flagged anti-Muslim content posted by the Rashtriya Swayamsevak Sangh and the Bajrang Dal. The researchers specifically recommended the Bajrang Dal for takedown, but the organisation’s pages remain live on Facebook.
    • No reason to remove Bajrang Dal: In December 2020, Facebook was questioned by the Parliamentary Standing Committee on IT regarding the allegations. Ajit Mohan, head of Facebook India, told the panel that the company had no reason to act against or take down content from the Bajrang Dal.
  • Telangana: Inflammatory posts by Raja Singh, a BJP MLA from Telangana, were left on the platform despite being marked as hate speech, WSJ reported in August 2020. In his posts, Singh had said that Rohingya Muslim immigrants should be shot, called Muslims traitors, and threatened to raze mosques.
  • Assam: Facebook flagged accounts of BJP politicians posting inflammatory content ahead of the Assam elections, but did not take them down. The company also left up a hateful post by Shiladitya Dev, a BJP MLA from Assam, for nearly a year, TIME reported in August 2020. Dev had shared a news report about a girl allegedly being drugged and raped by a Muslim man, saying this was how Bangladeshi Muslims target the “native people.”

Update (1:57 pm, 9 December 2021): Added response from Meta spokesperson. 

