Regulations to check Facebook should focus on safer algorithms, not content removal

The need of the hour is for lawmakers to understand the systems that are amplifying harmful content.

Imagine a biscuit brand that shipped harmful biscuits to customers, who fell ill after consuming them. Regulators have two options: they could check biscuits frequently and ask the brand to recall harmful packets, or they could frame laws that mandate systemic protections in the company’s distribution process. Which one is more efficient?

Through the IT Rules 2021, the Indian government has created a regulatory infrastructure for content takedowns. Focusing on takedowns, however, is like checking for individual bad biscuits: it’s inefficient and fails to address structural flaws.

The Facebook Papers leaked by Frances Haugen, which I have reported on for the past month, make it clear that Facebook’s failures in content moderation are systemic, not isolated incidents. The need of the hour is for lawmakers to understand the systems that amplify harmful content instead of focusing on taking down individual posts.

Why regulators need to focus on harmful algorithms

The intuitive approach to harmful content: Our intuitive understanding of the ‘bad content’ problem on Facebook is that content reviewers are not doing a good enough job of taking such content down. A criticism often levelled against Facebook is that it doesn’t have nearly enough reviewers or, more specifically in India, that it is often unwilling to take down content posted by influential political figures.

Why that’s the wrong approach: While unbiased human oversight of content is crucial, Facebook has other tools at its disposal for reducing the spread of hateful content. Innumerable signals determine what content is distributed on Facebook and to what extent, and many of them come from machine-learning classifiers that score each post on questions such as ‘Is this potentially violative of our standards?’ and ‘Is this content likely to receive high engagement from user X?’

Keeping such signals in mind, Facebook sets up its algorithms to prioritize certain objectives when distributing content, like increasing user engagement or prioritizing meaningful social interactions. The real power of Facebook, which can often be misused, lies in its control over what its algorithms aim to achieve. Regulatory efforts need to zero in on this decision-making power of platforms.
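To make that concrete, here is a minimal sketch of objective-driven ranking. Everything in it (the classifier names, the weights, the scoring formula) is an illustrative assumption, not Facebook’s actual system:

# Hypothetical sketch of objective-driven content ranking.
# Classifier names, weights, and the formula are illustrative
# assumptions, not Facebook's actual system.
from dataclasses import dataclass

@dataclass
class PostSignals:
    predicted_engagement: float   # classifier: "will this user engage?" (0 to 1)
    violation_probability: float  # classifier: "potentially violative?" (0 to 1)

def ranking_score(signals: PostSignals, engagement_weight: float = 1.0) -> float:
    """Distribution score under an engagement-only objective.

    The harm classifier's output exists but plays no part in ranking,
    so engaging-but-borderline content rises to the top of the feed.
    """
    return engagement_weight * signals.predicted_engagement

provocative = PostSignals(predicted_engagement=0.9, violation_probability=0.7)
benign = PostSignals(predicted_engagement=0.4, violation_probability=0.05)
assert ranking_score(provocative) > ranking_score(benign)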

What can be done: The objective of reducing the spread of harmful content on Facebook is achieved more efficiently if algorithms detect potential harm and simply don’t amplify that piece of content. However, there is often a tradeoff between ensuring safety and increasing engagement, and Facebook has little incentive to prioritize safety.

That’s where regulators need to step in to make sure platforms prioritize safety in their algorithms, even at the cost of engagement. The policies that govern algorithms need oversight to ensure that they are not just aimed at making platforms money by increasing engagement, but also prioritize keeping users and societies safe.
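Extending the hypothetical sketch above (it reuses PostSignals and the two example posts), a safety objective of the kind regulators could mandate is just another term in the same scoring formula; the safety_weight value is again an illustrative assumption:

# Extends the PostSignals sketch above: add a safety penalty so the
# ranking objective itself suppresses likely-harmful content.
# The weights are illustrative assumptions.
def safer_ranking_score(signals: PostSignals,
                        engagement_weight: float = 1.0,
                        safety_weight: float = 2.0) -> float:
    """Down-rank likely-violative content instead of waiting to remove it."""
    return (engagement_weight * signals.predicted_engagement
            - safety_weight * signals.violation_probability)

# The benign post now outranks the provocative one: the harmful post
# is never amplified, even though it would have driven more engagement.
assert safer_ranking_score(benign) > safer_ranking_score(provocative)

The point is not the particular numbers but where the lever sits: the same classifier outputs produce a very different feed once the objective changes.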

Facebook whistleblower Sophie Zhang recently emphasized the same point in a Reddit AMA:

There has been considerable research done within Facebook about actions to change distribution to minimize this distribution [of harmful content], that it has been reported that FB resisted or refused as it would hurt activity in general. If FB wanted to avoid the ongoing genocide in Myanmar, my personal belief is that it could have done so by turning down virality in the country. – Sophie Zhang (emphasis ours)

Indian regulators focus on takedowns, missing the point

Current regulations in India don’t address the policies governing Facebook’s algorithms and focus instead on giving the government the power to dictate content takedowns. Here are the clauses of the IT Rules 2021 on content takedowns:

  • Disabling content within 36 hours of government order: All intermediaries have to remove or disable access to information within 36 hours of receiving a court order, or an order from an appropriate government agency, under Section 79 of the IT Act.
  • Voluntary takedowns: All intermediaries will have to take down content that violates any law; is defamatory, obscene, pornographic, paedophilic, invasive of privacy, or insulting or harassing on the basis of gender; relates to money laundering or gambling; or is “otherwise inconsistent with or contrary to the laws of India”.
  • Disabling content within 24 hours of user complaint: All intermediaries will have to take down content that exposes a person’s private parts (partial or full nudity), shows any sexual act, impersonates a person, or includes morphed images, within 24 hours of an individual (user or victim) reporting it.

Do the rules mention algorithms? The closest the rules come to addressing algorithms is in asking platforms to develop automated tools to identify content depicting rape or child sexual abuse, and content that is “exactly identical” to previously removed content.
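For a sense of how narrow an “exactly identical” requirement is, here is a minimal sketch of exact-duplicate matching via file hashing; it illustrates the general technique, not any platform’s actual tooling:

# Minimal sketch of "exactly identical" content matching using a
# cryptographic hash. Illustrates the general technique, not any
# platform's actual tooling.
import hashlib

removed_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    """SHA-256 digest of the raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def record_removal(data: bytes) -> None:
    removed_hashes.add(fingerprint(data))

def is_previously_removed(data: bytes) -> bool:
    return fingerprint(data) in removed_hashes

record_removal(b"bytes of a removed image")
assert is_previously_removed(b"bytes of a removed image")
# A re-encoded or cropped copy has different bytes, so it slips through:
assert not is_previously_removed(b"same image, re-encoded")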

Such requirements, however, don’t do nearly enough to regulate the algorithms through which platforms distribute content. If we want social media to be safer overall, Indian regulators will need to ensure that these algorithms are geared towards that goal.

Written By

Figuring out subscriptions and growth at MediaNama. Email: nishant@medianama.com
