Content removal on Facebook – a case of privatised censorship: CIS India

by Jessamine Mathew

Any activity on Facebook, be it creating an account, posting a picture or status update, or creating a group or page, is bound by Facebook’s Terms of Service and Community Guidelines. These contain a list of content that is prohibited on Facebook, ranging from hate speech to pornography to violations of privacy.

Facebook removes content largely on the basis of requests, either from governments or from other users. Facebook’s Help section deals with warnings and the blocking of content; it says that Facebook only removes content that violates its Community Guidelines, not everything that has been reported.

I conducted an experiment primarily to look at Facebook’s process of content removal, and also to analyse what kind of content it actually removes.

  1. I put up a status containing personal information about a person on my Friend List (the information was false). I then asked several people (including the person about whom the status was made) to report the status for harassment or for violation of privacy rights. Seven people reported the status. Within half an hour of the reports being made, I received the following notification: “Someone reported your post for containing harassment and 1 other reason.” The notification also contained the option to delete my post and said that Facebook would look into whether it violated their Community Guidelines. A day later, all those who had reported the status received a notification stating: “We reviewed the post you reported for harassment and found it doesn’t violate our Community Standards.” I received a similar notification as well.
  2. I, along with around thirteen others, reported a Facebook page which contained pictures of my friend and a few other women with lewd captions in various regional languages. We reported the page for harassment and bullying and for humiliating someone we knew. The report was made on 24 March 2014. On 30 April 2014, I received a notification stating: “We reviewed the page you reported for harassment and found it doesn’t violate our Community Standards. Note: If you have an issue with something on the Page, make sure you report the content (e.g. a photo), not the entire Page. That way, your report will be more accurately reviewed.” I then reported each picture on the page for harassment and, on 5 May 2014, received a series of notifications stating: “We reviewed the photo you reported for harassment and found it doesn’t violate our Community Standards.”

These incidents stand in stark contrast with Facebook’s repeated removal of content that it finds objectionable. In 2013, a picture of a homosexual man protesting against the Supreme Court’s December judgment was taken down. In 2012, Facebook removed artwork by a French artist which featured a nude woman. In the same year, Facebook removed photographs of a child who was born with a defect and banned the mother from accessing Facebook completely. Facebook also removed a breast cancer survivor’s photograph of the tattoo she got following her mastectomy. Following this, however, Facebook issued an apology and stated that mastectomy photographs are not in violation of its Content Guidelines. Even in the sphere of political discourse and dissent, Facebook has cowered under government pressure and removed pages and content, as evidenced by the ban on the Facebook page of the progressive Pakistani band Laal and other anti-Taliban pages. Following much social media outrage, Facebook soon revoked this ban. These are just a few examples of how harmless content has been taken down by Facebook in a biased exercise of its powers.

After incidents of content removal are made public through news reports and complaints, Facebook often apologises for removing content and issues statements saying that the removal was an “error.” In some cases, it edits its policies to address specific kinds of content after a takedown (like the reversal of the breastfeeding ban).

On the other hand, Facebook is notorious for refusing to take down content that is actually objectionable, as my own experiences listed above partially show. There have been complaints about Facebook’s refusal to remove misogynistic content which glorifies rape and domestic violence through a series of violent images and jokes. One such page was finally removed, not because of its content but because the administrators had used fake profiles. When asked, a spokesperson said that censorship “was not the solution to bad online behaviour or offensive beliefs.” While this may be true, the question that needs answering is why Facebook draws these lines only when it comes to certain kinds of ‘objectionable’ content and not others.

All of these examples point to a certain arbitrariness in Facebook’s censorship policies. Facebook seems far more concerned with removing content that might cause public or governmental outrage, or that defies some internal morality code, than with protecting the rights of those who may be harmed by such content, which is what its Statement of Policies so clearly promises.

Many aspects of the review and takedown process remain hazy, such as who exactly reviews the content that is reported and what standards they are made to employ. In 2012, it was revealed that Facebook outsourced its content reviews to oDesk and provided the reviewers with a 17-page manual listing what kind of content was appropriate and what was not. A bare reading of the leaked document gives one a sense of Facebook’s aversion to sex and nudity, and of its neglect of other harm-inducing content, such as harassment through the misuse of posted content and what is categorised as hate speech.

In the process of monitoring the acceptability of content, Facebook takes upon itself the role of a private censor with no accountability or transparency in its working. A Reporting Guide was published to increase transparency in its content review procedures. The Guide reveals that, in “some cases,” Facebook provides an option to appeal a decision to remove content. However, the lack of clarity on what these cases are, or what the appeal process involves, frustrates the purpose of this provision and leaves it open to misuse. Additionally, Facebook reserves the right to remove content with or without notice, depending on the severity of the violation; there is no mention of how severe is severe enough to warrant removal without notice. In most of the cases above, the user was not notified that their content had been found offensive and was liable to be taken down.

Although Facebook publishes a transparency report, it only records takedowns made following government requests, not those prompted by reports from private users. The unbridled power that Facebook holds over our personal content, despite its clear statement that all content posted belongs to the user alone, threatens freedom of expression on the site. What is required is a proper implementation of the policies Facebook claims to employ, along with a systematic record of the procedure used to remove content, in consonance with natural justice.

This post was published on the Centre for Internet &amp; Society, India website.

The Centre for Internet and Society is a non-profit research organization that works on policy issues relating to freedom of expression, privacy, accessibility for persons with disabilities, access to knowledge and IPR reform, and openness (including open government, FOSS, open standards, etc.), and engages in academic research on digital natives and digital humanities.
