From the first hearing of the Delhi Assembly’s Committee for Peace and Harmony on Tuesday on the allegations of bias against Facebook, it became clear that there is a lot that the world does not know about the platform’s content moderation policies. A number of these issues have been highlighted over the years, but the Wall Street Journal’s August 14 article, which reported the platform’s refusal to take down hate speech posted by leaders of the Bharatiya Janata Party (BJP), threw them into sharp relief in India.

As a result, the Delhi Assembly’s Peace and Harmony Committee, constituted in March 2020 after the Delhi riots and chaired by Aam Aadmi Party MLA Raghav Chadha, summoned two witnesses for expert opinions on the subject: journalists Paranjoy Guha Thakurta and Nikhil Pahwa. Thakurta co-authored “The Real Face of Facebook in India” with journalist Cyril Sam. Pahwa is the founder and editor of MediaNama, a digital publication that focuses on technology policy.

Other MLAs on the Committee include Abdul Rehman, Ajay Kumar Mahawar, Atishi, B.S. Joon, Dilip Kumar Pandey, Jarnail Singh, Kuldeep Kumar and Saurabh Bharadwaj, some of whom were present during the hearing.

Disclosure: Nikhil Pahwa is the founder and editor of MediaNama. As a result, he was not involved in the process of reporting or editing this article. 

Note: Some of the quotes have been lightly edited for clarity and brevity. 

Facebook’s proximity to the ruling party

Delhi Police should probe Facebook’s role in the Delhi riots: Thakurta said it was a “no-brainer” that the Delhi Police needed to launch an investigation into any possible role Facebook might have played in the Northeast Delhi riots in February 2020. “Virtually each and every Hindu-Muslim conflagration, riot or conflict that has taken place in the country, whether it be Muzaffarnagar, Northeast Delhi or Bengaluru, there have been WhatsApp messages and Facebook posts behind them,” he said.

Pahwa pointed to the role of law enforcement, which needed to prosecute hate speech independently. He noted that politicians who had reportedly engaged in hate speech ahead of the riots were yet to be prosecuted by the Delhi Police. He wondered if the Delhi Police were a neutral entity that could properly investigate Facebook’s role in the riots.

Facebook employees met Chief Minister Modi as well: Citing his book, Thakurta said that Ankhi Das, Facebook’s head of public policy in India, and Shivnath Thukral, head of public policy for WhatsApp in India, had been actively meeting Prime Minister Narendra Modi when he was still chief minister of Gujarat. “Facebook played an active role in training supporters and workers of the BJP on how to use Facebook,” he said. Thakurta said that Thukral had worked closely with Hiren Joshi, who was an advisor to Modi in Gandhinagar when the latter was chief minister. Joshi was, according to Thakurta’s book, one of the men responsible for building Modi’s public image. He is currently serving as OSD (Communications and IT) at the Prime Minister’s Office.

Facebook is a business, not a ‘neutral’ platform: Thakurta said that Facebook could build ties with anyone who was interested, similar to those it has with BJP leaders. He said that those who could pay Facebook more would “naturally” get more help. “It is no secret that BJP is the richest political party in the country,” he said. Thakurta emphasised that Facebook was hence not a “neutral, agnostic platform”. He also quoted Vinit Goenka, former co-convenor of BJP’s IT Cell, who had reportedly told him, “We [Facebook and BJP] helped each other.”

Later in the deposition, replying to a similar question about the BJP’s ties with Facebook executives, Thakurta said Facebook was indeed close to the BJP, “but there is nothing illegal about it”.

Pahwa argued that Facebook’s and Das’s actions highlighted in the WSJ report only indicated their tendency to act as a business and preserve commercial interests. “Think of Facebook as a country, and Ankhi Das as its ambassador in India. It is Das’s job to recommend what is best for Facebook. It does not have to be good for the users or the industry. She is doing her job,” he said. Pahwa added that fear of the government suppresses free speech.

Call for transparent enforcement of Facebook’s community standards

Lack of transparency in how Facebook’s community standards are enforced: Pahwa noted that there was a lack of transparency and accountability when it came to how Facebook implemented its community standards. “Sometimes, posts are taken down, then they are put back after a period of time. In other cases, they remain down. It is clear that there is a lack of consistency in the implementation,” he said. He further noted that while news reports such as those in the Caravan indicated that posts unsympathetic to the ruling BJP government had been censored, this wasn’t enough evidence to conclude that the enforcement was selective.

Thakurta, meanwhile, said that only Facebook could answer whether the hate speech mentioned in the WSJ report was deliberately left up by the company. “Circumstantial evidence”, however, he said, showed that the posts stayed up despite multiple complaints, but were deleted once WSJ reporters sent queries to Facebook about them. “They (Facebook) may deny that they were complicit, but actions speak louder than words,” he said. He, too, called for more transparency from Facebook.

  • Algorithms should not take down content: Pahwa said that only human moderators should take decisions on taking down content. He said algorithms could only flag content, as they do not understand the context around it. He cited the example of a post by a Norwegian newspaper containing the “Napalm Girl” photograph taken by Associated Press photographer Nick Ut in 1972. The post was taken down by Facebook as the image was flagged as child pornography, but was reinstated after the company faced widespread criticism for censoring a historic photograph.

Unknown if the Facebook India team has final say in the content moderation process: Pahwa pointed out that violative posts are generally flagged by algorithms and then examined by human moderators, who ultimately decide whether flagged content stays up or is taken down. “What we don’t know is whether the local team has any interference in this process. The WSJ story indicates that the local team and Ankhi Das made a recommendation that content not be taken down. We don’t know who took the final decision. There is a lack of transparency on how Community Standards are implemented,” he said.

Facebook doesn’t act until it’s pulled up: Pahwa said the Indian government has had to repeatedly ask Facebook to put checks in place, such as limiting forwards, to control misinformation in India. “Historically, this company doesn’t act until it is pulled up,” he said. Pahwa added that Facebook responds mostly to the US media. “It did not act in Myanmar until a New York Times story led to the United Nations taking cognizance of it,” he added.

Not the first instance of Facebook aiding the powerful: Pahwa referred to a ProPublica article from 2017 to illustrate how community standards might be enforced. “According to the report, Facebook tends to favour and serve governments that block it. There has been wide criticism of Facebook globally regarding enforcement of community standards,” he said.

Conflicting views on where liability lies

‘Safe harbour enables free speech online’: Pahwa said that social media platforms are enablers of free speech. “We are able to say what we want because there is no pre-censorship. These protections are important for enabling speech on the internet. But there need to be reasonable restrictions that are applicable under law,” he said.

‘But technology cannot be an excuse to evade responsibility’: Thakurta criticised the claim by social media platforms, especially WhatsApp, that their technology does not allow them to disclose the identities of people who post objectionable content. Referring to end-to-end encryption on WhatsApp, he said WhatsApp does not [Editor’s note: cannot] disclose the identities of people who send any messages, even those that could be considered hateful content. He pointed out that WhatsApp has repeatedly told the Indian government that it cannot help law enforcement agencies trace the originators of pornographic content, hate speech or other objectionable material. “They say our technology doesn’t permit this. But, in the name of personal freedom and privacy, can you say anything? This is not good for society. I think there should be some limits to this,” he said, “But what kind of limits, what laws, I cannot say as I am not a lawyer or lawmaker.”

  • Thakurta, however, did not explicitly call for ending end-to-end encryption. In fact, he went on to make a seemingly contrary argument later in his deposition: he said that when Facebook CEO Mark Zuckerberg bought property in the US state of California, he also bought a large area around it for privacy. “But does he respect the privacy of others?” he asked.

‘But citizens need privacy from governments too’: Meanwhile, Pahwa stated that he was against the idea of ending end-to-end encryption, as it would violate the right to privacy. “We cannot allow Facebook or the government to view our messages on WhatsApp,” he said.

Recommendations to Committee

Pahwa argued for transparency around the implementation of community standards, while still maintaining safe harbour protections for platforms. “Safe harbour protections allow platforms to impose their own terms and conditions to protect their respective communities. This was great until platforms such as Facebook and YouTube became so dominant. Because of the power they have of controlling our speech, it is important for governments to come up with guardrails or some kind of norms for how they impose their community standards,” he said. Pahwa told the Committee that he could send his detailed recommendations to them in writing, a request that Chadha approved.

Edited by Aditi Agrawal