“Facebook’s decision-making on content policy is routinely influenced by political considerations,” an internal Facebook research document seen by MediaNama claims. To address this, the document states, employees at Facebook feel the need for a firewall between content policy and other parts of the company.
Facebook has routinely hesitated to crack down on hate speech from members of India’s ruling party, fearing political backlash. A firewall around the content policy team might be the only way to ensure that such interference does not occur, and Facebook’s failure to enforce such a separation raises questions about the need for external intervention.
The problems with Facebook’s process for content policy decisions
The internal document, titled ‘Political Influences on Content Policy’, highlights various problems with Facebook’s processes, including the involvement of Facebook’s senior executives and public policy teams in content policy decisions:
- Content policy decisions: The standard protocol for enforcing content policy includes consulting Public Policy on significant changes, and its inputs regularly protect the powerful, the document highlighted. Facebook’s XCheck programme is also infamous for applying different content moderation standards to public personalities.
- Algorithmic changes: Significant changes in Facebook’s algorithms for ranking and recommendations are reviewed by the Public Policy team, and “they commonly veto launches which have significant negative impacts on politically sensitive actors,” the document said.
- Involvement of senior executives: In several instances, decisions regarding whether a prominent post violates a written policy are made by senior executives including Mark Zuckerberg, the document flagged.
The document was included in disclosures made to the US Securities and Exchange Commission (SEC) and provided to Congress in redacted form by Frances Haugen’s legal counsel. The redacted versions received by Congress were reviewed by a consortium of news organisations including MediaNama.
How can Facebook resist political influence on content decisions?
To resist political influence on content decisions, the author of the document suggested that “we can and should set up a firewall between content-policy and other parts of the company,” calling for an end to interference from public policy teams.
The document even collates comments from past and current Facebook employees who demanded such a separation, including an ex-Chief Security Officer:
“A core problem at Facebook is that one policy org is responsible for both the rules of the platform and keeping governments happy… It is very hard to make product decisions based upon abstract principles when you are also measured on your ability to keep innately political actors from regulating/investigating/prosecuting the company.” – ex-Chief Security Officer, Facebook (emphasis ours)
A comment from a Facebook employee highlights that Twitter organisationally separates Public Policy from Content Policy functions, making the point that “this is possible, we have simply chosen to not do it this way.”
Call for regulation: In testimony to the British Parliament, Facebook whistleblower Sophie Zhang went as far as to recommend regulations mandating such a separation for large companies, arguing that tasking the same team with enforcing content moderation policies and appeasing governments “creates a natural conflict of interest,” TIME reported.
In response to queries sent by MediaNama, a Meta spokesperson said:
When updating our policies or when looking for input in a few critical decisions, Content Policy relies on input from many teams/functions across the company (including Operations, Engineering, Legal, Human Rights, Civil Rights, Safety, Comms and Public Policy). In these instances, Public Policy is just one of many groups consulted. And, while the perspective of Public Policy is key to understanding local context, no single team’s opinion has more influence than the other. – Meta Spokesperson
Facebook’s failure to enforce content policies for India’s ruling party
Reports over the past two years have established a pattern of Facebook’s unwillingness to remove inflammatory content by members of the ruling party in India:
- Delhi Elections: Facebook whistleblower Sophie Zhang revealed that Facebook ignored her request to take down a network of fake accounts linked to a sitting BJP MP ahead of the Delhi Elections 2020.
- RSS, Bajrang Dal: In October this year, leaked documents showed that Facebook’s internal researchers had flagged anti-Muslim content by the Rashtriya Swayamsevak Sangh and the Bajrang Dal. The researchers specifically listed the Bajrang Dal for takedown, but the organisation’s pages remain live on Facebook.
- Telangana: Inflammatory posts by Raja Singh, a BJP MLA from Telangana, were left on the platform despite being marked as hate speech, the WSJ reported in August 2020. In his posts, Singh had said that Rohingya Muslim immigrants should be shot, called Muslims traitors, and threatened to raze mosques.
- Assam: Facebook flagged accounts of BJP politicians posting inflammatory content ahead of the Assam elections, but did not take them down. It also did not remove a hateful post by Shiladitya Dev, a BJP MLA from Assam, for nearly a year, TIME reported in August 2020. Dev had shared a news report about a girl allegedly being drugged and raped by a Muslim man, saying this was how Bangladeshi Muslims target the “native people.”
- No reason to remove Bajrang Dal: In December 2020, Facebook was questioned by the Parliamentary Standing Committee on IT regarding the allegations. Ajit Mohan, head of Facebook India, told the panel that the company had no reason to act against or take down content from the Bajrang Dal.
Update (16 November, 05:45 pm): Responses from Meta spokesperson were added.