High-profile users on Facebook are exempt from some or all of the social media giant’s rules, according to internal company documents accessed by the Wall Street Journal. A company programme known as ‘cross-check’ or ‘Xcheck’ reportedly exempts millions of VIP users from enforcement of Facebook’s Community Standards. When questioned about whitelisting practices, the company also reportedly misled its Oversight Board, telling the board that Xcheck affected only ‘a small number of decisions’.
Xcheck was initially intended as a quality control measure for enforcement action taken against high-profile accounts. The programme, however, ended up granting public figures immunity from such action, including in cases where their posts amounted to harassment or incited violence, according to a 2019 internal review of the company’s whitelisting practices accessed by WSJ.
In the past, Facebook has routinely claimed that all users on its platforms are held to the same standards. But the WSJ report shows that when it comes to enforcing Facebook’s guidelines, the company maintains an explicit two-tier system separating ordinary users from VIPs. The company’s decision to mislead its own Oversight Board also raises questions about whether the Board can keep Facebook in check.
Xcheck users got away with harassment, inflammatory claims: Facebook Internal Review
Facebook’s confidential 2019 internal review of the company’s whitelisting practices recounted multiple instances where public figures were treated differently when found in violation of the company’s guidelines. In 2020, Xcheck allowed posts that violated its rules to be viewed at least 16.4 billion times, WSJ reported.
Sexual Harassment: In some instances, posts from whitelisted users that amounted to harassment were left unchecked. The review mentions the case of footballer Neymar Jr. who, in an attempt to defend himself against a rape accusation in 2019, posted a video that contained his correspondence with his accuser, including naked pictures of the accuser, on both Facebook and Instagram. Because of Xcheck, these posts, which clearly violated the accuser’s consent, were left on the platforms for an entire day before being taken down, by which time 58 million people had already seen them.
Facebook’s policy on non-consensual intimate imagery is straightforward: such imagery should be deleted, and users who post it should be immediately banned. Neymar Jr., however, was protected by Xcheck. According to the internal review, the video had serious harmful consequences for the accuser, including “the video being reposted more than 6,000 times, bullying and harassment about her character.”
Incitement to Violence: “When the looting starts, the shooting starts,” Donald Trump posted on Facebook in May 2020. An automated system designed to detect whether a post violated company policy had scored Trump’s post 90 out of 100, indicating a high likelihood of a violation. Before he was banned from Facebook in June this year, Trump was an Xcheck user, and Zuckerberg personally made the call that the post remain on the platform, as he later admitted publicly.
How many users are part of Xcheck and under what criteria?
At least 5.8 million users were part of Xcheck in 2020, documents accessed by WSJ revealed. The 2019 internal review found that differential treatment under Xcheck was both widespread and ‘not publicly defensible’.
Most Facebook employees were allowed to add people to Xcheck, and 45 teams from across the world had been involved in whitelisting practices. An internal guide to Xcheck eligibility stated that users who were “newsworthy,” “influential or popular,” or “PR risky” could be added. Users were not notified that they had been added to the whitelist.
While there were rough guidelines on who belonged in Xcheck, Facebook had no clear-cut rules or strict criteria for whitelisting. The Xcheck whitelist is “scattered throughout the company, without clear governance or ownership,” according to Facebook’s ‘Get Well Plan’ from 2020 accessed by WSJ.
Did Facebook mislead its own Oversight Board about Xcheck?
In May this year, Facebook’s Oversight Board upheld the company’s decision to suspend Donald Trump’s account. The board made 19 recommendations for future action in its verdict, one of which was to report on error rates and consistency of determinations made through the Xcheck process, as opposed to ordinary enforcement procedures.
A month after these recommendations, Facebook told the Oversight Board that Xcheck was used for only a small number of decisions, and declined to follow the recommendation. “It’s not feasible to track this information,” Facebook wrote in its responses.
In a written statement to WSJ, a spokesperson for the Oversight Board mentioned that the board “has expressed on multiple occasions its concern about the lack of transparency in Facebook’s content moderation processes, especially relating to the company’s inconsistent management of high-profile accounts.”
Facebook identified issues itself, is working to phase Xcheck out: Spokesperson
When asked about Xcheck, Facebook spokesperson Andy Stone told WSJ that criticism of Facebook was fair, but Xcheck was designed “to create an additional step so we can accurately enforce policies on content that could require more understanding.” Stone also emphasised that Facebook found these issues itself, and is working to resolve them:
A lot of this internal material is outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross check and has been working to address them – Facebook spokesperson Andy Stone
After the WSJ report was published, Stone took to Twitter and said, “In the end, at the center of this story is Facebook’s own analysis that we need to improve the program. We know our enforcement is not perfect and there are tradeoffs between speed and accuracy.”
- Facebook Oversight Board Quarterly Report: What Suggestions The Company Has And Hasn’t Implemented
- Facebook Oversight Board Upholds Decision To Suspend Donald Trump, With A Caveat
- After Oversight Board, Facebook Mulls Similar Body For Global Election Matters