
What are the problems highlighted by Meta’s Oversight Board in its Cross-Check System?

Several issues highlighted with the cross-check content moderation system, including the prioritisation of business interests over human rights

FILE PHOTO: A 3D-printed logo of Meta, Facebook's rebranded name, placed on a laptop keyboard in this illustration taken on November 2, 2021. REUTERS/Dado Ruvic/Illustration/File Photo

“In our review, we found several shortcomings in Meta’s cross-check program. While Meta told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” read the report by Meta’s Oversight Board reviewing the company’s cross-check (XCheck) programme.

The Board conducted the review at Meta's request, after a report in The Wall Street Journal, published in October 2021, highlighted deficiencies in the system. MediaNama has reviewed a copy of the Board's report to prepare this summary.

“The Board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm,” the report continued.

The board said that Meta failed to track data on whether the system results in more accurate decisions. It also expressed concern about the lack of transparency around the programme in addition to making recommendations to improve the programme.

Primer on the Oversight Board: It is an independent body that people can appeal to if they disagree with content moderation decisions made by the company on Facebook or Instagram.

How does XCheck work: The argument put forth by Meta is that it deals with huge volumes of content and is therefore bound to make mistakes, such as removing content that does not violate its policies. The cross-check programme tries to address this issue with an additional layer of human review for “certain” posts initially identified as breaking its rules (see the sketch after the list below).

  • The content posted by users on Meta’s cross-check lists is not immediately removed as it would be for most people, but is left up, pending further human review called ERSR (Early Response Secondary Review).
  • The programme’s scope was widened in late 2021 to include certain posts flagged for further review (General Secondary Review [GSR]) based on the content itself, rather than the identity of the person who posted it.
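
The report describes this routing only in prose. As a rough illustration, the decision flow might look like the following sketch; the names here (route_flagged_post, ersr_list, meets_gsr_criteria) are hypothetical stand-ins, not Meta’s actual systems or code.

```python
# A minimal sketch of the cross-check routing described above -- not Meta's
# actual implementation. Lists, criteria, and outcomes are assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Post:
    author_id: str
    text: str

def route_flagged_post(post: Post,
                       ersr_list: set[str],
                       meets_gsr_criteria: Callable[[Post], bool]) -> str:
    """Decide what happens to a post that automated systems flagged as violating."""
    if post.author_id in ersr_list:
        # Early Response Secondary Review: selection is entity-based, and the
        # post stays up while it waits for human review.
        return "queue for ERSR; content stays up pending review"
    if meets_gsr_criteria(post):
        # General Secondary Review (added in late 2021): selection is based on
        # the content itself rather than on who posted it.
        return "queue for GSR; content stays up pending review"
    # Everyone else: the flagged post is enforced against immediately.
    return "remove immediately"

# Example: a listed business partner gets the delayed path the Board criticised.
post = Post(author_id="partner_42", text="...")
print(route_flagged_post(post, ersr_list={"partner_42"},
                         meets_gsr_criteria=lambda p: False))
```

Under this flow, the first two branches leave flagged content visible for however long the review queue takes, which, as the report notes later, can exceed five days on average.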

Why it matters: The report is crucial as it offers critical insight into the flaws that plague the cross-check system, which sits within a moderation operation that makes over 100 million decisions on content every day. The programme had come under a lot of criticism after the system was found to favour high-profile users. Moreover, the company was accused of misleading its Oversight Board by telling it that the programme only impacted “a small number of decisions”.


Key takeaways from the report

“Meta’s content moderation mistakes include over-enforcement and under-enforcement, meaning that Meta both removes non-violating content and fails to remove violating content,” the report concluded.

The board said that it analysed the cross-check system in “light of Meta’s human rights commitments and stated values, raising important questions around how Meta treats its most powerful users”. It wrote that while a content review system must treat all users fairly, the cross-check programme “responds to broader challenges in moderating immense volumes of content”. Here are some of the flaws highlighted in the report:

Broad scope to serve contradictory objectives: The report noted that, according to Meta, the cross-check programme serves a “core business function”, playing an “important role in managing Facebook’s relationships with many of (its) business partners.” It shed light on the correlation between the cross-check tag sensitivity framework and the “degree of reputational and internal backlash that is anticipated if particular content is removed in error”.

  • “Correlating highest priority within cross-check to concerns about managing business relationships suggests that the consequences that Meta wishes to avoid are primarily business-related and not human rights-related,” the board said.
  • “Meta has told the Board that it has no comprehensive system in place to systematically assess which journalists, human rights defenders or civil society figures in a particular geography should be subject to ERSR. This raises the risk that significant gaps and inconsistencies exist in terms of who is afforded the added layers of protection for expression that cross-check ERSR provides,” the report said.

Unequal treatment of users: “Cross-check grants certain users greater protection than others,” the report concluded, adding: “If a post from a user on Meta’s cross-check lists is identified as violating the company’s rules, it remains on the platform pending further review. Meta then applies its full range of policies, including exceptions and context-specific provisions, to the post, likely increasing its chances of remaining on the platform”.

  • The Board said that Meta “focuses disproportionate attention on more lucrative markets, instead of focusing on contexts with greater risks to human rights, including freedom of expression”. For example, 42 percent of content reviewed through the ERSR pathway originated from the US or Canada.

Delayed removal of violating content: “When content from users on Meta’s cross-check lists is identified as breaking Meta’s rules and while undergoing additional review, it remains fully accessible on the platform,” the report contended. “ERSR and GSR eligibility persistently exceeds the human review capacity Meta allocates to the cross-check program,” the report continued. These flaws compound the disparities in treatment of different users on the platform.

  • “Privileged users enrolled in ERSR have more chances to be reviewed by a moderator who may apply context to uphold their content, have a greater range of policy exceptions that can apply to uphold their content, and benefit from a system where even violating content is guaranteed viewership for some period of time,” the report argued.
  • Meta also said that it can take more than five days on average to reach a decision on content from users on its cross-check lists.

Failure to track core metrics: “The metrics that Meta currently uses to measure cross-check’s effectiveness do not capture all key concerns,” the report summarised. The board was concerned that Meta did not provide information on whether decisions made through cross-check are more or less accurate than those made through its normal quality control mechanisms. It also said that decisions made by contracted reviewers are concerning because these reviewers do not have the same access or training as Meta employees.

Lack of transparency: “The Board is concerned about the limited information Meta has provided to the public and its users about cross-check,” the report argued. The board said that Meta does not inform users that they are subject to ERSR. It also does not inform users when they report content posted by a cross-checked entity, the board said.

What were its recommendations?

The Board made a total of 32 recommendations to Meta, and below is a summary of some of its main inputs.

Prioritise human rights: The board said that Meta should prioritise expression that is important for human rights, including expression which is of special public importance. Furthermore, users that are likely to produce this kind of expression should be prioritised for inclusion in lists of entities receiving additional review above Meta’s business partners, the board added.

  • It suggested that posts from users who are likely to produce content dealing with human rights should be reviewed in a separate workflow, so they do not compete with Meta’s business partners for limited resources.
  • It asked Meta not to rely on the number of followers as the sole criterion for receiving additional protection.
  • “If users included due to their commercial importance frequently post violating content, they should no longer benefit from special protection,” the board said in its report.
  • “The content posted by entities that Meta should include based on human rights concerns should be reviewed by teams with context and language expertise,” read the report, which added that these teams should not report to public policy or government relations teams, or to those in charge of relationship management with any affected users.

Increase transparency around cross-check operations: “Meta should measure, audit, and publish key metrics around its cross-check program so it can tell whether the program is working effectively,” the board wrote in the report.

  • “The company should set out clear, public criteria for inclusion in its cross-check lists, and users who meet these criteria should be able to apply to be added to them.”
  • It called for publicly marking the accounts of certain categories of entities protected by cross-check, including state actors, political candidates and business partners. It also proposed that Meta should ensure that cross-checked content, and all other content, can be appealed to the Board.
  • Another recommendation directed Meta to reserve a minimum amount of review capacity for teams that can apply all content policies, in order to increase the impact of its content-based false-positive prevention system.
  • The report said that Meta should provide clarity regarding appeals eligibility and ensure that content which does not reach the highest level of review can still be appealed internally.
  • “Meta must guarantee that it is providing an opportunity to appeal to the Board for all content the Board is empowered to review under its governing documents, regardless of whether the content reached the highest levels of review within Meta,” the report concluded.

Reduce harm caused by content left up during review: “Content identified as violating during Meta’s first assessment that is high severity should be removed or hidden while further review is taking place,” the board said, suggesting methods like downranking, limiting virality, hiding, or temporarily removing the content (a sketch of such severity-gated interim measures follows the list below).

  • “Meta should invest the resources necessary to match its review capacity to the content it identifies as requiring additional layers of review,” the report added, clarifying that the company should not have the algorithm select less content.
  • “More generally, the Board notes that providing greater transparency to external researchers, in particular access to data, is an essential component of oversight for mistake-prevention systems,” read a suggestion by the board. It said that independent researchers could provide Meta with valuable insights on the impacts of its choices.
  • The board also asked Meta to monitor its activities which impact rights on a periodic basis, and said that Meta should provide the public with information about how this system is functioning.
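
Taken together, these recommendations suggest gating the interim treatment of flagged content on severity rather than leaving everything up. A minimal sketch of such a policy follows; the severity scores and thresholds are illustrative assumptions, not figures from the report.

```python
# A minimal sketch of severity-gated interim measures for content awaiting
# cross-check review, per the Board's recommendation. The threshold values
# below are illustrative assumptions, not numbers from the report.
def interim_action(severity: float) -> str:
    """Choose what to do with flagged content while human review is pending."""
    if severity >= 0.8:
        # High severity: do not leave the post fully visible during review.
        return "hide or remove pending review"
    if severity >= 0.5:
        # Medium severity: keep the post up, but curb its reach.
        return "downrank and limit virality"
    # Low severity: leave up pending review -- the default behaviour the Board
    # criticised when applied to high-severity content.
    return "leave up pending review"

for score in (0.9, 0.6, 0.2):
    print(f"severity {score}: {interim_action(score)}")
```

The report does not specify the exact measures or thresholds Meta should use; the point of the recommendation is simply that high-severity content should not remain fully visible while it waits for review.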

This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.


Written By

I cover several beats such as Crypto, Telecom, and OTT at MediaNama. I can be found loitering at my local theatre when I am off work consuming movies by the dozen.
