Facebook’s Oversight Board will start accepting appeals on content that users believe should be removed. The Board, which acts as a court of appeals of sorts for the social media giant, has so far only accepted appeals from users whose content has already been taken down and who are seeking its reinstatement.
Facebook’s Oversight Board was first proposed in November 2018, in an attempt to address the growing criticism around Facebook’s lacklustre content moderation effort. The Board would review content moderation decisions by Facebook, and its decisions would be binding on the company. In the six months that it has been operational, the Board has received more than 300,000 user appeals.
How can one appeal for a content takedown? Users must first request a takedown through Facebook’s own reporting system. If they are not satisfied with Facebook’s decision, they can then approach the Board.
“After you have exhausted Facebook’s appeals process, you will receive an Oversight Board Reference ID in your support inbox and can appeal the decision to the Board. You can appeal decisions on posts and statuses, as well as photos, videos, comments and shares.” — Oversight Board statement
The Board will select cases that are of “critical importance to public discourse” and affect many users. It can combine appeals on the same content on Facebook and Instagram. A five-member panel will then deliberate on the case and issue a judgement within 90 days.
MediaNama’s take: Oversight Board is a glorified content moderator, not an answer to Facebook’s problems
The Oversight Board, by accepting user appeals requesting content takedowns, will essentially act as a glorified content moderator for Facebook. It is virtually another level in Facebook’s content moderation operations hierarchy.
That Facebook has a content moderation problem is well known. The most notorious example was in Myanmar, where Facebook was used as a potent tool to propagate hatred against the country’s Rohingya Muslim minority, contributing to a genocide and mass displacement. A key reason was that Facebook did not have enough content moderators who could read Burmese.
This fundamental problem persists: in June 2020, a report by the NYU Stern School of Business estimated that Facebook made around 300,000 erroneous content moderation decisions on a typical day. The report suggested that Facebook needed to at least double its moderation workforce.
Last September, former Facebook employee Sophie Zhang chronicled her struggles at the company in a 6,600-word memo, accusing it of not paying enough attention to fake accounts spewing hateful content across the world, including in India. Zhang said Facebook’s leadership seemed to lack the desire to protect democratic processes in smaller countries, and to prioritise PR over solving real problems.
The Board doesn’t address the fundamental issue of Facebook’s lower-than-ideal investment in content moderation. The Board can only do so much — it presently comprises just 19 members (the 20th member, Pamela Karlan, left to join the Biden administration). Given that the Board has received around 300,000 appeals, it cannot practically consider them all, something its co-chair Catalina Botero-Marino admitted back in October 2020; the Board has issued only eight decisions so far.
The “Real Facebook Oversight Board”, an independent body set up to counter the Board, termed the latest development a “distraction”. “Facebook is still refusing to take responsibility for dangerous and false content that continues unabated across its platforms,” it said in a statement, calling the Board a “$130 million PR tool”.
“Instead of asking hard questions about how its platform was used to facilitate an insurrection, it has set up a pseudo-court of appeal. The Oversight Board is simply a way of outsourcing responsibility to a third party with highly limited powers. It will adjudicate high-profile individual cases as a fundamental distraction from the core problem of Facebook’s failure to address the systemic failures of its business model and algorithms,” it said.
- Oversight Board, Facebook’s Independent Appeals Body, Goes Live
- Facebook Oversight Board Announces Decisions On First Batch Of Cases, Criticises Company’s Opaque Content Moderation
- ‘Won’t Shy Away From Tough Cases’: Facebook Oversight Board In Response To Questions About Platform’s Bias
- Former Facebook Employee’s Memo Indicates Company’s Inability, Lack Of Interest In Policing Disinformation Campaigns Worldwide: Report