Decisions made by Facebook’s oversight board – which is being developed – will be binding on Facebook, and cannot be overruled by Facebook or CEO Mark Zuckerberg. The company revealed further details of its oversight board on September 17.
The oversight board has been in the works since November 2018. Zuckerberg had said in February that Facebook shouldn’t have the sole power to control content on the platform. Facebook has now pledged that the oversight body will be operational by November 2020. The board will review content moderation decisions on Facebook; initially, cases can be brought by Facebook itself, and by users at a later stage.
What the board will do: Once the board is fully staffed, it will be in charge of adjudicating appeals from users whose content has been removed from Facebook’s platforms. It will also make judgements on cases referred to it by Facebook itself. For now, the board will begin operations by hearing Facebook-initiated cases, and users will be able to appeal to the board by the first half of 2020.
What content decisions will the board deal with: Facebook will refer cases that are either
- “significant” i.e. having real-world impact in terms of severity, scale and relevance to public discourse, or
- cases with “difficulty” i.e. when the content is disputed or raises questions about current policy or its enforcement.
Users can refer cases after they have appealed directly to Facebook. To submit a complaint to the board, a user must first have exhausted all direct appeals that are part of Facebook’s moderation system. Whether users could testify in person is still up for consideration by the future board.
How will the board select cases? The board will have a selection committee of five rotating members, who will select and assign cases to panels. At least one member of each five-member panel will be from the region where the case originated. Each panel will prepare a written decision and submit it to the full board for review. Summaries of which cases were selected and which weren’t will be published in the board’s transparency report.
What’s the membership of the board like? The board will have a minimum of 11 members, and 40 members once it’s fully staffed. Each member will serve for a maximum of nine years, divided into three 3-year terms. Facebook will select a small group of initial members, who will then select the other members. Members’ names and moderation decisions will also be made available in a public online database.
What kind of members will the board have? Board members should be well-versed with matters relating to digital content and governance, including free expression, civic discourse, equality, safety, privacy and technology.
Facebook will screen members for conflicts of interest: Members must not have conflicts of interest that could compromise their decision-making, Facebook said. Such people would include (but are not limited to): a current or former Facebook employee, or their spouse; a current government official or lobbyist working on behalf of any government; a high-ranking official within a political party; or a significant shareholder of Facebook.
The oversight board will have people from varied backgrounds: Apart from having a mix of professionals, the board will have a “broad diversity” of geographic, gender, political, social and religious representation and perspectives.
What can the board ask Facebook to do? The board can request Facebook to provide information for board deliberations, interpret Facebook’s content policies and community standards, instruct Facebook to keep or remove content, instruct Facebook to uphold or reverse a designation that led to an enforcement outcome, and issue prompt written explanations of the board’s decisions.
“The board’s decisions will be binding, even if I or anyone at Facebook disagrees with it,” said Mark Zuckerberg. The panel will decide whether the content in question should be allowed on or removed from Facebook, according to Facebook’s stated policies.
- The board may issue policy recommendations to Facebook, as part of its overall judgment on each individual case.
- The board can provide policy guidance, specific to a case decision or upon Facebook’s request, on Facebook’s content policies.
Facebook’s implementation of board decisions: Facebook will then implement the board’s decision; the board can also request Facebook to apply its decision to other instances or reproductions of the same content. If a board decision includes a policy recommendation, Facebook will ‘consider’ the recommendation.

The board will make case information public: Each decision will be made publicly available and archived in a database of case decisions on the board’s website, subject to data and privacy restrictions.