Facebook’s Oversight Board (OSB), an independent body that reviews moderation decisions by the social media giant, has chosen six cases — concerning hate speech by politicians, nudity, politico-religious speech, and “dangerous organisations” — as the first batch it will review in the coming days. None of these cases concerns India directly.
The OSB was first proposed in November 2018, in response to several years of criticism faced by Facebook over its unsatisfactory content moderation activity. It is meant to be an independent body that can review moderation decisions taken by Facebook. The first 20 members of the board were announced in May this year. Facebook CEO Mark Zuckerberg has claimed the OSB’s decisions cannot be overruled by his company.
The OSB had started accepting appeals from Facebook users in October. In its latest announcement, it said it had received more than 20,000 appeals so far. “As the Board cannot hear every appeal, we are prioritising cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies,” it said.
Interestingly, the OSB has not taken up any case concerning the accusations of political bias against Facebook India executives. The OSB had earlier told MediaNama that cases such as these were what it expected to consider as it began operating.
Five of the cases being considered by the OSB were referred via user appeals; the sixth one was referred by Facebook itself.
- A user submitted Facebook posts and tweets made by Mahathir Mohamad, former prime minister of Malaysia. Mohamad had recently sparked outrage in the aftermath of terrorist attacks in France, when he wrote that “Muslims had a right to be angry”. The user supposedly wanted to raise awareness of Mohamad’s “horrible words”. While it isn’t clear what the user expects the OSB to do, the OSB will likely weigh in on how Facebook should moderate hate speech by world leaders, something Facebook has gotten into trouble for in the past.
- A user has appealed against the removal of posts in Burmese that asked why there was “no retaliation against China for its treatment of Uyghur Muslims, in contrast to the recent killings in France related to cartoons”. Facebook took the posts down for violating its “hate speech” policy; the user, however, has told the OSB that the post was meant to express empathy on human rights grounds.
- A user appealed against the removal of alleged historical photos showing churches in Baku, Azerbaijan, asking where the churches had gone, and claiming that the city of Baku was built by Armenians. Armenia and Azerbaijan had just concluded a bitterly fought war. The post was taken down for violating Facebook’s “hate speech” policy, but the user said their intention was only to demonstrate the destruction of cultural and religious monuments.
- An Instagram post by a user in Brazil was deleted for showing pictures of visible and uncovered female nipples. The post was supposedly meant to raise awareness of breast cancer. The user has appealed to the OSB against its removal.
- A user in the United States appealed against the removal of a post with an alleged quote by Joseph Goebbels of Nazi Germany, on the need to appeal to emotions and instincts instead of intellect. The post was taken down for violating Facebook’s “dangerous individuals and organisations” policy; the user, however, has told the OSB that the quote was important because they considered the US presidency to be a “fascist” one.
- Facebook itself referred a case from France, in which it had taken down a post criticising a French government health body for not authorising hydroxychloroquine as a COVID-19 treatment. Facebook told the OSB that this case is an example of the challenges it faces while moderating misinformation during the pandemic.
Each of the cases will be assigned to a five-member panel, with at least one member from the geographical region the case pertains to, the OSB said. Each case will be decided, and the decision implemented by Facebook, within 90 days. The OSB is seeking comments from the general public on each case.
As of now, the OSB will only hear appeals in matters where Facebook has already taken content moderation decisions. Users will not be able to flag content they want removed, a feature that will be introduced in the coming months, the OSB had earlier said.
- Oversight Board, Facebook’s independent appeals body, goes live
- ‘Won’t shy away from tough cases’: Facebook Oversight Board in response to questions about platform’s bias