In the run-up to and during Myanmar’s general election, Facebook will remove misinformation around the election. For instance, it said it will remove posts falsely claiming that a candidate is Bengali rather than a Myanmar citizen, and hence ineligible to run. Facebook is also limiting forwards on Messenger to five at a time, starting with Myanmar.

Facebook said on Tuesday that its expanded misinformation policy in Myanmar covers content that could suppress voters or damage the integrity of the electoral process. Between now and November 22, it will remove “verifiable misinformation and unverifiable rumours” of this kind.

This election will be only the second general election in Myanmar’s over 70-year history. Between the last general election in 2015 and now, the country’s military led a crackdown in 2017 that forced over 700,000 Rohingya Muslims to flee the country. United Nations investigators said Facebook played a key role in spreading hate speech against the Rohingya that fuelled the violence. In 2018, Facebook admitted that it was used to “foment division and incite offline violence” and that it could have done more.

An April 2018 Reuters investigation found more than 1,000 examples of posts attacking the Rohingya or other Myanmar Muslims. Most were in Burmese, and some had remained undetected for as long as six years. As of June 2018, Facebook had only 60 people reviewing reports of hate speech posted by Myanmar’s then 18 million active Facebook users.

Between April and June 2020, Facebook said, it took action against 280,000 pieces of content in Myanmar for violating its hate speech policies, up from 51,000 pieces in the first quarter of the year.

Other steps Facebook will take for the Myanmar election

  1. Adding “Paid for by” disclaimers to political and electoral ads, and archiving ads about social issues, elections, or politics in the Ad Library
  2. Limiting Messenger forwards: a message can be forwarded only five times on Messenger. This is a tried-and-tested way of adding friction to the platform; it has been rolled out in Myanmar and will be rolled out worldwide in the coming weeks.
  3. Verifying the official pages of political parties: 40 parties have been given a verified badge so far. This is being done with two partners in Myanmar.
  4. Adding context when potentially harmful images over a year old are shared: users will see a warning when they try to share specific types of images, including those that are more than a year old or that are violent. These warnings are added using both automated tools and human review.

It’s worth noting that Facebook’s announcements are not always fully enforced. For instance, Facebook reportedly declined to act after discovering that India’s ruling Bharatiya Janata Party was circumventing Facebook’s political ad transparency requirements during the 2019 general elections. The BJP had reportedly used newly created organisations to buy ads worth hundreds of thousands of dollars without disclosing party affiliation, in violation of Facebook’s rules. Instead of taking down the pages or flagging the ads, Facebook raised the matter privately with the BJP, the report noted. As for limiting forwards, the measure is intended to add friction to the platform, as has been done on WhatsApp globally. However, it is an imperfect system, since text can simply be copy-pasted instead of being “forwarded”.

Facebook said that it recognises “Myanmar’s complex social and political context” and is “sensitive to the tumultuous changes and the serious violence that took place since the country’s last election in 2015”. The company said it has worked over the past few years “to better understand” how it is used in Myanmar.