Facebook has removed 89 Facebook accounts, 107 Facebook Pages, 15 Facebook Groups, and five Instagram accounts for engaging in “coordinated inauthentic behaviour” originating in Myanmar; some of the individuals behind this activity were associated with the Myanmar military.

Facebook said that the people behind this network created fake accounts, identified through an internal investigation, to promote their content, and “repurposed legitimate news and entertainment content”. The accounts also posted content about national and local topics such as crime, ethnic relations, celebrities, and the military.

  • Followers: According to the blog, nearly 900,000 accounts followed one or more of these Pages, about 67,000 accounts joined at least one of these Groups, and around 400 accounts followed one or more of these Instagram accounts.
  • Advertising: According to Facebook, less than $1,200 had been spent on Facebook and Instagram ads, paid for in US dollars and Russian rubles.

Below are some examples of the content posted by the pages:

Facebook had previously removed three other networks in Myanmar for coordinated inauthentic behaviour

According to the social media company, it has suspended three other networks for engaging in coordinated inauthentic behaviour in Myanmar since 2018. In December 2018, the company took down 425 Facebook Pages, 17 Facebook Groups, 135 Facebook accounts and 15 Instagram accounts in Myanmar for this behaviour. The Pages had links to the Myanmar military.

In June, Facebook said it had started to reduce the distribution of content in Myanmar from people who had consistently violated its community standards, and that it would use “learnings” to explore expanding this approach to other markets. “By limiting visibility in this way, we hope to mitigate against the risk of offline harm and violence,” Facebook’s Samidh Chakrabarti, director of product management and civic integrity, and Rosa Birch, director of strategic response, wrote in the blog post. In cases where it identifies individuals or organisations that “more directly promote or engage violence,” the company said it would ban those accounts. Facebook said it has also extended its use of AI to recognise posts that may contain graphic violence and comments that are “potentially violent or dehumanising”.

Facebook has suspended accounts worldwide for similar behaviour

  • In August, Facebook removed five accounts, seven pages and three groups run by individuals associated with the Chinese government, which frequently posted content on political issues such as the ongoing protests in Hong Kong. The Facebook pages were followed by more than 15,000 accounts, the company said.
  • In July this year, Facebook removed multiple pages, groups and accounts originating from Thailand, Russia, Ukraine and Honduras for engaging in “coordinated inauthentic behaviour” on the platform and on Instagram. The company purged these accounts following an internal investigation, in certain cases with help from local security forces.
  • In May, Facebook took down multiple Russian pages, groups, and accounts on Facebook and Instagram. Facebook found that these pages focused on Ukraine, posting content on Eastern Ukraine, Russian politics, political news in Europe, Ukrainian politics and the Syrian civil war.
  • In April, Facebook removed 687 pages and accounts linked to India’s main opposition Congress party because of “coordinated inauthentic behaviour” on the social media platform. The company wrote in a blog post: “We removed 687 Facebook Pages and accounts that engaged in coordinated inauthentic behaviour in India and were linked to individuals associated with an IT Cell of the Indian National Congress (INC).”