Facebook has purged a network of 800 groups related to the far-right, pro-Trump conspiracy theory group QAnon, saying that the accounts violated its newly broadened policy against dangerous groups and individuals. Earlier this month, Facebook had also removed a QAnon group with 200,000 members.
Who will this wider policy cover? Facebook said that its existing policy around dangerous organisations and individuals will now address organisations and movements that pose significant risks to public safety but are not dangerous enough to be banned outright from Facebook’s platforms. It will allow content in support of QAnon and similar groups, but will restrict “their ability to organize on our platform”.
Facebook will reduce such groups’ visibility: Apart from removing pages and groups, Facebook will limit their traction and spread by restricting recommendations, demoting them in Search results, and reviewing related hashtags. More importantly, it will also prohibit such pages from running ads, featuring on Marketplace and Shop, and fundraising on Facebook. “We will also remove Pages, Groups and Instagram accounts where we identify discussions of potential violence, including when they use veiled language and symbols particular to the movement to do so,” Facebook said.
What is QAnon? QAnon spreads misinformation around a “deep state” conspiracy against President Donald Trump, and promotes the idea that Satan-worshipping pedophiles secretly run the world. According to NBC News, the group has been linked to several violent criminal incidents, including a murder and kidnappings.
The group has reportedly exploded in popularity on Facebook and Instagram since the coronavirus pandemic began, and is rife in private Facebook groups. A year ago, the FBI said that QAnon and similar conspiracy theory groups could pose a potential domestic terrorism threat.
Facebook also removed Antifa-related activity: “For militia organizations and those encouraging riots, including some who may identify as Antifa, we’ve initially removed over 980 groups, 520 Pages and 160 ads from Facebook”, the company said.
The numbers: Facebook said it has removed over 790 groups and 100 pages tied to QAnon on Facebook, along with 1,500 such ads. It has restricted almost 2,000 groups and 440 pages on Facebook, and 10,000 accounts on Instagram. It has blocked over 3,000 hashtags on Facebook and Instagram, and restricted 1,400 hashtags on Instagram. QAnon content on Facebook tends to come from a smaller number of groups with more members each, while on Instagram it is spread across more accounts with fewer followers each.
Twitter banned QAnon: Last month, Twitter banned thousands of accounts posting about QAnon, and took steps to restrict the spread of such content.
We’ve been clear that we will take strong enforcement action on behavior that has the potential to lead to offline harm. In line with this approach, this week we are taking further action on so-called ‘QAnon’ activity across the service.
— Twitter Safety (@TwitterSafety) July 22, 2020
- Facebook is looking for an external auditor to assess its community standards enforcement metrics; MediaNama’s take