Facebook removed 8.7 million pieces of content that violated the network’s policy against child abuse, and claimed that 99% of this content was removed before anybody reported it. It did not specify the period over which this content was removed. The company trains specialized teams to review and report such content, including people with backgrounds in law enforcement, online safety, forensics, and related fields.

Facebook said that it uses an AI and machine learning tool that automatically flags content containing child nudity. It said it will collaborate with Microsoft and others to build tools to detect “grooming”, a term for gaining the trust of children in order to sexually exploit them.
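Facebook has not published how this tool works. As a rough illustration only, automated flagging systems of this kind typically score each uploaded image with a trained classifier and queue high-scoring images for human review rather than removing them outright. Everything in the sketch below, the `score_image` stand-in, the threshold value, and the queue, is a hypothetical assumption, not Facebook’s actual system.

```python
# Hypothetical sketch of an automated image-flagging pipeline.
# None of these names come from Facebook; they are illustrative only.
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.85  # assumed cutoff; real systems tune this carefully

@dataclass
class FlaggedImage:
    image_id: str
    score: float  # classifier's confidence that the image violates policy

def score_image(image_bytes: bytes) -> float:
    """Stand-in for a trained classifier (e.g. a CNN) that returns a
    probability that the image contains policy-violating content."""
    raise NotImplementedError("replace with a real model")

def flag_for_review(image_id: str, image_bytes: bytes, queue: list) -> None:
    score = score_image(image_bytes)
    if score >= REVIEW_THRESHOLD:
        # High-scoring hits go to human moderators rather than being
        # removed automatically, since classifiers produce false positives.
        queue.append(FlaggedImage(image_id, score))
```

Routing flagged images to humans instead of auto-deleting is the usual design choice here, precisely because of false positives like the one discussed under “Loopholes in tool” below.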

Facebook said its moderators flag content and then report it to the National Center for Missing and Exploited Children (NCMEC). The company also requires users to be at least 13 years old to have a Facebook account, and takes action on non-sexual content such as pictures of children in a bath.
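The flow described above, automated flagging, human review, then escalation to NCMEC, can be pictured as a small pipeline. This is an assumption-laden illustration: `submit_ncmec_report` and the decision states are invented for this example and do not correspond to any published Facebook or NCMEC API.

```python
# Illustrative moderation-escalation flow; all names are hypothetical.
from enum import Enum, auto

class Decision(Enum):
    NO_VIOLATION = auto()
    REMOVE = auto()             # violates policy but is not reportable material
    REMOVE_AND_REPORT = auto()  # apparent child exploitation: escalate to NCMEC

def submit_ncmec_report(image_id: str, metadata: dict) -> None:
    """Hypothetical stand-in for filing a CyberTipline report with NCMEC."""
    print(f"Reported {image_id} to NCMEC with metadata {metadata}")

def handle_moderator_decision(image_id: str, decision: Decision,
                              metadata: dict) -> None:
    if decision is Decision.NO_VIOLATION:
        return
    # Content removal happens in all violation cases (removal step omitted).
    if decision is Decision.REMOVE_AND_REPORT:
        submit_ncmec_report(image_id, metadata)
```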

The company works with 400 third-party organizations, companies, and individuals to prevent such content from being shared on Facebook.

The NCMEC said that it expected to receive 16 million child abuse tip-offs from tech companies this month. It will work with Facebook to develop software to prioritize how cases are reported to law enforcement.
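Neither NCMEC nor Facebook has described how that prioritization software would work. One common approach to triage of this kind is to compute a severity score per case and sort the queue by it; the fields and weights below are invented purely for illustration and are not NCMEC’s method.

```python
# Hypothetical case-triage sketch; fields and weights are assumptions.
from dataclasses import dataclass

@dataclass
class Case:
    case_id: str
    apparent_victim_age: int   # estimated age; lower ages are more urgent
    is_ongoing: bool           # signs the abuse may still be happening
    offender_identified: bool  # an identified suspect speeds up referral

def severity_score(case: Case) -> float:
    """Illustrative triage heuristic, not NCMEC's actual scoring."""
    score = max(0, 18 - case.apparent_victim_age) * 2.0
    if case.is_ongoing:
        score += 50.0
    if case.offender_identified:
        score += 10.0
    return score

def prioritize(cases: list[Case]) -> list[Case]:
    # Most urgent cases first, so law enforcement sees them soonest.
    return sorted(cases, key=severity_score, reverse=True)
```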

Child abuse content rampant on Facebook

  • According to the Guardian, the tool will also be applied to Instagram. Last month, Business Insider reported that Instagram’s newly launched video platform IGTV recommended disturbing videos to users, “including what appeared to be child exploitation and genital mutilation.” The publication monitored the app’s recommendations for three weeks to find this content.
  • Last year, the BBC reported that Facebook users were able to continue exchanging sexualized content about children via groups. The BBC reported 100 images, of which Facebook blocked only 18, saying the rest did not violate its community standards.
  • An older BBC investigation from 2016 revealed that pedophiles were exchanging child pornography via secret Facebook groups. Facebook said at the time that it would improve its flagging and review system.

See Facebook’s internal manual on non-sexual child abuse content here.

Loopholes in the tool

Facebook’s machine learning tool has its loopholes: two years ago, Facebook blocked a Pulitzer Prize-winning photograph of Vietnamese children fleeing a napalm attack; the picture features a nude child, burnt and running from the scene. Facebook then said that it would make exceptions for art- and history-related content.
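Facebook has not said how such exceptions are implemented. One plausible pattern is to let a reviewer-applied exemption label, for newsworthy or historical material, override the classifier’s decision; everything in this sketch, including the label names, is assumed for illustration.

```python
# Hypothetical exemption-override rule; labels and threshold are assumptions.
from enum import Enum, auto

class Exemption(Enum):
    NONE = auto()
    NEWSWORTHY = auto()  # e.g. documentary or news photography
    HISTORICAL = auto()  # e.g. the 1972 "Napalm Girl" photograph

# Labels a human reviewer can apply to override an automated block.
ALLOWED_EXEMPTIONS = {Exemption.NEWSWORTHY, Exemption.HISTORICAL}

def should_block(classifier_score: float, exemption: Exemption,
                 threshold: float = 0.85) -> bool:
    """Illustrative decision rule: block on a high score unless a
    reviewer has attached a recognized exemption label."""
    if exemption in ALLOWED_EXEMPTIONS:
        return False
    return classifier_score >= threshold
```

Under a rule like this, the napalm-attack photograph would score high with the classifier but escape blocking once tagged as historical, which matches the exception Facebook described.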

Facebook lists how and where to report images and videos circulating on the platform so that they can be blocked or taken down. The company also lists country-wise organizations, both governmental and non-governmental, that can assist those who have witnessed child abuse or been victims of it.