The Technology Coalition — a global group of 18 technology companies including Amazon, Apple, Microsoft, Facebook, Google, PayPal, Snapchat, Adobe, GoDaddy and others — announced a new plan to combat online child sexual abuse on June 10. The plan, called Project Protect, will include a multi-million dollar investment in an Innovation Fund for cross-industry technology that can fight child sexual exploitation and abuse (CSEA) online, and a multi-stakeholder approach to dealing with CSEA, with stakeholders including governments, law enforcement, civil society, research centres, hotlines, first responders, social workers, educators, and others.
For this plan, the Coalition has partnered with the Global Partnership to End Violence Against Children (EVAC) and the WePROTECT Global Alliance (WPGA). The Coalition will act as a resource for the entire technology industry in the fight against CSEA. EVAC will head the project's research arm, whose research will be conducted and analysed independently of the Coalition and its members.
As of now, the project is focussed on putting in place a structure for the project, membership models, and hiring.
The announcement of Project Protect comes at a time when there has been an increase in the volume of child sexual abuse material (CSAM) created and shared online. India accounts for the highest number of suspected online child exploitation reports received by the USA’s National Center for Missing and Exploited Children’s (NCMEC) CyberTipline. Facebook accounted for 94% of all reports made to NCMEC by electronic service providers. Even in the UK, April 2020 saw a significant spike in the number of attempts made to access known child sexual abuse imagery.
In light of the massive share of Facebook, and especially Messenger, in CSAM reports to NCMEC, 129 signatories had urged the company in February 2020 to halt its plans to introduce end-to-end encryption on Facebook’s messaging platforms, and their subsequent integration. The governments of the USA, UK and Australia had written an open letter to Facebook asking it not to introduce end-to-end encryption, at least not without backdoors for law enforcement, citing terrorism and online child sexual exploitation as their reasons. In response, Facebook and WhatsApp had flatly refused to build backdoors, and were supported by 58 civil society organisations around the world.
End-to-end encryption, which is already the default on Facebook-owned WhatsApp, prevents anyone except the sender and the receiver from monitoring the communication in any way, thereby making all communication opaque to any automated or manual monitoring for CSAM. Thus far, Facebook has primarily relied on automated monitoring of content on its social media platform and on Messenger to report and take down CSAM. Microsoft uses PhotoDNA, a technology it developed with Dartmouth College, to target CSAM. Through this technology, Microsoft creates a “hash”, a unique digital signature, of an image and then compares it against the hashes of other photos to find copies and take them down and/or report them.
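As a rough illustration of hash-matching, the sketch below uses an ordinary cryptographic hash in Python. This is not PhotoDNA itself, which is a proprietary perceptual hash designed to survive resizing and minor edits; a cryptographic hash like SHA-256 only matches byte-identical copies. The hash list and image bytes here are made-up placeholders.

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery
# (placeholder values for illustration only).
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Compute a digital signature for an image's raw bytes.

    PhotoDNA would instead compute a perceptual hash that tolerates
    re-encoding and small edits; SHA-256 is a simplified stand-in.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_copy(image_bytes: bytes) -> bool:
    """Flag an image whose hash matches one in the known set."""
    return image_hash(image_bytes) in KNOWN_HASHES

# An exact copy matches; any modified image does not.
print(is_known_copy(b"example-flagged-image-bytes"))   # True
print(is_known_copy(b"slightly-edited-image-bytes"))   # False
```

The design point this illustrates is why platforms keep hash databases rather than the images themselves: a match can be detected without ever redistributing the underlying material, but only for content that has already been identified and hashed.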
The issue of tracing originators and distributors of CSAM on end-to-end encrypted messaging platforms has come up in the WhatsApp traceability case in India as well. There, Senior Advocate Kapil Sibal, appearing for WhatsApp, had argued that even on a matter as serious as child pornography, WhatsApp’s hands were tied by end-to-end encryption. However, WhatsApp does use PhotoDNA to scan the unencrypted profile photos of its users to find and take down accounts of people and groups engaged in sharing CSAM. The ad hoc Rajya Sabha committee, led by Jairam Ramesh, had recommended that law enforcement agencies be allowed to break end-to-end encryption to trace distributors of child pornography. The committee had been constituted to identify ways to prevent sexual abuse of children and to prohibit access to and circulation of child pornography on social media.