In the petition filed by WhatsApp against the Indian government over the IT Rules 2021, the messaging platform has detailed how it deals with child sexual abuse content without breaking end-to-end encryption.
The implementation of end-to-end encryption means that only the sender and recipient can decrypt and see the content of messages.
It’s worth noting that an ad-hoc committee of the Rajya Sabha had recommended that law enforcement agencies be permitted to break end-to-end encryption to trace abusers. Subsequently, the IT Rules 2021 have mandated that all significant social media intermediaries like WhatsApp enable the tracing of the originator of any information deemed to be child sexual abuse material.
But WhatsApp has long maintained that it has zero tolerance for child sexual abuse material being shared on its platform, raising the question of how WhatsApp deals with such material when it cannot access the contents of a message.
In the case of child sexual abuse content, WhatsApp relies on available unencrypted information, including user reports, profile photos, group photos, and group subjects and descriptions, to detect and prevent abuse.
- When a user reports a message they received as spam or illegal, the reported message is sent to WhatsApp in unencrypted form. This allows WhatsApp to see the contents of the message in question and determine whether it violates its terms of service.
- WhatsApp has also previously stated that it uses PhotoDNA, a photo-matching technology, to proactively scan profile photos for known images of child abuse.
- If WhatsApp detects any child sexual abuse image on its unencrypted surfaces, it removes the image and bans the user as well as associated accounts within a group.
- WhatsApp also shares the image, along with associated account details, with the National Center for Missing and Exploited Children (NCMEC). NCMEC, in turn, provides India’s National Crime Records Bureau with access to India-specific reports through a secure Virtual Private Network (VPN) connection.
- WhatsApp also provides a monthly report to India’s National Crime Records Bureau with the NCMEC report IDs pertaining to Indian users.
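Photo-matching systems like PhotoDNA work by comparing compact "fingerprints" of images against a database of hashes of known abusive material, so the images themselves never need to be stored or transmitted. PhotoDNA itself is proprietary; the sketch below instead uses a simple average hash over an 8x8 grayscale grid to illustrate the general idea of tolerant hash matching. All function names and the threshold value here are hypothetical, not WhatsApp's actual implementation.

```python
# Illustrative sketch only: PhotoDNA is proprietary, so this uses a basic
# "average hash" to show how an image can be matched against a database of
# known hashes without access to the original images.

def average_hash(pixels):
    """Hash an 8x8 grid of grayscale values (0-255) into a 64-bit int.

    Each bit is 1 if the corresponding pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches_known_hash(pixels, known_hashes, threshold=5):
    """Flag an image whose hash is within `threshold` bits of any known hash.

    The threshold makes the match robust to minor edits, resizing artifacts,
    or recompression (the value 5 is an arbitrary choice for illustration)."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

# Example: a slightly altered copy of a known image still matches.
original = [10 * i % 256 for i in range(64)]
altered = original[:]
altered[0] += 3  # minor pixel-level change
known = {average_hash(original)}
print(matches_known_hash(altered, known))  # prints True
```

Because only hashes are compared, the same approach lets a provider share report IDs and hash matches with bodies like NCMEC without retransmitting the underlying imagery.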
What do the IT Rules say?
With regards to child sexual abuse material, the IT Rules 2021, which went into effect on May 25, require all social media intermediaries to:
- Inform users not to share any information that is harmful to a child
- Remove child sexual abuse content within 36 hours of being issued a court order or being notified by an appropriate government authority
- Remove child sexual abuse content within 24 hours if reported by the victim or by an individual on behalf of the victim as part of the grievance redressal mechanism
Significant social media intermediaries like WhatsApp further have to:
- Enable the tracing of the originator of any information deemed to be child sexual abuse material
- Deploy automated tools to proactively identify child sexual abuse content and notify users attempting to access or share it
- Publish a monthly compliance report detailing the complaints received and the action taken on them