Social media platforms must take all effective measures to ensure that child pornography is not hosted on them, the Delhi High Court recently held, and doing so is a necessary condition for receiving safe harbour protections under India’s intermediary liability rules. The order came after a woman filed a plea seeking the removal of her intimate pictures from platforms such as Instagram, YouTube, and Telegram, claiming they were taken when she was a minor and were leaked online by her former partner after the two split up. LiveLaw first reported the order.

A single bench of Justice Vibhu Bakhru also directed the concerned police agencies to share the offending material relating to the petitioner with the National Crime Records Bureau (NCRB). The NCRB has a memorandum of understanding (MoU) with the National Centre for Missing and Exploited Children (NCMEC), a US-based non-profit, under which it receives information about online child porn content. “The NCRB shall also use the protocols available in terms of the Memorandum of Understanding entered into with NCMEC or otherwise to notify the offending material in order that the same can be actioned and removed from other platforms as well,” the court held. As per the MoU, the NCRB cannot share these reports with anyone apart from government law enforcement agencies.

How the case started

The petitioner became close friends with a person in 2012, when she was sixteen years old, and shared intimate pictures of herself with him after the person (the accused) blackmailed her by threatening to commit suicide. The relationship, the petitioner claimed, was abusive, and she decided to part ways with him. She then went to the UK for further studies in 2014.

However, she said, the accused visited her residence in the UK and physically assaulted her. Following this, she filed a police complaint, and in 2017 the accused pleaded guilty before a UK court, which also passed an order restraining him from contacting the petitioner by any means, including electronic means, until January 2019. Per the Delhi High Court order, the accused moved to India in 2017.

In 2019, the petitioner moved to Australia for higher studies, and in October-November that year, she found that the accused had posted her intimate pictures on several social media platforms, including Twitter, Instagram, and YouTube. These, she claimed, were the same pictures she had shared with the accused when she was a minor. This prompted the woman to file a complaint with the Cyber Crime Department of the Delhi Police. Her petition in the Delhi High Court was listed in July 2020.

The petitioner also sent notices to Facebook (Instagram), Google (YouTube), and Telegram asking them to immediately remove certain webpages containing her intimate pictures. However, the platforms did not remove the content, she claimed. The content was taken down from Instagram and YouTube in July 2020 after the Delhi High Court ordered the platforms to do so, since “there was no dispute that the webpages contained objectionable photographs of a minor girl”. It is not clear from the court’s order whether the content flagged on Telegram was also removed.

However, even after the flagged links were taken down, the same content was re-uploaded to Instagram, YouTube, and other platforms, the court said, suggesting that the offending images had been widely distributed and were being uploaded by several persons other than the accused. The court held:

“This brought into the sharp focus the problem of preventing circulation of identified objectionable material on the platforms operated on the net… The police authorities shall also use the protocols and resources available with NCRB and or other concerned agencies to identify the persons who are re-uploading the offensive content in India and take such actions as warranted, in accordance with law.” — Delhi High Court

Facebook, Google’s responses

In their respective responses to the court, Facebook and Google both detailed the measures they had taken to combat the spread of child sexual abuse material (CSAM) on their platforms:

  • Facebook said it works with NCMEC, a non-profit organisation that, among other things, helps find missing children and reduce child sexual exploitation. NCMEC runs the CyberTipline, an online mechanism for receiving reports of suspected child porn content on the internet. Facebook claimed that once it identifies such content, it immediately removes it; the contents of the relevant account are preserved for ninety days, and the facts and circumstances associated with it are reported to NCMEC. Apart from this, Facebook also highlighted the following technical measures it has adopted to tackle such content:
    • An option for anyone to report CSAM
    • Use of Microsoft’s “PhotoDNA” technology to identify any known or apparent CSAM image (a simplified sketch of this kind of hash matching follows this list)
    • Active identification of keywords related to such content.
  • Google, on the other hand, claimed that its community standards and policies prohibit users from uploading any content that endangers the emotional and physical wellbeing of minors. On YouTube, where hundreds of hours of new content are uploaded every minute, Google deploys machine learning to detect, review, and remove content that violates its community guidelines. Google also said it has a dedicated web form that government agencies can use to report potentially unlawful content, including CSAM, which is then expedited for review by a relevant support team. In addition, Google uses “video hashing” technology to prevent re-uploads of identical copies of video content that has previously been removed.
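
Both techniques reduce to comparing a fingerprint of each new upload against a database of fingerprints of previously identified material, so known content can be blocked before it is ever hosted. The sketch below illustrates only the general shape of such a pipeline: PhotoDNA and YouTube’s video hashing are proprietary perceptual-hashing systems that tolerate resizing and re-encoding, whereas this stand-in uses a plain SHA-256 digest and therefore catches only exact byte-for-byte copies. The hash database and function names here are hypothetical.

```python
# Minimal sketch of hash-based screening of uploads against a database of
# previously identified offending content. Real systems (e.g. Microsoft's
# PhotoDNA) use perceptual hashes that survive resizing and re-encoding;
# SHA-256 is used here only as a stand-in for exact copies.
import hashlib

# Hypothetical database of hashes of known offending files, e.g. populated
# from industry hash-sharing programmes (illustrative value only).
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", standing in for a flagged file
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest identifying the uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes) -> bool:
    """True if the upload matches known offending content and should be
    blocked, preserved as evidence, and reported rather than hosted."""
    return fingerprint(data) in KNOWN_HASHES

# An upload pipeline would call this before publishing anything.
if screen_upload(b"test"):
    print("Match: block upload, preserve the account's contents, file a report.")
else:
    print("No match: hand off to normal moderation (ML classifiers, user reports).")
```

Video hashing works analogously: fingerprints are computed over frames or segments of a removed video, so identical re-uploads can be caught at upload time rather than only after another round of user reports.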

The menace of child porn online

With 1,987,430 reports, India accounted for 11.7% of all suspected child sexual exploitation reports received by NCMEC’s CyberTipline in 2019. It was followed by Pakistan with 1,158,390 reports (6.8%), Iraq with 1,026,809 (6.04%), Indonesia with 840,221 (4.95%), and Mexico with 827,998 (4.87%). Of all the reports the CyberTipline received in 2019, Facebook accounted for the lion’s share: 94.34%.
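
As a rough consistency check, and assuming each percentage is that country’s share of the CyberTipline’s global total, India’s figure implies roughly 17 million reports worldwide in 2019, and the other countries’ counts reproduce their quoted shares. A quick sketch of that arithmetic (variable names are ours):

```python
# Back-of-the-envelope check on the 2019 CyberTipline figures cited above.
india_reports = 1_987_430
india_share = 0.117  # 11.7%

global_total = india_reports / india_share
print(f"Implied global total: {global_total:,.0f} reports")  # ~16,986,581

# The remaining countries' shares should fall out of the same total.
for country, reports in [("Pakistan", 1_158_390), ("Iraq", 1_026_809),
                         ("Indonesia", 840_221), ("Mexico", 827_998)]:
    print(f"{country}: {reports / global_total:.2%}")  # 6.82%, 6.04%, 4.95%, 4.87%
```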

Owing to the abundance of such content on the internet, including on encrypted platforms like WhatsApp, governments have recommended breaking encryption to stop its spread:

  • A Parliamentary panel in India recommended that law enforcement agencies be permitted to break end-to-end encryption to trace abusers. The panel also recommended mandating that all social media companies deploy minimum essential technologies to detect CSAM and report it to law enforcement agencies.
  • 129 signatories, including non-profit organisations, think tanks, and individuals, have also urged Facebook not to go ahead with its plan to introduce end-to-end encryption across its messaging platforms and to subsequently integrate them.

In India, the Protection of Children from Sexual Offences (POCSO) Act also lays down rules against hosting child porn content, requiring social media platforms to report such content, or any information about its storage and dissemination, to the Special Juvenile Police Unit or the local police.
