Why Does the Delhi HC Think Search Engines Are Responsible For Taking Down Non-Consensual Intimate Images?

There has been an uptick in non-consensual intimate imagery disputes over the last five to ten years, observed Vasudev Devadasan, co-author of a working paper on the dissemination of NCII in India.

India’s safe harbour tides are swiftly changing. The latest current: the Delhi High Court’s recent judgment holding search engines responsible for disabling access to illegal non-consensual intimate images (NCII) recurrently disseminated online. If they don’t expeditiously tend to these complaints, which threaten the victim’s rights to privacy and to be forgotten, these intermediaries cannot invoke safe harbour protections under India’s IT laws. 

“The aim of this exercise is to ensure that the victim of NCII dissemination does not have to undergo any further distress. Any solution that is provided must be deliberate and proportional, and should not be akin to a remedy that is worse than the disease. This Court cannot burn the house to roast the pig.” — Justice Subramonium Prasad’s April 26th judgment

Why it matters: Based on judicial proceedings and news reporting, there has definitely been an uptick in non-consensual intimate imagery disputes in the last five to ten years, observed Vasudev Devadasan, Project Officer at the Centre for Communication Governance at the National Law University Delhi, and co-author of a working paper on the dissemination of NCII in India.

“I think this is entirely natural—it’s understandable that you would see a rise in these offences given the fact that almost everybody today has a camera on them in the form of a smartphone,” Devadasan added. “In terms of who the [regulatory] buck stops with, that’s rarely attributable to a singular institution. They all have their own roles to play in situations like these where new social phenomena and harms are arising.”

What is non-consensual intimate imagery?: It’s all in the name—and refers to “sexual content that is distributed without the consent of those who are being depicted in the said content”. 

NCII can impact both children and adults. “The exact difference between child sexual abuse material [CSAM] and NCII lies in the domain of consent,” explained the Internet Freedom Foundation’s litigation team. “Any material that depicts child pornography would be termed as CSAM whether or not it has been procured with the consent of the minor involved as this consent is legally irrelevant. Viewed from this perspective, all CSAM is essentially NCII.”

MediaNama reached out to Microsoft and Google asking whether they planned to challenge the Court’s ruling, the organisational changes they’ll need to make to comply with it, and how they viewed the specific obligations the Court laid out for them under IT law. Microsoft declined to comment, while we’re still awaiting Google’s responses.

Back to basics: What is this case about?

In 2019, a married woman, Mrs X, became fatefully acquainted with Richesh Manav Singhal online. In 2020, Singhal came to her Gurugram home—where she lived with her son—and forced himself on her. He took explicit pictures of her and sent them from her phone to his own, sharing them with her husband. Singhal also allegedly abused the minor child. Mrs X filed a complaint against Singhal at Lajpat Nagar Police Station, after which a “Zero FIR” was registered and the investigation was transferred to Gurugram.

Singhal later blackmailed Mrs X, threatening to leak the photos online if he wasn’t paid huge sums—Mrs X eventually handed over lakhs of rupees and all her jewellery to him. Singhal leaked the photos anyway, posting them on pornographic sites and even creating a YouTube channel where explicit videos and pictures were uploaded daily. The initial police complaint was updated with these offences in August 2021.

Mrs X repeatedly approached the user grievance cells of Microsoft India (which runs the Bing search engine), YouTube, and Vimeo. She also placed multiple complaints on the Indian government’s cybercrime reporting portal. But, the images were not taken down. 

What happened then?: A case was filed at the Delhi High Court asking for the NCII of Mrs X to be taken down. In October 2021, YouTube-parent Google informed the Court that the offending material had been removed from the video-sharing platform and the URLs provided de-indexed from the Google search engine. Google added that this “did not mean that it could not be found on the internet through other search engines and that merely directing only the search engines to de-index the links would not be an adequate solution”.


In March 2022, it was revealed that Singhal and his accomplice, Shweta Chhabra, had been arrested. Over 83,000 explicit pictures were discovered on a laptop in their residence. Singhal was involved in multiple “other cases” as well. Mrs X’s initial case became “infructuous”, or pointless, with this revelation. But, the High Court persevered to ensure that victims don’t have to repeatedly appear before the authorities or platforms in cases like these.

What were the original rebuttals?: Mrs X could already approach the platforms under India’s platform regulation rules, IT Rules, 2021, to get the offending content removed within 24 hours. Also, “delinking/de-tagging/de-referencing/de-indexing” Mrs X’s name from search engines’ results could adversely impact the free speech rights of people with the same or similar names. Crucially, both Google and Microsoft argued that they simply index third-party content on their search engines and that they don’t have control over the information. Takedown orders could be issued to the websites publishing NCII instead.

The Court begged to differ: It criticised the “abysmal” coordination between platforms and the Centre when it came to actually taking down non-consensual intimate images. Instead of complaining about the “onerous” obligations being placed on them, they should focus on quickly redressing complaints like these, the Court ruled. As they continue “shirking” these responsibilities, the content stays up for longer, “enabling” perpetrators to continue posting it without facing much legal backlash, it further observed.

Why does the Delhi High Court think search engines have responsibilities in the case of NCII?

The Court was clear that search engines have legal obligations to remove NCII regardless of their platform functions. “It is unfathomable as to how a search engine can feign helplessness when it comes to removal of or disabling access to links which prima facie contain content that is illegal as declared by the Court,” Justice Subramonium’s order testily observed.

To that end, the Delhi High Court instructed platforms’ Grievance Officers—who are responsible for dealing with user complaints under the IT Rules, 2021—to liberally interpret NCII complaints, so that it includes “sexual content obtained without consent and in violation of an individual’s privacy as well as sexual content obtained and intended for a private and confidential relationships [sic]”. These complaints should also be resolved within 24 hours, as per the IT Rules, 2021—if they’re not, then the search engine cannot claim safe harbour protections under Section 79 of the IT Act.

What was the Court’s reasoning?: Search engines are obliged to respect users’ fundamental rights under the IT Rules, 2021, the Court noted. The most obvious right being violated here is privacy. The Delhi High Court was clear that people can expect privacy, even within “the confines of a domestic relationship, or even if any intimate image is shared with another person with the understanding and the expectation that the same will not be shared with third persons.” Also as important for the Court: that people have a right to be forgotten, which is part of the right to privacy. 

Back in Puttaswamy v Union of India, the Supreme Court observed that the right to privacy includes an individual’s right to control personal data and their existence on the Internet, the High Court added.

What’s more, keeping NCII content online doesn’t serve any tangible purpose, and is actually punishable under Section 66E of the IT Act, which penalises anyone who “intentionally or knowingly captures, publishes or transmits the image of a private area of any person without his or her consent, under circumstances violating the privacy of that person”. Private area means “naked or undergarment clad genitals, [pubic area], buttocks or female breast”.

Does this approach raise concerns?: Despite the IT Rules’ provision, it is not the responsibility of private companies to protect our fundamental rights, the Internet Freedom Foundation’s litigation team argued, marking a clear stance in the longstanding debate on whether rights can be applied in private transactions. “The fundamental rights guaranteed by the Constitution of India govern the relationship between an individual and the state,” they added.

But, Grievance Officers also need clarity on how to actually understand NCII as an offence. While Section 66E is very specific about the grounds of NCII, the corresponding provision in the IT Rules isn’t.

Rule 3(2)(b), IT Rules, 2021. | Source.

“Rule 3(2)(b) doesn’t talk about circumstances violating privacy or questions of consent, which are two key legal elements of this offence,” Devadasan flagged. “It isn’t a charging offence, it doesn’t define NCII or what unlawful content is. That difference should be at the forefront of this discussion—it’s possible that content that doesn’t fall under Section 66E could be captured [and access to it disabled] under the IT Rules. For example, content involving consensual nudity, like adult entertainment or pornography, could potentially be taken down under Rule 3(2)(b) because it doesn’t use the terms consent or violations of privacy. But, that content wouldn’t qualify as unlawful under Section 66E. Further, Rule 3(2)(b) doesn’t include any safeguards to verify that the complainant is the individual (or authorised by the individual) depicted in the video, and this is left to the discretion of the intermediary, which has only 24 hours to make a determination under the Rule.” Going forward, it would be ideal to coalesce around a definition for NCII [in the IT Rules] that’s similar to the one in Section 66E, Devadasan added.

However, determining ‘consent’ in some cases may require judgment beyond a Grievance Officer’s capabilities. “Suppose you have a video of a sexual assault involving a public figure,” Devadasan speculated. “The victim of the assault may not want the content removed, introducing the public interest value of the content. In some situations, a judicial order would be the best way to balance those interests and get the content taken down.”

Is removing this content technologically feasible?

The Court seemingly argued that hash-matching technology, otherwise used to remove child sexual abuse material, can be used to remove non-consensual intimate images too. Not helping the search engines’ case: that Meta had already developed a similar tool to do just that. Microsoft has also developed hash-matching technologies to detect child sexual abuse material online.

Among other concerns, Google had argued that “the factor of ‘consent,’ which is an essential ingredient for categorization of NCII cannot be detected by automated tools”. Microsoft admitted that it “does not currently possess any technology for automatically finding and deleting NCII, and can only remove the content globally upon receiving notice of its existence”.

A representative flow of how hash-matching is used by Apple to detect and report CSAM. | Source: Apple via TechCrunch.

Nevertheless, the Court subsequently instructed search engines to issue a unique token upon the initial takedown of NCII—if the same content resurfaces, then it’s the search engine’s responsibility to use its pre-existing technologies to ensure that access to the content is disabled, without the victim having to approach the authorities again. In short: search engines can’t require the victim to provide specific URLs for the flagged NCII content each time. 

The IT Ministry and search engines could also co-develop a “trusted third-party encrypted platform” for victims to register the NCII content or URL under, the Court further suggested. Platforms could assign cryptographic hashes—or identifiers—to the NCII, and then automatically identify and remove the flagged content. This reduces the victim’s burden of “scouring” the Internet for NCII. The platform should be “subject to [the] greatest of transparency and accountability standards,” given the sensitive data involved.
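To make this concrete: below is a minimal sketch of such a registry, assuming—and this is our reading, not the Court’s specification—that “cryptographic hashes” means an exact-match digest like SHA-256, and that the registry is what issues the unique token the Court describes. Every name here is hypothetical.

```python
# Hypothetical sketch of the Court's suggested registry: the victim registers
# flagged content once and receives a token; a platform can later check whether
# resurfaced content matches a registered item. Assumes exact SHA-256 matching,
# which only catches byte-identical copies (see caveat below).
import hashlib
import secrets

class NCIIRegistry:
    def __init__(self) -> None:
        self._hashes: dict[str, str] = {}  # token -> SHA-256 digest

    def register(self, content: bytes) -> str:
        """Record flagged content at first takedown and issue a unique token."""
        token = secrets.token_urlsafe(16)
        self._hashes[token] = hashlib.sha256(content).hexdigest()
        return token

    def is_known(self, content: bytes) -> bool:
        """Check whether newly surfaced content matches any registered item."""
        return hashlib.sha256(content).hexdigest() in self._hashes.values()
```

One caveat: an exact digest misses any re-encoded, resized, or cropped copy, which is why deployed hash-matching systems (such as Microsoft’s PhotoDNA for CSAM, mentioned above) rely on perceptual hashes that tolerate such alterations.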

Why did the Court say this?: Because it’s “unconscionable” to expect victims to identify and approach the search engine or courts each time a flagged piece of NCII content reappears on the Internet. Expecting them to do that goes against the IT Rules’ vision of removing illegal content in a time-bound manner, the Court added. 

Why search engines may be better placed to tackle NCII: Interestingly, the Internet Freedom Foundation’s litigation team noted that search engines do not consider themselves intermediaries, nor are they treated as such under Indian law. “Nevertheless, search engines have the potential to reduce the visibility of NCII, and this was taken into consideration by the Delhi High Court in its decision,” the team added.

This partly has to do with the fact that search engines are gateways to information on the Internet—it’s virtually impossible to find something that’s been de-indexed from a search engine unless you have the specific URL of the content in question, the Court observed. So, blocking access here can stem the flow of NCII. Also, search engines may be logistically better positioned to tackle this content, Devadasan noted. 

“NCII is often hosted on independent websites that some Courts call ‘rogue websites’,” Devadasan explained. “These websites don’t have Grievance Officers, and are unresponsive to legal notices, complaints, or sometimes even court orders. In that case, there are two options—asking the Internet Service Provider (ISP) to block the URL, or asking the search engine to de-index the content. The problem with ISPs is that they have almost no control over the content on their networks. Search engines are useful in this situation because they are able to at least verify what’s on a page before de-indexing it. They also have trust and safety teams that are doing this work [of de-indexing pages] for other classes of unlawful content.”

Do the Court’s observations lead to “proactive monitoring” of the Internet?

Google argued that automated technologies cannot be used to proactively monitor third-party websites for NCII the way they can for CSAM. “They have extreme technological limitations and adverse repercussions, especially on the exercise of free speech. As they cannot effectively distinguish between content and they operate in an ‘all or nothing’ framework, there exists the possibility that they may end up jeopardizing and removing legitimate and genuine content,” the tech giant argued.

The Internet Freedom Foundation explains why this might be. “Unlike cases involving CSAM, proactive monitoring is not encouraged when it comes to NCII, because this would have the ability to take down content that may be explicit in nature but is hosted consensually,” the litigation team noted. “AI technology cannot decipher whether or not the content that has been published has been done so with an individual’s consent.” 

Microsoft, citing past cases, spelt out the consequences: proactive monitoring may impede free speech and lead to “privatized censorship”. Even the IT Ministry offered a similar argument in court—last year, stakeholders told the Ministry as much when raising concerns on its amendments to the IT Rules.

The Delhi High Court acknowledged these concerns while observing that search engines have obligations to disable access to all copies of non-consensual intimate images that are determined to be illegal. The Court argued that the filtering it was asking platforms to do wasn’t actually ‘proactive’. 

Once it has been reported by the user/victim or a Court order or an order of the appropriate Government has been rendered, then the search engine cannot contend that any filtering of the content that is done subsequent to the reporting or the Order is proactive in nature; it can only be termed as being in pursuance to the reporting of existence of such content specific to an individual or a judicial Order [emphasis added].

How do we read this verdict?: “It could be construed that the search engine is obliged to remove content without the complainant specifically identifying where the content has been reuploaded,” Devadasan observed. “If you read the judgement like that, it does raise proactive monitoring concerns. An equally plausible interpretation is that the complainant can use the digital identifier they were first issued [as per the Court’s order above] and notify the search engine. At that point, the search engine can do a quick hash match, determine if the content is identical to what was previously de-indexed, and de-index it. That’s not a proactive monitoring obligation.”
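For illustration only, here is a toy version of that second reading, assuming the search engine stored a perceptual hash of the de-indexed content against the victim’s token at the first takedown. The difference hash below is a simplistic stand-in for production perceptual hashes, and every name is hypothetical.

```python
# Toy re-check flow: on a victim's notification, compare the reported file
# against the hash recorded under their token, and flag a close match for
# de-indexing. Uses a simple 64-bit "difference hash" built with Pillow.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Perceptual hash: compare adjacent pixels of a downscaled grayscale image."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            bits = (bits << 1) | (px[row * (size + 1) + col] > px[row * (size + 1) + col + 1])
    return bits

# token -> hash recorded at the first takedown (hypothetical store)
token_registry: dict[str, int] = {}

def matches_earlier_takedown(token: str, reported_path: str, threshold: int = 5) -> bool:
    """True if the reported file is a near-duplicate of what the token covers."""
    recorded = token_registry.get(token)
    if recorded is None:
        return False
    distance = bin(recorded ^ dhash(reported_path)).count("1")  # Hamming distance
    return distance <= threshold
```

Because the complainant supplies both the token and the specific resurfaced content, the check is triggered by a report rather than by scanning the web—which is what distinguishes this reading from a proactive monitoring obligation.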

The litigation team at the Internet Freedom Foundation concurred with the second interpretation. “What the Delhi High Court has directed is for search engines to ensure that once NCII content is reported, it must not be allowed to resurface,” the team noted. “The responsibility has been placed on the search engine after the first takedown order is issued only to prevent recurrence, not occurrence.”

What else did the Court say?

  • Courts to receive sealed covers of NCII: A petitioner requesting courts to issue takedown orders for non-consensual intimate images must also file a sealed cover affidavit containing the allegedly offending audio, visual images, keywords, and URLs. These will be used to determine “ex facie” illegality—that is, whether the content is illegal on the face of it.
  • Status tracker required: The Indian government’s “Online Cybercrime Reporting Portal” should have a status tracker monitoring the complaint from filing to removal of the content. It should also mention the redressal mechanisms available to victims in all languages listed in the Eighth Schedule of the Indian Constitution. The portal and every Delhi Police website should display the contact details of every District Cyber Police Station in the national capital territory. 
  • Immediate investigations required: Given that NCII is a punishable offence under the IT Act, when a complaint is brought to the Delhi Police, it should immediately register a formal complaint, initiate an investigation, and “bring the perpetrators to book”. 
  • Specific NCII cyber police officers needed: Each District Cyber Police Station should assign an officer to liaise with platforms in these cases. Together, they should try to ensure that the complaint is resolved within the time frames set out under the IT Rules, 2021. Platforms are supposed to cooperate and respond “unconditionally as well as expeditiously” with the Delhi Police. 
  • NCII reporting hotline should be set up: A fully-functioning helpline—with sensitised operators—should also be set up for victims to report NCII. The operators should also have a database of registered counsellors that distressed victims can reach out to. 

This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
