A petition has been filed in the Kerala High Court seeking a ban on the messaging app Telegram, stating that it is used to promote child pornography and terrorism and that it poses problems for law enforcement, reports Bar & Bench. Filed by Athena Solomon K, a student at the National Law School of India University (NLSIU) Bengaluru, the petition claims that Telegram is widely used to transmit sexually explicit and vulgar content involving women and children. It also claims that the app is used by pedophile groups to circumvent India's ban on child pornography.

The petition points out that the app has been widely criticized by governments across the world due to the “secretive nature of messages” on its platform. It cites Russia’s ban on the app to prevent crime and terrorism, and Indonesia’s ban to prevent radical and terrorist propaganda, per LiveLaw.

“Telegram is a different world for its users with no government control,” declares the petition, adding that:

  • It doesn’t have an office, or even a nodal officer, in India.
  • Law enforcement agencies cannot make sense of the Telegram data they receive from ISPs because they do not have Telegram’s encryption keys.
  • The use of bots keeps users’ identities hidden, making it difficult to trace who is promoting child pornography and terrorism.

The petition was filed on October 1, and the Kerala High Court sought the Centre’s views on October 4. The case will now be heard after three weeks. The petitioner has named the Central Government, the Department of Telecommunications (DoT), the Telecom Regulatory Authority of India (TRAI), and the Kerala Police as respondents in the case, per The Indian Express.

In June, Telegram refused to hand over chat details of the ISIS module Ansar-ul-Khilafah Kerala despite several queries from the National Investigation Agency. In 2016, RTI activist and web developer Sudhir Yadav had approached the Supreme Court seeking a ban on end-to-end encrypted messaging apps like WhatsApp and Telegram, which do not give the government a way to access messages, citing national security concerns.

What existing laws say

The current Information Technology (Intermediaries Guidelines) Rules, 2011 provide that platforms must inform users not to publish or share any content that is obscene, pornographic, paedophilic, or harms minors in any way. Platforms have to remove such infringing content within 36 hours of receiving an order from a court or other authorized agency.

An amendment to these rules, the Intermediaries Guidelines (Amendment) Rules, 2018, is in the works but has not been notified yet. The draft rules increase platforms’ liability and call for traceability and proactive monitoring of content. They also give platforms a shorter window of 24 hours to remove illegal content.