There was some indication from S Gopalakrishnan, Joint Secretary, MEITY, about how the ministry is looking at its consultation on Intermediary Liability protections. Gopalakrishnan was speaking at the Future of Tech Policy in India event organised by SFLC.in. Key takeaways from his comments:

  • Many problems need many solutions: “In the Intermediary Liability guidelines, we’ve got a lot of feedback, the focus has been multi-fold. That creates a bit of confusion in approach. Now we’re trying to solve the right problem with the right solution. Social media and elections need a particular response. Fake news on social media needs another response. Traceability needs a different response.”
  • Regulation likely to be based on size: “The types of players are diverse and responses are diverse. The other is the practical issue of going about implementing them. The idea is that you don’t need to bother with [regulating] every messaging service, only those of a particular size.”
  • What MEITY is not looking to do: “There has been a clear understanding of different areas. There is a lot of misunderstanding as well. There is no attempt to get into censorship. The intermediaries are also not keen to become censors.” Traceability is needed but “not at the cost of encryption or privacy.”
  • Some platforms can be difficult to deal with: “If there’s any illegal content, there’s a requirement to understand the problem and provide a solution. Our approach is to ask the intermediaries to give us a solution. Some of the big intermediaries refuse to see this side of the problem. We say that sometimes something like this happens, and ask how they can help us solve the problem; some are appreciative, and some are absolutely indifferent.”
  • Anonymity cannot be guaranteed: “Misuse of the platform has a huge social cost, a huge political cost”…“There are people who misuse stuff, and there needs to be more accountability. If you can hide behind anonymity, and it is guaranteed that you will never be traced, that’s a sure recipe for disaster.”

Some other comments at the discussion:

ACP Siddharth Jain, Delhi Police:

  • Intermediary liability if intermediaries can’t give access to information: “The first principle is the rule of law. Freedom of speech is important, and encryption, end-to-end encryption, is important. Sometimes that leads to organised crime and the spread of hate speech. At that time, the intermediary should become liable if they can’t give access. We can’t reach those persons through call data analysis.”
  • On metadata: “A dedicated unit has been set up in Dwarka by Delhi Police for metadata and forensic analysis. We are building the capacity of our people. Since technology is improving at a fast pace, we have a battle. [Intermediaries] are not giving us metadata.”

S Chandrasekhar, Microsoft:

  • Need nuance in terms of level of control: “When the law was made, the intermediary was supposed to be a dumb platform. You have TSPs, ISPs, broadband, hotspots etc. These are purely dumb. Just because you’ve provided a platform, we need to have nuance in terms of what that intermediary is doing, and what level of control the provider of the platform exercises. If I’m providing IoT, or a cloud platform, I’ll be an intermediary. How it will be used is beyond my capability. The second part is a not-so-dumb platform, where you’re curating the content. If it becomes an intelligent platform where the provider of the platform has some control, we need nuances in terms of what it’s doing and the level of control being exercised. If you have a simplistic approach, it’s like you’re trying to catch whales, but you’ve made the net so fine that even the prawns get caught”…“Then there’s the issue of social media platforms, which is what the government is trying to solve. There are two ways of going about it: one is the speed of response. The second is content moderation. In the Prajwala case, the problem of removing non-consensual adult material was discussed. We tried to see if it’s possible to have automated tools. Current levels of technology do not allow using AI to remove that. It’s quite a complex issue. My recommendation is that the law has to be nuanced, and we have to make changes to control different types of harms. On other levels of accountability, we need more nuance in terms of the size of the intermediary. We get it that the government has a problem of knowing which throat to choke (in terms of identifying whom to reach out to, to act), so to speak.”
  • On content moderation, no questions about two issues: “Globally we have arrived at two things which are decidedly evil. One is child porn. The second is terrorism. These are the only two things where no one asks any questions.”
  • On other issues, context matters: “A swastik sign is natural in India, but in the EU or America people will take great offence to it”…“Cannabis is legal somewhere, and it’s punishable by death elsewhere. We have a system of country-specific content. For example, in India we have an issue of pre-natal sex determination. We had made some India-specific changes.”
  • Technical challenges: “Then comes the issue of technical moderation. To give you an idea of the extent of the problem, on YouTube 300 hrs [of video] is uploaded every minute. The current level of technology [for moderating that] is not adequate.”
  • Process matters: “We push back on, for example, removal of cartoons. Just like in telecom, there’s a system of requests going through a process. I shouldn’t get calls from Yavatmal or Pondicherry that some cartoon has been put up.”
  • Response time concerns: “We have a time of 72 hours for normal cases, but for cases of terrorism and heinous crimes, it is a matter of hours. Since overseas they don’t work on Saturday and Sunday, we should keep it at 72. If we make a structure, we can make a good beginning.”

Amrita Choudhury, ISOC Delhi and CCAOI:

  • No binaries, need more research: “No one has been able to figure out how to do this effectively. It cannot be addressed in a binary way. Managing this information crisis which we are facing today cannot be the responsibility of a single stakeholder. It has to be a collective responsibility. One thing which helps is if governments do more research before coming up with certain regulations. Intermediary is a very large term, and even cybercafes are intermediaries”…“Civil society needs more funding for research.”
  • New users more vulnerable: “New users are more vulnerable to fake news. Governments are trying to regulate companies, which is in conflict with freedom of speech”…“New users, or people who are new to the Internet, need to have more education on what the hygiene and the etiquette is. They need to fact-check before forwarding news.”
  • Companies need to be transparent about how their algorithms work.

Faiza Rahman, NIPFP:

  • Draft rules go beyond the IT Act: “To my mind, these draft rules are not the correct way to go about it: they overstep the mandate of the primary law, which is the IT Act. They put onerous obligations which have an impact on privacy and deviate from the IT Act. These changes shouldn’t be made through draft rules. They should be made through amendments to the IT Act.”
  • No differentiation based on size: “These guidelines don’t differentiate between types of intermediaries. The law should have a calibrated regulatory approach, where the approach is commensurate with the activities of the intermediaries.”
  • Intermediaries have evolved: “In order to understand whether certain intermediaries need to be regulated, we need to understand that their role has changed. Can we compare them to passive delivery boys? Some intermediaries know about every click. Has the level of knowledge vis-à-vis our content changed? The SC in Shreya Singhal said that intermediaries cannot be expected to apply their own mind. Does this issue of capacity constraint hold true for all intermediaries?”
  • Penalties in each case, or in case of organisational deficiency? “Germany has a law, which has valid criticisms, but it limits its application to those companies which have over 2 million German users. Secondly, to address over-blocking, instead of penalising the platform for each failure to take down content, they penalise organisational deficiency: not one instance of inability to act, but a pattern of them. We have to be cautious to ensure that any regulations have a proper legislative mandate.”

Editor’s note: An issue with the framing of the Intermediary Liability debate

During the audience Q&A, I pointed out to the participants that the IT Act views intermediaries as a function, not a type. On the Internet, platforms evolve, and there is a mix of both content publishing and enabling communications. For example, blogs are both content creators when they publish, and intermediaries when they enable comments. Platforms on the Internet are always shape-shifting, and the creation of categories will restrict the evolution of services on the Internet.