Much of the discussion at the round table on intermediary liability protections, held yesterday in Delhi by the Internet and Mobile Association of India under the Chatham House Rule, was not about the issue of traceability. Instead, for much of the session, participants debated other clauses of the proposed amendments to the IT Rules.

Since the discussions were under Chatham House Rule (no attribution, but the comments can be referenced), below are my notes from the discussion, without attribution. In some instances, potentially identifying information has been redacted. I’m waiving the Chatham House Rule for my comments.

On classification of intermediaries and sectoral regulation

  • “Classification of intermediaries is a good path to take, as all intermediaries are not the same. A cyber cafe is not the same as an ISP, which is not the same as a social media platform, an IM service or an online gaming platform. A company doesn’t become an intermediary by declaring itself one, but by what it does. If Twitter is publishing an update through its official handle on the Twitter platform, then it is a publisher; if others are using it, it is an intermediary. The function and the usage define whether it is an intermediary. There will be different models for intermediaries. That’s the way to go forward. We need a proactive stance on rule-making.”
  • “News and misinformation are spread by specific types of intermediaries and should be regulated as a separate category. There are also claims of copyright infringement. These are not challenges that a transport aggregator faces.”
  • Different regulators are regulating classes of intermediaries differently:
    • “Regional and sectoral regulations are diluting safe harbor, under RERA, consumer protection, tourism.”
    • “There are recent draft rules for food tech platforms, where the partners are not going to be licensed but you, as network providers, are liable. You have to get compliances for them, and there is liability.”
    • “How safe harbor is being affected in ecommerce is that there is a need for contracts with merchants. What is troubling about the language in these rules is that it is too loose.”
    • “Organisations which exist in certain domains give certainty to business models. They’ve accepted the jurisdiction of these [sectoral] regulations. The problem that creates is that every other regulator wants to get into this. That’s not a path we can undo now, except if you go to court.”

These rules are about control

“The problem we’re facing is not just with respect to copyright. We’ve got content from all over the public space, and we run a [redacted] platform. This takes care of the asymmetry of information that has plagued the [redacted] sector that we are in for a long time. The fact of the matter is that even though someone is not supposed to advertise if they don’t have permission under this law, I as a comparison site am free to put up content, in terms of what is available. Diluting the safe harbor will have the effect of ensuring that there will be excessive control on platforms like mine, and it will be replicated across the Internet. These rules are about control.”

My comments:

  • The Internet is full of shape-shifting businesses, because there isn’t product-market fit when someone launches a business. An example is iBibo, which began as a blogging platform, became a social network, then a music streaming service, a gaming site, and eventually an online travel agency. If you classify intermediaries by a particular type, then you will restrict someone’s ability to innovate. (Addendum: Section 79 has always been about functioning as an intermediary. If you leave a comment on MediaNama, we are an intermediary. If we post something, we are a publisher. If we sell reports, we’re an ecommerce site. Classification of intermediaries will have a negative impact on our ability to evolve.)
  • The 50 lakh user limit is about trying to control Google, Facebook etc. They want to control large players, not realising that that will consolidate their position. If not a number, then they’re looking to create categories.

My general comments

  • Safe harbor is critical to the future of the Internet: it is what allows scale, and for network effects to enable value for users and participants in the Internet.
  • The amendments to the IT Rules seek to address five concerns: security, traceability, misinformation, competition and taxation. So there’s a khichdi (a mixed-up jumble) of problems, what we’re getting with the changes in the rules is a khichdi of solutions, and there will be unintended consequences.
  • This amendment to the rules began with trying to address misinformation and that’s the only problem that the amendments should address. The industry had a code for it when it came to elections: why can’t the same be used in this case, instead of amending the rules?

On copyright violation and intermediary liability

  • “From the point of being better prepared, where there was piracy, social media companies were blamed for allowing piracy to continue. There was a pact that they signed, based on a trust-based framework, to use tools to bring down [instances of] copyright infringement. That didn’t need to be a John Doe order. Disney led that. In India, every time we raise this issue, they say they’re trying, but it comes to peak dissonance between people who generate content and those who allow it to be infringed. What harm is it if we apply a trust-based framework? We have to use the rule of law to go forward.”
  • “When platforms are arbitrating copyright claims, when it’s identified and given to you, you have to take it down. Does the framework incentivise people to infringe copyright?”
  • “These rules aren’t the correct forum for looking at copyright issues. It should be dealt with under the Copyright Act. Bringing it here is ultra vires. Let’s have a separate discussion on copyright, under the Copyright Act.”
  • “The copyright problem is a big problem. The court in the MySpace [vs TSeries] case has differentiated from Shreya Singhal. The Copyright Act only talks about transient and incidental storage. We need a change in the Copyright Act to marry it with the IT Act. Second, how big content files lawsuits is evidenced in the MySpace case. We’re seeing it also in T-Series vs PewDiePie. Big content brings large volumes of website blocking. We need radical change in the copyright law itself. How do we deal with fair use as a statutory right?”
  • “Does the definition of actual knowledge (where there has to be a court order), allow platforms to be brazen about copyright violation? What are the other solutions? Are there frameworks and other mechanisms?”
  • “We can choose to blame each other, or that we haven’t been able to evolve a trust based framework. Outside India there have been pacts. Somewhere we have to meet midway and have a trust based framework.”
  • “Copyright is in the rules. The govt is saying that put it in your standards. Do you want something on your platform that is harmful? I’m sure not.”

My comments:

  • IT Rules and due diligence requirements are not the forum for addressing copyright violation concerns. If platforms have been responsible for misusing safe harbor and allowing infringement for growth – and I agree they have – content firms have also been responsible for censorship, especially using John Doe orders. During the FIFA World Cup in 2014, we found that even Google’s short URL service and Google Docs were blocked, because those URLs were given to the court by STAR India as part of a list of sites to block. Content providers also have to be more responsible and not censor.
  • The Internet is more than just the content ecosystem. If you restrict user generated content, or if platforms end up doing it because they don’t want to take on the liability, it will become like cable TV.

On takedowns, due diligence and proactive takedowns

  • “What is the intent of all this? Point 9 is about using tech-based tools to [proactively] take down content for various reasons.”
  • “Do intermediaries have an obligation to partner with the government on national security? Is this the way to do it?”
  • “Proactive monitoring is a concern. AI is overhyped: you will get false positives and false negatives.”
  • “The govt has made it clear that proactive monitoring is off the table.”
  • “There’s no problem in adding in our terms of use that they shouldn’t post something against the law. When brought to our notice we should take it down.”
  • “The Shreya Singhal judgment said that only the part in 19(2) should be there and the rest should be struck down. So the new additions shouldn’t be there.”

My comments:

  • Most people forget that the reason we went to court on Section 79 was that there were excessive takedowns and censorship, because anyone could send a notice and platforms were erring on the side of caution. We wouldn’t want to go back there with proactive takedowns.
  • The government is probably not going to include proactive takedowns, and they are ultra vires Section 79 as well, which is just about due diligence.

On informing users about terms and conditions

  • “Under due diligence, we are supposed to keep our customers informed of the do’s and don’ts. Whatever the timeline, it has to be there. It’s an existing practice. We inform them (only) when there is a change.”
  • “Annual circulation (of terms and conditions) should be fine. Monthly will be problematic. It won’t be good practice to keep emailing users.”
  • “Even the privacy bill talks about consent fatigue.”
  • “Should these rules define the form of informing users, or should platforms evaluate the best way to do it? Do we expect our rule making to be so precise? Should there be specificity?”
  • “It’s in the IT Act that whenever there is a change in terms and conditions, we should inform.”
  • “It should not be prescriptive. We can say that we should encourage intermediaries to inform more often.”
  • I’m registered in India, my users are global. How do I comply with these guidelines? Do we classify intermediaries based on national boundaries?
  • In terms of informing your users every month about following the rules, we don’t want to spam our users. Is there a mechanism that government expects us to follow?

Speaker 1: “The user itself is not defined. Is it a 10-year-old user, a 5-year-old user, someone coming online on the site, or a mobile user? Does a mobile user have to be sent an SMS every month? It’s not a transactional SMS, so TRAI won’t allow it. Is anyone viewing your website a user? Is it an active user or a dormant user? If someone used my site 10 years ago, will I send them emails every month for 50 years?”
Speaker 2: The user is already defined in the document. The user is the person using your platform.

On the 72 hour timeline for sharing information with government agencies

  • “Is a uniform timeline set out here for responses to issues (including national security)? Is it sufficient? What happens when there is a conflict with other regulations? For example, electoral offences. Is a uniform threshold workable? Do we need a longer leash? Do we need different gradations of matters?”
  • “The DMCA has compliances. It has a counter-notice mechanism, and that is essential. Intermediaries will keep getting hit by the lack-of-transparency argument from users. In terms of the number of hours, the capacity of intermediaries needs to be taken into consideration. Are certain compliances necessary for platforms? There shouldn’t be over-compliance by intermediaries. It may lead to over-compliance: a police order without the specifics of a criminal investigation, the level of compliance not being 100%, and submissions being made in court. Have a counter-notice mechanism within the notice mechanism, so that liability shifts to the user. Slug this out in court and remain intermediaries.”

“With respect to the question of timelines, we need to be cognizant of the fact that the amount of time you take to respond is now becoming a criterion for safe harbor. This doesn’t exist anywhere in the world. How legal and valid is this? On the international front, it is impossible under the MLAT system: the average timelines run into months. That entire platform, across all its services, wouldn’t enjoy safe harbor, and that is problematic. Regardless of the gradations or the kinds of timelines, the fundamental question is: should such a law govern when and where data can be shared? A vast number of intermediaries will be in violation.”

  • “We have to define who can send notices. Currently the language says any government agency, instead of a lawfully authorised agency. Even then, that would be for emergency requests with a proper procedure, for terrorist content. One comment from foreign intermediaries was that it’s impossible to do in 72 hours. If you can do it in Europe, then I don’t see why it can’t happen in India.”
  • “The presumption in India is to look at a small niche case, but to draft it in a way that it applies in all circumstances.”
  • “One question is: most of the information that we’re asking for, we can ask under section 69. Those have specific procedures. There are no safeguards here. Under 69 there are exceptions and safeguards.”

“Generally you get requests from the police under the CrPC, and there’s no way you can avoid giving details. There’s no reason why it can’t be 24 hours, forget 72 hours. You just have to give an email address and IP address. You are here, you’re providing a service in this country, and why should the Indian govt follow laws of a foreign country? If there’s a kidnapping in India, you’ll have to ask a foreign govt for permission?”

  • “In some instances, people ask for information in 2-3 hours. TSPs and ISPs have been doing this for ages. Why can’t platforms?”
  • “Can we specify the circumstances? There needs to be a defined mechanism. There’s no reason why it shouldn’t be done.”
  • “If you’re in a cab and you have a problem with the driver and he escapes, you need to give the police the info immediately. Even if the cab company is based in California.”
  • “I have cops in my office within hours when it comes to saving someone’s life. It’s a matter of life and death.”
  • “Gradation [of timeframes based on the situation] is good. It might not happen, but 24 hours is good enough.”

Speaker 1: In China, with respect to data of Chinese user, you can’t say no. If they ask for it, platforms give it. Why not in India?
Speaker 2: The reason you won’t give it is because there are International Human Rights Laws.
Speaker 1: Is a platform supposed to become a human rights organization?
Speaker 3: A platform shouldn’t be a censorship organization either. It’s fair to say that if evidence is sought, it’s under local legislation. The broader discussion here is whether the requirement meets a good-law test. Does it have the ability to abrogate general international human rights? You don’t make laws based on the worst-case scenario. We need to come to a consensus on the fringe situations where you absolutely have to share data, under fringe guidelines. We need to make sure that data is not being shared as a rule, and that privacy is kept at the center of policy-making.

  • “For terrorist content we might need shorter guidelines. Beyond that, for all the other data, define critical data, follow proper process.”
  • “There are strong institutional mechanisms under the current act. The point about 72 hours doesn’t necessarily work out. To say that we’re sitting on evidence, even for requests that come to us, is wrong. Half the requests aren’t properly scoped out. Half the challenge we face is because requests are not legally scoped.”
  • “Lets not forget the work done by intermediaries without regulations in place. In case of the Christchurch Call, all intermediaries came together. For elections, intermediaries came together in India.”
  • “All of us focus on capacity building efforts with police agencies, law enforcement agencies.”
  • “There’s a petition in the SC challenging rules and sections of the IT Act, asking for judicial oversight, which would take care of these problems. Under Puttaswamy 1 (right to privacy) and 2 (Aadhaar), judicial oversight is necessary. This might be notified under the Telegraph Act or the IT Act. It may benefit people to link to these rules when creating further compliances, to look at pre-existing compliances, and to track the progress of this case. The existing processes under the Telegraph Act and IT Act have not been successful, whether for LEAs, for stakeholders who advocate for user rights, or for private companies.”
  • “Someone from the government asked us for attendance records for a girl. We sat on it. We got lots of calls. Then we found out that they were looking for these records because of a marital alliance, and they wanted to understand whether that girl has good character. We lied and said we don’t have this data. So there needs to be a framework. We will give information when it is serious.”

Traceability

  • “Traceability needs to go from the rules. Whether it tends to break end to end encryption or not, it can be interpreted to break end to end encryption.”
  • “An ISP says that we comply with traceability, why not platforms?”

Speaker 1: “There should be traceability. People have lost their lives in this country.”
Speaker 2: “There are alternatives: there is conventional policing, there is interrogation.”
Speaker 1: People are still dying.

  • “Traceability is linked to encryption. We need to discuss traceability in the context of encryption.”
  • “There are different models. There are companies which don’t have traceability architecture, and those which do. Should the government be asking companies to change their architecture?”
  • “Traceability isn’t under due diligence guidelines. They can seek traceability under Section 69, and there are protections there.”
  • “What is traceability? Can we define it?”
  • “On due diligence, these changes are outside the delegated rule making power. Under section 69 there are wide powers. They exist. That’s the appropriate mechanism, and it’s individualised there. You aren’t requiring platform level changes there.”
  • “A single device holds multiple applications, tied to a single number, gives incredible amount of information. The metadata is incredibly rich.”
  • “Any kind of diligence requirements put in here will apply in unintended and unforeseen ways to a large category of players.”
  • “It’s not clear what problem we are solving with traceability.”

The extreme opposite of traceability is reverse lookups, where an intermediary will get an image and be asked to look up where it originated from. They’ll have to maintain a massive database of all the content that entered the ecosystem. There are public reports of the CBI asking for this capability. It’s like the PhotoDNA tech for child pornography.
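To illustrate what such a reverse-lookup database implies, here is a minimal sketch. All names are illustrative, and an exact SHA-256 hash stands in for the perceptual hashes that PhotoDNA-style systems actually use (which survive resizing and re-encoding); the point is only that every upload must be fingerprinted and retained:

```python
import hashlib

class ContentIndex:
    """Toy reverse-lookup index: maps a content fingerprint
    to the first upload that carried it."""

    def __init__(self):
        # fingerprint -> (uploader_id, timestamp) of the earliest sighting
        self._index = {}

    @staticmethod
    def fingerprint(data: bytes) -> str:
        # Exact hash as a stand-in; real systems use perceptual hashing
        return hashlib.sha256(data).hexdigest()

    def record(self, data: bytes, uploader_id: str, timestamp: int) -> None:
        # setdefault keeps only the first sighting, since a reverse
        # lookup asks where content originated
        self._index.setdefault(self.fingerprint(data), (uploader_id, timestamp))

    def reverse_lookup(self, data: bytes):
        # Returns (uploader_id, timestamp) or None if never seen
        return self._index.get(self.fingerprint(data))

idx = ContentIndex()
idx.record(b"image-bytes", uploader_id="user-42", timestamp=1000)
idx.record(b"image-bytes", uploader_id="user-99", timestamp=2000)  # a re-share
print(idx.reverse_lookup(b"image-bytes"))  # earliest sighting: ('user-42', 1000)
```

Even this toy version has to fingerprint and retain a record of every piece of content ever uploaded, indefinitely, which is exactly the database-of-everything concern.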

  • “Platforms are looking at something called local device censorship, where they put hashes on your device itself. We then open up to commercial and state surveillance. Your end-to-end encryption is only as safe as the device.”
  • “There should be more transparency when platforms perform regulatory functions”
  • “A good lodestar is to stick to the Shreya Singhal judgment. Shreya Singhal is contextually important. We are moving towards data minimisation: ensuring that data is not misused, and that the user has greater access to that data. The entire discussion about censorship and surveillance is a consequence; it is in abrogation of the data minimisation principle.”
  • My comment: India is moving towards data maximisation and surveillance, not data minimisation.

Other comments

Speaker 1: “Platforms need to open an office in India, to ensure a level playing field. They operate in India as marketing entities, and that needs to change to create a level playing field from a taxation perspective.”
My response: “How is taxation a Section 79 concern?”

  • “There’s also the Companies Act and business rules: that you have to have a presence in India, and be registered as an Indian entity. The same is in the consumer protection guidelines. This is happening across sectors.”
  • “The form of this may not stand scrutiny. You can’t do a lot of these things under Section 79.”