- Definition of “social media intermediaries” is too broad. Exceptions for some use cases would have made sense.
- Need clarity on the 5 million threshold. How do significant social media intermediaries define “5 million users” — daily or monthly active users, registered users, and so on?
- The 5 million threshold is very low for a country of India’s size
- Need consultations for an “intelligent definition” of intermediaries
- Personal liability of the chief compliance officer might be too much risk for one person
- Timelines for grievance redressal might be too onerous on companies
- Rules might achieve the opposite of curtailing platform power
- Compliance burden on smaller platforms is high
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, notified earlier this year, change the way the internet is governed in India. The Rules prescribe several new and onerous requirements for companies. For the first time, social media intermediaries have been carved out as a separate category, and large social media intermediaries have been prescribed additional responsibilities which could significantly impact their business operations in the country.
In a panel discussion held by MediaNama on the Impact of the IT Rules 2021 on Intermediaries, experts shared their views on intermediaries, social media intermediaries and what complying with the new Rules means to companies. The panelists included Urvashi Aneja, co-founder and director of Tandem Research; Harshitha Thammaiah, General Counsel, Xiaomi India; S Chandrasekhar, Group Director, Government Affairs & Public Policy, Microsoft (India); and Rahul Narayan, advocate-on-record at the Supreme Court. This discussion was supported by Google. All quotes have been edited for clarity and brevity.
Platform power: Do the rules even address this?
The key idea behind the Rules seems to be to address the growing ubiquity and impact of internet platforms on the lives of regular people. But do the Rules succeed at this? Urvashi Aneja argued that not all platforms are the same. “It’s probably not a good idea to try and regulate all platforms the same way,” she said.
Some platforms have become ‘public utilities’: Aneja argued that some platforms take on “infrastructure properties”, where they start operating as public utilities. “Those kinds of platforms, we probably need to regulate differently than we do with other platforms,” she said.
But, focusing on big platforms will hurt smaller ones: Grouping all platforms in the same bracket will negatively impact smaller platforms, Aneja said. In fact, the nature of the Rules is such that they may end up increasing the power of big platforms.
“With regard to the IT Rules 2021, some of the requirements of platforms in terms of compliance are fairly stringent, and they’re quite intensive. And smaller platforms may not be able to meet that bar. So in that sense, they could end up crowding out the smaller platforms, and in some sense, increasing the power of the larger platforms.” — Urvashi Aneja, co-founder and director of Tandem Research
Online harms can be addressed without increasing platform power: The Rules give intermediaries considerable power by making them responsible for taking down objectionable content. Aneja argued that this increases platforms’ curatorial powers instead of curtailing them. Instead, she suggested that online harms could be addressed by alternative methods such as:
- Revising competition policies to control monopolistic practices of big platforms, as is being done in other countries
- Auditing the algorithms used to amplify content on platforms. “I would like to see much more energy on being able to audit their algorithms to be able to check the kind of amplification that is enabled by their algorithms. They’re optimized to spread sensational content. Or they are optimized to at least spread content that gets more eyeballs. So I would like to see much more attention on that part of it if we’re trying to address platform power,” she said.
Platforms are not neutral: Speaking on categorising significant social media intermediaries, Aneja said that platforms could not be simply seen as neutrals or “go-betweens”. “They, in fact, do much more. I would use two frames: if you think of them as publishers, they should have certain responsibilities that come with being a publisher,” she said. She suggested that they could be made responsible for being transparent about their advertising.
On the other hand, they could be seen as curators of information. “There are algorithms that curate what you see, what you see more of, what you see less of and who sees what. […] So it’s not just the content that is on these platforms that’s an issue. Actually, I think the content is not the issue; the issue is what happens with the content. And that’s where the agency of these platforms comes in,” she said.
Don’t focus on content, focus on virality: The real issue is not the content itself, but the scale at which it propagates on the platform. “I feel like the focus here really should be on the virality question. And that’s where the power and agency of these platforms comes in. And that’s what we should be focusing our attention on,” she said.
5 million users threshold
Is 5 million too small a number? The 5 million user threshold prescribed for a social media intermediary to be considered “significant” is similar in approach to the one prescribed in Germany’s Network Enforcement Act (NetzDG). Germany’s threshold — in a country with a population of 83 million — kicks in at 2 million users. In comparison, is 5 million too small a number in a country of 1.3 billion?
How are users even counted? In the absence of proper guidance, a company can count its users in any way it wants — daily active users, monthly active users, registered users and so on. Microsoft’s Chandrasekhar said that there needed to be more clarity on this aspect. He added that if the threshold was catching entities that were not very big, such as startups, it needed to be revisited. “It is definitely an arbitrary number. Five lakhs [500,000] would have been too low for a country like India.”
- No way to audit users: Platforms have no incentive to indicate they have more than 5 million users, since they would face a considerably higher compliance burden
“From the point of view of a service provider, for us, it is not very difficult to get the monthly active users. As for the total number of registered users — yes, it is not impossible, but it becomes a little difficult; there are a lot of redundant accounts, which people just open and leave. But monthly active users are not difficult to track down. So probably that will make more sense as a number.” — S Chandrasekhar, Group Director, Government Affairs & Public Policy, Microsoft (India)
Does the government have the capacity to police so many significant intermediaries? The low threshold, and the broad definition, would presumably cover a large number of companies in the country. Does the government have enough regulatory capacity to enforce the Rules on so many companies?
Chandrasekhar, speaking from his experience of having worked in government in the past, said that enforcement shouldn’t be so difficult for the government as long as it bolsters its manpower a bit. He argued that the regulator only steps in when there is an event or incident. “They don’t need to proactively police. They need to put very good laws and then handle the exceptions well […] Remember, the regulator is only creating the law, it is not creating a gateway as to who can join this club and who cannot join this club […] The regulator is more like an umpire blowing the whistle every time something is not being played according to the rules. For that you don’t need a huge capacity.”
Additionally, panelists also noted that in the absence of third-party audits, it isn’t clear if the government can truly validate the number of users a company has declared.
Is size of user base the right approach? Aneja, on the other hand, said that platforms couldn’t simply be grouped by the size of their user base. “I am not sure the user base is a sufficient method. In India, for example, we already know how multiple people use single mobile phones, platforms like Facebook etc. For me, what would be an adequate threshold is when we talk about certain platforms having infrastructural properties. What’s relevant is the extent to which other services are dependent on them,” she said.
Exceptions in social media intermediary definition
The IT Rules 2021 define “social media intermediary” as “an intermediary which primarily or solely enables online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services”. This rather broad definition encompasses services like Skype or Microsoft Teams, even though they are sometimes meant for enterprise and business users.
‘Leaked draft with exceptions was better’: Interestingly, a leaked version of the Rules had exempted services that enabled commercial or business-oriented communications. Chandrasekhar argued that this definition was much better than the current one. “[T]he leaked draft was much better, it gave a lot of clarity as to who is included and who is not. And one of the areas, which we would be requesting the government to look into is how to carve out an exception to those entities whom they didn’t want to cover,” he said.
Taking the example of Microsoft products such as Azure, Office 365 and Xbox, he said the company was duty bound to remove certain types of content and that it already has a mechanism to do so. However, classifying the company as a social media intermediary was probably a “wrong classification”. “So these are some of the gaps in the existing draft,” he said.
‘Would have been better if regulation captured nuances’: Different intermediaries may have different kinds of liability, said Chandrasekhar. For instance, an intermediary that offers IoT services will have different liabilities as compared to one that offers AI services. To address these differences, he said, Microsoft suggested a system wherein “dumb pipes” or “dumb storage” are considered different from software-as-a-service (SaaS), since the latter involves a greater degree of curation.
An internet service provider (ISP) or a telecom service provider would fall under the category of “dumb pipe”; if a platform offers an infrastructural service [like a cloud service provider, for example], it could be a “dumb storage”. “Now when it comes to software as a service, then there is some amount of, what to say, curation or some amount of control over it. And then you have platforms where the users exercise a greater degree of control, there is virality, there are more possibilities of harm. So this is what we had suggested. It would probably have been better if the regulation had captured all these nuances and more; that would have caused much less confusion,” he said.
‘An advisory body could do the trick’: There could be some authority that helps businesses verify whether they are covered or not, said Rahul Narayan, a lawyer. “I don’t think that the government or the authority actually intends to make business difficult for entities which shouldn’t be covered,” he said.
Narayan suggested that this body could work in a way similar to that of the Central Board of Direct Taxes (CBDT) wherein it plays an advisory role to income tax payers in the country.
Personal liability of compliance officer
The Rules require significant social media intermediaries to appoint chief compliance officers who can be held personally liable for non-compliance with any provision of the Rules. It’s a little harsh, felt Harshitha Thammaiah, General Counsel at Xiaomi India. Thammaiah argued that the mechanism was problematic considering how vague the Rules were.
“Look at what we’re trying to do here: a user puts up content. And that content, somebody says, needs to be taken down on grounds such as decency and morality. I have 36 hours to take it down. And again, it comes upon the social media intermediary, or an intermediary as per this rule, to acknowledge and to dispose of such a complaint. So while disposing of such complaints, what am I doing? I’m actually in charge of seeing whether I should protect that person’s freedom of speech or not. That’s a hell of a lot of power for the social media intermediary or intermediary to sit in judgement.” — Harshitha Thammaiah, General Counsel, Xiaomi India
Operational issues with compliance duties: Thammaiah argued that it would be one thing to remove content on a court order. But in the case of a user grievance, the compliance officer has to make a judgement. “Then what is the recourse for the person who disagrees with my adjudication? That’s something that nobody knows. And I mean, even if that person disagrees with me, does that person have the right to go before a court, does he have the resources, etc., to go fight?”
Compliance officer goes ‘straight to jail’: Thammaiah called the Chief Compliance Officer’s position a “very vulnerable” one, since the Rules allow the government to take direct action. “[I]n cases where we are talking about something to do with national sovereignty, or if you’re talking about criminal defamation, or anything under the Unlawful Activities (Prevention) Act, etc… If those things — which are pretty draconian — are involved, this Chief Compliance Officer is directly going to jail.”
On a lighter note, she added that she didn’t even know how to hire someone for this role as they would be responsible for “pretty much everything” under the Rules.
Grievance redressal and takedown timelines too short?
The Rules require intermediaries to acknowledge user complaints within 24 hours, and dispose of them in 15 days. Thammaiah felt that the 15-day timeline was far too difficult to adhere to. “The assumption is that you have all the infrastructure present in front of you to actually get the matter [resolved] within 15 days,” she said. Taking a multinational company as an example, she said that there are decision-making powers outside India, and 15 days was a very short period of time to be able to adjudicate and dispose of a complaint.
‘Need an army to deal with complaints’: The issue is amplified by the scale of a company’s products and services. With that scale, and since anybody can register complaints, companies would need an army to resolve them within 15 days.
‘72 hours to give all information is difficult’: The Rules require intermediaries, on receiving an order, to provide information to assist law enforcement agencies and other government bodies. Thammaiah said that without a definite scope to the information sought, it could be “really, really hard” to comply. “Sometimes, law enforcement agencies make a request for information that I might not have directly. I might have to procure it as an intermediary to an intermediary. Then, three days becomes very difficult […] I don’t think the rules do justice at all, as far as the timelines are concerned.”
Transparency reports an additional burden: The Rules require significant social media intermediaries to publish periodic transparency reports with information on complaints received, action taken and so on. Thammaiah said that, the way the provision is worded, companies would have to compile very large reports, which would need considerable effort. “Even if it is the silliest grievance, I need to start compiling in a certain way, and file it with the government. I think that’s a huge, huge compliance burden,” she said.
How will big companies comply with the Rules?
An attendee asked how tech giants like Google and Microsoft will comply with the rules, considering their myriad offerings. Microsoft’s Chandrasekhar said that his company would consider each individual service as an intermediary. “Let us say Teams, it has a particular purpose; Hotmail has a particular purpose; Outlook has a particular purpose; Bing has a particular purpose. Each one of these services is an intermediary. Each one of these services has different levels of compliance, different levels of obligations. One service might be really huge and very popular, another may be less popular. But we are ready to, or rather, we are working to find a way, to comply with the obligations of each individual service.”
‘Informing users of policy changes an excessive ask’
The Rules require intermediaries to inform their users, on a yearly basis, of any changes to their privacy policies and terms of service. They are also supposed to inform users every year that non-compliance with any of the regulations could result in the termination of their accounts. Thammaiah felt that this would result in spamming users’ inboxes, as they are unlikely to read these emails. Taking the example of Xiaomi’s Themes app, she said the app does not hold any financial information or other such sensitive details. However, the Themes app too would be obliged to send messages to users periodically. She said this would fatigue users to the extent that they would be dissuaded from reading any other important information communicated to them.
- #NAMA: The Traceability Mandate And What It Means For End-To-End Encryption
- Transcript And Video: MEITY’s Rakesh Maheshwari On IT Rules, 2021; Traceability, Intent, Compliance Timelines