By Nikhil Sud, Regulatory Affairs Specialist, Albright Stonebridge Group.
The proposed amendments to India’s intermediary liability rules have an admirable goal: curbing social media misinformation. The path they choose to pursue that goal, however, raises numerous concerns. Chief among them is the traceability requirement, which, besides being potentially infeasible as a technical matter, poses so serious a threat to privacy, free expression, and investment that it has caught the world’s attention, with observers calling for less invasive yet equally effective measures. But several other aspects of these amendments – five of which are discussed below – merit just as much attention, because they dramatically exacerbate the grave concerns raised by traceability.
First, when do the amendments apply? The amendments state: “[T]he intermediary shall…provide…assistance as asked for by any government agency or assistance concerning security of the State or cyber security or investigation or detection or prosecution or prevention of offence(s); protective or cyber security and matters connected with or incidental thereto.” The language beginning with the word “concerning” articulates the circumstances in which the government can demand “assistance” (including traceability). That language, however, is broad and ambiguous. More alarmingly, the preceding phrase “or assistance” suggests that even these circumstances need not exist – that is, the government can demand traceability in any circumstance.
Second, to whom do the amendments apply? They apply to “intermediaries” as defined in the Information Technology (IT) Act: “any person who on behalf of another person receives, stores or transmits [any electronic] record or provides any service with respect to that record…” This definition casts an extremely wide net, in sharp contrast with the government’s narrow and expressly stated purpose for these amendments (curbing misinformation on social media). Though the definition proceeds to list specific services, that list is itself overbroad. Moreover, the list follows the word “includes,” suggesting that it is not exhaustive and that the sweeping definition quoted above ultimately governs.
Third, what about judicial scrutiny? The rules do not require the agency demanding assistance (including traceability) to seek a court order. They require only “a lawful order,” which could mean an agency-issued order. This is concerning because judicial scrutiny can help mitigate regulatory overreach. Further, by not requiring a court order, the rules contradict last year’s report (authored by a committee led by Justice Srikrishna, former Supreme Court judge) accompanying the draft data protection bill. The report wisely calls for judicial scrutiny of the government’s access to consumer data. The rules also contradict the spirit of the Supreme Court’s decision last year regarding Aadhaar, which called for judicial scrutiny when the government seeks significant disclosure of personal information.
Fourth, requiring platforms to proactively identify and remove unlawful content is concerning. It risks chilling free speech and investment for several reasons. Developing such mechanisms may not be technically feasible. The requirement also undermines the very nature of intermediaries by forcing them to exercise editorial control proactively. Platforms may seek to protect themselves from liability by curbing more speech than necessary. The proposal also ignores platforms’ existing robust policies, which facilitate reporting and removing objectionable content. Finally, demanding that platforms determine which content is “unlawful” is unworkable both legally (regulators and courts are required, and best positioned, to make that determination) and practically (it would be extremely onerous). Illustratively, the Supreme Court’s 2015 Shreya Singhal decision observed: “it would be very difficult for intermediaries…to act when millions of requests are made and the intermediary is then to judge as to which of such requests are legitimate…”
Fifth, the amendments are premature. They significantly affect consumer privacy, yet they have been proposed while India’s data protection bill – designed to be an all-encompassing set of rules governing consumer privacy – is still in development.
About the author: Nikhil Sud serves as Regulatory Affairs Specialist at the Albright Stonebridge Group. He is a lawyer by training and specializes in legal and policy issues relating to technology.