“What’s interesting is that there has been talk in the past about [expecting] trust and accountability from platforms,” noted MediaNama’s Editor Nikhil Pahwa at our “Exploring User Verification” roundtable last Thursday. “But, in the case of fraud, that requirement [of verifying one’s identity] seems to be shifting from platforms to individual users. Why is this issue coming up?”
The roundtable saw tech policy experts explore broadening government mandates to ‘verify’ Indian netizens online. The hope seems to be that allowing only verified users to transact on platforms will prevent fraud and bolster online security. The catch is that these rules make being anonymous on platforms difficult—harming the right to privacy of individual users of the Internet. So, whose responsibility is it to ensure that the Internet is a safe space? And, more importantly, is user verification a sure-shot road to safety online?
“A company can be as violative of privacy as the government,” opined Pranesh Prakash, Co-founder of the Center for Internet and Society. “[For example] Financial companies are in a position of wanting as much information about you as possible rather than as little. So, they’re not doing a risk calculus in terms of how little information they can gather about their customers in order to provide certain services. They’re saying, ‘let me gather as much as I can’. So, we need the government to protect us from these companies as well when it comes to privacy. This is, for me, a great concern with pushing for user verification requirements everywhere.”
MediaNama hosted this discussion with support from Meta and Truecaller. The Internet Freedom Foundation, CUTS International, Centre for Internet and Society, and the Centre for Communication Governance at the National Law University, Delhi, were MediaNama’s community partners for this event.
Platforms have no clear standard on what anonymity is, resort to data-intensive status quo: Varun Sen Bahl noted that one reason platform verification often hurts privacy is that there is no clear standard of what it means. So, platforms fall back on information-intensive verification methods.
“Because we [industries] don’t have clarity around the expectation that can be placed vis-a-vis online anonymity, we are probably seeing standards of offline anonymity being expanded outwards,” observed Bahl, a Public Policy Manager at NASSCOM. “Our default understanding of when we verify a customer comes from the financial sector [through KYC forms and more]. Now, CERT-IN [India’s emergency cyber response team] has extrapolated that [standard] and used it as a reference point to design verification in a cybersecurity context.”
“The problem then happens when [for example] you look at the [draft] Telecom Bill, you have a legislation with a vaguely worded verification requirement using strong words like ‘unequivocally verify’, et cetera,” Bahl argued. “Then, because the service provider doesn’t know what that standard means and how it can be achieved, we [they] default to whatever is seen as the best method of verification so far, which will go back to the financial sector [note: the sector has stringent identification mandates].”
Verification mandates push platforms into a corner: These concerns are amplified by another sword hanging over platforms—their legal obligations to comply with vague verification mandates.
“You [a company] don’t know what the cost of a particular system can be,” Bahl continued. “I think traceability [of users under India’s platform regulation rules] is the worst example of this, where a requirement was introduced before we had a mechanism to solve for the requirement. Then the pressure is created on service providers that there is a legal requirement [to comply with], which has been hinged on their safe harbor. They must now try to find a solution to that legal requirement or likely be unable to qualify for that safe harbor. That burden is on the platform now.” These forces work together in such a way that platforms have no reference point for a verification standard, Bahl argued.
A reminder: safe harbor clauses protect platforms from being held liable for third-party content they host, provided that the platforms comply with the Indian government’s rules and regulations.
“[Ultimately] You have to be cognizant of the trend and trajectory of these verification mandates,” noted DeepStrat Co-Founder Anand Venkatnarayanan. “Here, what the state is really doing is telling platforms that ‘this is my mandate, and you have to adhere to it. If you don’t, you lose [safe harbor] protection’. That’s very different from the state saying that platforms are not doing their jobs, and we have to do some positive interventions to protect our citizens. This is the reason the attack surface on people [for cybercrimes or fraud] is increasing.”
“If you take an example of a bank, why don’t they investigate frauds? Because they say ‘we are quite helpless in this matter and we are following every single norm and condition the state has imposed on us’,” Venkatnarayanan argued. “By putting verification mandates on businesses, you are extinguishing their capability to actually service their customers in a creative way that protects their interests.”
Government also abdicating responsibilities through verification requirements for platforms: “Our original discussion was about how platforms are being required to engage in activities to enforce rights related to preventing fraud, harm, and misinformation, all of which require them to engage in privacy violations through verification,” noted Lalit Panda, Senior Resident Fellow at the Vidhi Centre for Legal Policy. “That question is distinct from the question of whether private parties should have obligations to protect the privacy of individuals. I think the latter question has a very strong case: private parties and platforms do need to be involved heavily in engaging in privacy enforcement. But, the former case is where we have exactly the kinds of problems we have been talking about. It’s an abdication of responsibility by the government on various grounds.”
This shifting burden on platforms is seen both online and offline, observed advocate Vrinda Bhandari. “An interesting way you see this offline is through Section 144 orders. I recently did a study where we analyzed all the Section 144 orders by the Delhi Police. In one year, there were over 6,000 orders. When you analyze them, you find the police are directing private actors to install CCTV cameras so that they can see who is coming in. The shifting burden is something we’re seeing both online and offline. And none of these Section 144 orders says what is going to happen with all of this surveillance data.”
“These laws are increasingly putting positive obligations on consumers to reveal themselves and be less anonymous,” CUTS-CCIER’s Research Director Amol Kulkarni added. “It looks like these positive obligations perhaps are prerequisites for [consumers] getting protections under the law.”
Government agencies dealing with fraud need to be held accountable too: Technology platforms, including banks, are doing their bit in tracing the movement of money, noted an audience member. “But, for it [resolved fraud compliance] to really come back to you, you need an intervention from law enforcement to step in for that little last bridge to get filled,” she said. “That never happens because, more often than not, they are overloaded with so many complaints. Unless you have money in hundreds of crores, it’s really not worth their time to invest in [a case worth] a few lakhs of rupees. So I think there is a need for accountability coming in even from law enforcement, [especially] when the whole other ecosystem is fulfilling their responsibilities of accountability.”
Shift to individual responsibility part of India’s “data empowerment” trajectory: “I think there is a broader directional shift where the focus is on data empowerment of individuals,” noted another audience member. “There is this whole idea that individuals will share their data, and there will be commodification of this data, which will allow a data economy to take off in a big way. Part of this fundamental shift is putting more and more onus of proving who you are and creating a standard ID across platforms and services.”
“Essentially, [that’s the shift towards] personal data as a national asset,” Pahwa added.
Using innovative privacy-friendly tech to get verification done: “With mandated verification, you are taking away the creative ways in which there could be services where the state mandates no verification,” advocate Prasanna S. added, referring to an example raised during the discussion of whether a landlord should be allowed to not rent out their premises to a terror-accused. “Can the law allow that kind of discrimination?” Prasanna asked. “In order to prevent these discriminatory harms, it is possible that the state may have to creatively come up with interventions where you cannot ask for certain information, because it knows in society there exists the possibility of misusing it. The elimination of these creative interventions is another reason we may need to think about this.”
“What is very important to realize is that we have the tech to engage in user verification with as little information transferred as possible,” Prakash also added. “So, even in those places where user verification for certain reasons legitimately becomes required, it should be done in such a way that it’s minimally invasive of people’s privacy.”
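Prakash’s point about verifying with “as little information transferred as possible” can be illustrated with a minimal sketch. The example below is hypothetical (the issuer, key, and attribute names are all invented for illustration, and a real deployment would use asymmetric signatures or zero-knowledge proofs): an issuer that already knows a user’s identity attests to a single yes/no attribute, so a platform can check something like “is over 18” without ever handling the underlying ID document.

```python
# Minimal-disclosure verification sketch: the verifier learns one
# attribute, not the user's identity documents. Hypothetical names.
import hmac
import hashlib

ISSUER_KEY = b"issuer-secret-key"  # assumed secret shared by issuer and verifier

def issue_attestation(user_id: str, attribute: str) -> str:
    """Issuer signs only a (pseudonymous id, attribute) pair."""
    msg = f"{user_id}:{attribute}".encode()
    return hmac.new(ISSUER_KEY, msg, hashlib.sha256).hexdigest()

def verify_attestation(user_id: str, attribute: str, tag: str) -> bool:
    """Platform checks the attestation; it never sees the ID itself."""
    expected = issue_attestation(user_id, attribute)
    return hmac.compare_digest(expected, tag)

tag = issue_attestation("pseudonym-7f3a", "age_over_18")
print(verify_attestation("pseudonym-7f3a", "age_over_18", tag))  # True
print(verify_attestation("pseudonym-7f3a", "age_over_21", tag))  # False
```

The design point is data minimization: the platform stores a pseudonym and one attested attribute, so a breach on its side cannot leak the full identity record.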
“I think that is a Data Protection Bill question,” Pahwa mused. “For me, the primary challenge is that there are systems in existence where there is no opt-out [from data sharing] that exists. Looking at the scope creep [on privacy] that’s taking place [through verification too], you are moving towards a scenario where an opt-out option may not be available to you. That availability, purpose limitation, all of those are privacy considerations that will hopefully come with the Bill…We seem to be moving more towards verification than protecting privacy.”
Conversations on horizontal applications of fundamental rights are concerning: The forthcoming Digital India Act is set to regulate the Indian Internet. One thing it alludes to is the horizontal application of fundamental rights, Bahl noted. That’s when rights are applied “in transactions where private actors are involved in some way”.
“The more and more you say that rights can be applied against them, the more and more discretion you are forcing on them to some extent,” Bahl argued. “So, if, for example, you have a future traceability requirement, and then simultaneously a requirement that the intermediary must respect the privacy of the user, then the burden of resolving that lands up on the intermediary as well.”
“What we’re heading towards is basically [that] fundamental rights apply to intermediaries and not to the state,” Pahwa observed. Bahl added, “More importantly, fundamental rights trade-offs are being decided by intermediaries.”
Horizontal application of fundamental rights remains unchallenged: Bhandari examined how petitions currently address platforms’ alleged violations of fundamental rights.
“The way you [petitioners] reach platforms [on fundamental rights issues] is making the Ministry or Union of India a party,” explained Bhandari. “They become respondent one [in a petition] and then each of the platforms becomes [another] respondent. The issue of maintainability [on whether fundamental rights claims can be filed against platforms] has been brought up in various petitions…We have seen this in the WhatsApp privacy policy case that is ongoing at the Supreme Court. WhatsApp and Facebook have argued that there is a maintainability challenge, asking how can the WhatsApp privacy policy be violative of fundamental rights? I mean, that’s as much horizontal application [of fundamental rights] as you can imagine…Platforms are not really contesting it, in the sense that they have realized that this [maintainability] argument is not currying favor with the Court[s] at this moment.”
Clear responsibilities will shift verification costs back to the government: “What we need now, and maybe the Digital India Act will give us that opportunity, is to finally arrive at a well-defined set of expectations around e-governance, public services and digital services,” Bahl argued, referring to Europe’s ‘once-only principle’. “If we can finally arrive at clarity that the government should be required to follow [data] minimization as much as possible, and create a well-defined set of rules around how government bodies share information, then we can reduce the amount of user verification being imposed on society and shift the cost back onto the government sector.”
Once-Only Principle says that citizens should not be forced to provide information to authorities if another authority already holds that information in electronic format.
— DIGIT 🇪🇺 (@EU_DIGIT), November 11, 2022
“There is a trade-off here between encouraging more intra-government data sharing or more user verification requirements,” Bahl continued. “But it seems like right now we are hurtling towards both: creating as many user verification requirements as possible and also allowing the government to share within itself as much as possible.”
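The once-only principle Bahl invokes can be sketched in a few lines. Everything here is hypothetical (the registry, citizen IDs, and field names are invented for illustration): before asking a citizen to resubmit a document, a service first checks whether another authority already holds it electronically, and only falls back to fresh collection when it does not.

```python
# Once-only principle sketch: reuse data the state already holds
# instead of collecting it again. All names are hypothetical.
REGISTRY = {  # documents some authority already holds, keyed by citizen id
    "citizen-42": {"address_proof": "on-file"},
}

def collect(citizen_id: str, field: str):
    """Return (value, source): reuse from the registry when possible."""
    held = REGISTRY.get(citizen_id, {})
    if field in held:
        return held[field], "reused-from-registry"  # no fresh collection
    return None, "must-ask-citizen"                 # fall back to asking

print(collect("citizen-42", "address_proof"))  # ('on-file', 'reused-from-registry')
print(collect("citizen-42", "income_proof"))   # (None, 'must-ask-citizen')
```

As Bahl notes, this only reduces the burden on users if the registry lookup path is governed by clear data-sharing rules; otherwise it simply trades one privacy cost (repeated collection) for another (unrestricted intra-government sharing).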
This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
Read more
- #NAMA: The Traceability Mandate And What It Means For End-To-End Encryption
- How Do India’s Growing Verification Mandates Impact Companies And Industries? #NAMA
- Why Is A Scammer The Best Fintech Founder In The World? #NAMA
- Can We Map A Framework For Verification? Varun Bahl On A Model For Proportionality #NAMA
I'm interested in stories that explore how countries use the law to govern technology—and what this tells us about how they perceive tech and its impacts on society. To chat, for feedback, or to leave a tip: aarathi@medianama.com
