“Traceability based on the message that you have is possible through metadata analysis, but traceability on the basis of searching the content of the messages is not possible,” a speaker said during MediaNama’s discussion on encryption. The bigger problem is that the technological capability of law enforcement is limited. All speakers concurred that the problem of end-to-end encryption in counter-terrorism operations was not a technological issue, but a policy and legal issue that needs to be solved.

The issue was being discussed as part of a workshop, held by MediaNama, with support from Internet Society (Asia Pacific Office), on challenges to encryption in India. Note that the discussion was held under the Chatham House Rule and thus quotes have not been attributed.

Challenges to encryption

1. End-to-end encryption is the issue

A speaker explained that when it comes to encrypted communications, there are largely two models:

  1. Blackberry model (the escrow model): Decryption keys are held at the centre in an escrow. Gaining access to such keys is technically feasible; it is only a matter of establishing procedure in terms of when to give access, in what situations, what level of access, etc., the speaker explained. Another speaker, however, pointed out that there is a massive trust deficit in this model, which the NSO-Pegasus incident exacerbated. Dr V. Kamakoti’s affidavit in the Madras High Court proposed this as one of the two models.
  2. WhatsApp model: End-to-end encryption poses a bigger challenge because we can either have it or not. “It is not a matter of giving access at that point,” a speaker said. “If you have a backdoor to end-to-end encrypted communication, you don’t have end-to-end encryption. Period,” another participant said. In this case, intermediaries do not have the technical ability to provide information to law enforcement, even if they want to, another speaker said. There are only three ways around this: not implement it, implement it “incorrectly”, or compromise the endpoints, that is, hack into them, as the NSO Group did by using a vulnerability in WhatsApp to plant Pegasus spyware on victims’ devices.
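The key distinction between the two models can be illustrated with a toy Diffie-Hellman exchange (deliberately insecure parameters, purely for illustration): in the end-to-end model, the secret values never leave the endpoints, so the intermediary relaying the messages has nothing meaningful to hand over.

```python
# Toy Diffie-Hellman sketch (insecure parameters, illustration only) showing
# why an end-to-end encrypted service holds no key it could surrender.
import secrets

P = 2**127 - 1   # a Mersenne prime; far too small for real use
G = 3

# Each endpoint generates a private value that never leaves the device.
a = secrets.randbelow(P - 2) + 2   # Alice's secret, on Alice's phone
b = secrets.randbelow(P - 2) + 2   # Bob's secret, on Bob's phone

A = pow(G, a, P)   # public values: all the relaying server ever sees
B = pow(G, b, P)

shared_alice = pow(B, a, P)   # computed on Alice's device
shared_bob = pow(A, b, P)     # computed on Bob's device

# Both endpoints derive the same key; the server, knowing only A and B,
# cannot, which is why the escrow model requires a different design entirely.
assert shared_alice == shared_bob
```

In the escrow (Blackberry) model, by contrast, a copy of the key material is deposited centrally at setup time, so access becomes a procedural question rather than a mathematical one.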

Where do different companies stand with respect to these two models?
Google: “Google has done a really good job in terms of establishing a policy that says what level of access they will provide under what circumstances. Without a warrant, they will tell you if somebody owns an account; with a warrant, they will give you more information,” the speaker said.
Apple: This is a case of non-recoverable encryption. Apple’s situation is completely different: Apple does not store device passwords, so if you don’t have the password, there is no way to recover it. A speaker said that Apple’s law enforcement access policy states that it cannot supply device passwords because it does not store them.

2. Courts do not want to hear civil society’s ‘absolutist positions’

Two speakers said that when civil society takes an absolutist position in a matter of national security, such as saying “no access to content at all”, courts take a firmer stand against privacy activists and lawyers because they are weighed against terrorism. “The court does not want to be held responsible for some terror attack,” a speaker said. “It is not enough to say that something is unconstitutional unless the judges really buy it and issue a judgement in your favour,” they said.

Lack of technical clarity in courts: Much has been said in courts about whether or not traceability is possible alongside end-to-end encryption, especially during the WhatsApp traceability case in the Madras High Court and the Supreme Court. Conflicting expert affidavits from Dr V. Kamakoti, Dr Manoj Prabhakaran and WhatsApp caused much confusion. Furthermore, it is not clear if “to the extent possible” means ensuring traceability or doing the best to assist law enforcement agencies.

A terrorist is determined ex post facto: During one of the hearings in the Facebook transfer petition, the Attorney General of India K.K. Venugopal asked if terrorists have privacy. A speaker during the discussion asked, “How do you decide who is a terrorist?”

3. Escrow model not feasible due to trust deficit in the authorities

A speaker distinguished between things done “in the interest of national security” and those done “in the name of national security”. In the name of national security, many lawyers and activists were targeted using NSO Group’s Pegasus spyware “without any decisive attribution as to which authority procured the spyware in the first place”, a speaker argued. They said that this demonstrates and exacerbates the trust deficit that exists in our law enforcement and our government.

This is why a vast majority of civil society would be “vehemently against” the Blackberry model. “The general population doesn’t have that level of confidence in our law enforcement that if this kind of power is handed to them, it will be used in a conscientious, diligent way without making undue or unwarranted incursions into the private lives of citizens,” the speaker continued.

4. Metadata analysis solves the traceability problem, not access to content issue

If you have metadata, you can easily reach the endpoint. “Traceability is reasonably easy to solve for from a technical standpoint. Metadata can always be maintained. You don’t need to know the content of the conversation, but you can always know who spoke to whom, etc.,” a speaker explained. Another participant said that a number of law enforcement authorities in the country are being trained by domestic experts and international intelligence agencies on how to use the metadata to trace the origin of these messages. “Once you have access to content through one of the hundreds of nodes that have received misinformation, you can still trace it back to the originator using metadata. You don’t need access to the content of all of these individuals who forwarded this problematic message,” they said. “Traceability based on the message that you have is possible, but traceability on the basis of searching the content of the messages is not possible,” a speaker said. The problem is that the technological capability of law enforcement is limited.
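The tracing the speakers describe can be sketched as a walk backwards along forwarding metadata. The structure below is hypothetical (real platforms do not necessarily retain per-forward edges), but it shows why content access is unnecessary once such metadata exists:

```python
# Hypothetical sketch: tracing a forwarded message back to its originator
# using only forwarding metadata (who sent it to whom), never the content.
forwards = {
    # recipient: the sender who forwarded the message to them
    "node_d": "node_c",
    "node_c": "node_b",
    "node_b": "node_a",   # node_a has no inbound edge: the originator
}

def trace_originator(recipient: str) -> str:
    """Walk the forwarding chain backwards until no earlier sender exists."""
    current = recipient
    while current in forwards:
        current = forwards[current]
    return current

print(trace_originator("node_d"))  # -> node_a
```

The platform never needs to read a single message body; it only follows edges, which is the distinction the speaker draws between tracing a message you already hold and searching all content.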

Once built, metadata analysis is not time-consuming, except in extreme cases: “From a technical standpoint, I don’t think it is time-consuming because once you have got it built, it should be a matter of a few seconds, minutes or hours, if it’s a really long thread. The challenge typically comes from a legal standpoint in terms of the time frame to get that [judicial] order versus the criticality of the situation,” a speaker explained, but they also said that in a situation where someone is roaming around with a weaponised device, time constraint is a big blocker, “which is why law enforcement agencies are asking for real-time access”.

5. Cloud encryption can be easily reversed, but that too is moving towards end-to-end encryption

Law enforcement agencies, in order to prevent terrorism, are always on the lookout for people who might be getting radicalised. To that end, they trawl through cloud servers and cloud storage providers such as Dropbox, OneDrive, Google Drive, etc. to track content. “If you tell me what data is it that you want me to trace, I can hash it and run a search across my entire cloud platform and tell you which accounts have it and give you more context around those people,” a speaker explained. And unlike end-to-end encryption, another speaker said, “cloud level encryption can be easily decrypted depending upon the defensibility and pliability of organisations”. “Unlike the Apple case where you need the password to get access to the device, in case of cloud, there is no such technical incapability. There is nothing to prevent them from sharing data with the law enforcement if they wish to,” they said.
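The hash-and-search approach the speaker describes can be sketched in a few lines. The storage layout and account names below are invented for illustration; the point is that a provider holding readable (or server-decryptable) data can match a known file against its entire platform without reading anything else:

```python
# Sketch of hash-based matching across cloud storage, per the quote above.
# The object index is hypothetical; real providers' layouts will differ.
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Server-side index the provider could maintain for every stored object.
cloud_objects = {
    "user_1/report.pdf": sha256(b"benign quarterly report"),
    "user_2/clip.mp4": sha256(b"flagged propaganda video"),
}

def find_objects_holding(target: bytes) -> list[str]:
    """Return stored objects whose hash matches the target content."""
    target_hash = sha256(target)
    return [path for path, digest in cloud_objects.items() if digest == target_hash]

print(find_objects_holding(b"flagged propaganda video"))  # -> ['user_2/clip.mp4']
```

This is exactly what becomes impossible once the storage itself is end-to-end encrypted, as the next point notes: the provider can no longer compute hashes over plaintext it cannot see.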

However, the first speaker said that even in terms of file storage, we are heading towards end-to-end encryption where even the file storage service provider will not be able to read or scan the file.

Solutions: Encryption is a question of policy, not technology

There was consensus amongst the speakers that even if traceability were not an already solved problem (through metadata analysis), the debate around encryption is largely a policy debate, not a technological one, centred around establishing procedures. All speakers largely agreed that this is not a question of whether access should be granted to law enforcement agencies for counter-terror operations, but of how and when.

1. A hybrid model of encryption like Zoom

A speaker proposed Zoom’s hybrid model as a potential solution to the perceived problem of end-to-end encryption. “The model Zoom has taken up is that unauthenticated users, that is, free users who haven’t really authenticated with Zoom, are given centralised encryption that can be cracked and is reversible at any point of time because they hold your keys. For authenticated users, people who pay with credit cards, people who have essentially gone through the KYC process, they can enable end-to-end encryption and use it in that scenario. End-to-end encryption on its own can’t really be weakened since, by definition, the keys only exist with the two parties and not at the centre. What can be done is a sort of a hybrid model,” they explained. For example, “If there is a surveillance requirement, let’s say WhatsApp defaults back to traditional encryption for that particular set of users or conversations as opposed to weakening overall encryption across the platform. This weakening, of course, would be done after obtaining the necessary judicial order,” they said. The number would already be authenticated by the telecom service provider and for access to “the numbers that interact with known bad entities, authorities will have to go through a judicial process of getting them weakened encryption so that communication can be intercepted,” they explained.
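The policy the speaker sketches reduces to a per-user mode selection. The field names below are hypothetical, but they capture the two downgrade paths described: unauthenticated users never get end-to-end encryption, and authenticated users lose it only under a judicial order.

```python
# Hypothetical sketch of the hybrid model: per-user encryption mode selection.
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    authenticated: bool          # completed KYC / payment verification
    under_judicial_order: bool   # a court has ordered interception

def encryption_mode(user: User) -> str:
    """Select the encryption scheme per the hybrid model described above."""
    if not user.authenticated:
        return "centralised"     # provider holds the keys, reversible
    if user.under_judicial_order:
        return "centralised"     # downgraded for this user only, post-order
    return "end_to_end"          # keys exist only on the endpoints

print(encryption_mode(User("u1", authenticated=True, under_judicial_order=False)))
# -> end_to_end
```

The design choice is that weakening is scoped to individual users or conversations rather than to the platform as a whole, which is what distinguishes this from a blanket backdoor.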

Problems with this approach:

  1. There is a “commodification of privacy” in this case, where those who can afford a higher tier of privacy will avail it while others will not, a speaker said. This may not necessarily be in line with the Puttaswamy Right to Privacy judgement.
  2. Since end-to-end encryption is reserved for authenticated customers, that is, “anybody who is a paying customer”, and anonymous payments over the internet are impossible, there will be a loss of anonymous speech, which will have a chilling effect on free speech.

2. Define national security, public order narrowly; distinguish between them

There is a need to substantively differentiate between national security, public order and law enforcement. “Counter terrorism operations become especially problematic because they straddle these two categories: public order (internal) and national security (external). Not all terrorists are foreigners; homegrown terrorists can be just as dangerous,” a speaker explained. Thus, there is a need to come up with an institutional framework that distinguishes between counter-terrorism, public order and national security, and specifies what technological capability is provided to each of these agencies. “The technological capability that is available to, say, your local SHO should be different from the technological capability that is available to the NIA that is conducting extra-territorial investigations of cross-border terrorism, etc,” they explained. “Ram Manohar Lohia v State of Bihar actually distinguishes between law and order, public order, and security of State and envisions them as concentric circles,” another speaker said. Under that schema, security of the State is most narrowly defined, while law and order is the broadest category.

A speaker also cited the Aadhaar judgement where the Supreme Court ruled that “a mere ritualistic incantation of ‘money laundering’, ‘black money’” does not satisfy the test of proportionality. Similarly, it is not enough to just cite national security; you have to prove national security, they said.

3. Need for strict judicial procedure to prevent abuse of access to content

Two speakers pointed out that if there is no judicial model, “there will essentially be a Pegasus model that will be used to get much more information”. Talking about their experience, one of these speakers explained that a decade ago, when Google, Facebook and others would not really cooperate, law enforcement agencies in India would rely on third parties to hack into accounts to get access to communications. “The official path essentially took 2-4 months to get information. The challenge was that because the law enforcement agencies had this back end route where they would come to us and say, ‘Hey, get us access to these emails’, they started abusing it a lot. I don’t mean malicious abuse but I mean blanket listings. There were a lot of legitimate people who had nothing to do with the investigation who would show up in our tests. That’s exactly what’s going to happen if we don’t come up with a model that works because that’s what’s happening right now,” they explained.

“Once you start giving access, it is no longer reserved for very critical cases and access becomes the default solution for LEAs instead of conserving such exceptional access for high-priority cases. This is not a technical problem; this is a culture problem and maybe a policy problem. There is a need to ensure a high threshold for granting such access, not for every and any kind of problem,” another speaker said. A lawyer compared this to imposing the Unlawful Activities (Prevention) Act (UAPA) even on the most minor of offences.

Even today, there are tons of teams across the country whose job is to break into email accounts and servers to get access for law enforcement agencies, a speaker said. Another said that even when they receive requests from law enforcement agencies, they are not in the proper format and thus not actionable, and often ask for too much information.

During one of the hearings of the WhatsApp traceability case in the Madras High Court, Facebook had also argued that police personnel often abused the system and approached Facebook with personal requests. Senior Advocate Mukul Rohatgi had then argued, on behalf of Facebook, that the Tamil Nadu government had not given details about how these requests had been filed — some requests were missing FIR numbers, some were not in the proper format, some were not mandated by a court. He had assured the court that proper requests had always been considered.

4. Transparency Reports from tech companies will play a huge role in improving accountability

A speaker proposed that we could start asking for more details in the transparency reports that companies like Google and Facebook publish. “There could be more detail around the numbers as well. For example, any time the government asks Google to take URLs down, Google should actually disclose what the information was and who asked for it. For example, it says that the Kerala High Court asked for some information to be taken down related to a lynching, and we have taken it down or we have not taken it down. Some measure of transparency there will help in a big way. This provides accountability. Are we going crazy and just using it on everybody or are we using it only where it is required?”

Problem: The government, as well as private companies, often cite national security concerns to shroud such information. Thus, it becomes even more necessary to define national security narrowly, and to prove such concerns instead of just citing them, as other speakers proposed.

Questions that remain

  • How do we enable traceability on the basis of content?
  • How do we ensure transparency and accountability without compromising on national security?
  • If someone refuses to give access to their device, do they incriminate themselves?