By Anand Venkatanarayanan

The use of instant messaging tools spread worldwide because they solve a basic human need: to talk and communicate with others. In that respect, they are no different from a postcard or a telephone. The distinguishing factor, however, is that these tools can do something a postcard or a telephone cannot: allow people to communicate without being overheard.

Encryption, then, can be defined as the capability to communicate without being overheard.

Not long ago, encryption was classified as a munition and its use was severely restricted to military systems. As early as 1997, law enforcement agencies were concerned that the widespread availability of encryption technologies would make crime fighting and national security work impossible:

Uncrackable encryption will allow drug lords, spies, terrorists and even violent gangs to communicate about their crimes and their conspiracies with impunity. We will lose one of the few remaining vulnerabilities of the worst criminals and terrorists upon which law enforcement depends to successfully investigate and often prevent the worst crimes. 

The solution that was suggested was “key recovery”, which, in the words of Louis J. Freeh, then Director of the Federal Bureau of Investigation (FBI), works like this:

Under one type of key recovery approach, a decryption "key" for a given encryption product is deposited with a trustworthy key recovery agent for safe keeping. The key recovery agent could be a private company, a bank, or other commercial or government entity that meets established trustworthiness criteria……

Good and sound public policy decisions about encryption must be made now by the Congress and not be left to private enterprise. Legislation which carefully balances public safety and private enterprise must be established with respect to encryption.

The Moxie-Perrin Solution

The intermediary key-depository solution was based on the assumption that encryption technology can be controlled and shaped by government intervention. For it to work, all intermediaries (including corporations offering instant messaging services) must be brought under the ambit of the law by forcing them to share their decryption keys when asked by government agencies.

The legal cover was provided by Title V of the US PATRIOT Act, though some parts of it were later struck down as unconstitutional. The Snowden revelations of 2013 further illustrated the technical capabilities the US and UK governments had acquired to defeat encryption: inserting hidden backdoors into commercial applications and influencing encryption standards to make them vulnerable to hacking by government agencies.

For corporations that care about user privacy and security, this presented a unique problem: their financial and technical resources are not sufficient to ward off nation states, so they could no longer trust themselves to keep user data secure and private. Their only defense was to push for advancements in encryption technology that even they cannot break, even if ordered to by a court.

The Signal Protocol was one such advancement, developed by Matthew Rosenfeld (better known as Moxie Marlinspike) and Trevor Perrin in 2013. Notice how this roughly coincides with the year of the Snowden revelations. The protocol was eventually adopted by WhatsApp in 2014.

The Signal Protocol

A full technical treatment of the Signal Protocol can be found on the Signal blog, but it is not comprehensible without some training in cryptography. It is, however, possible to give non-technical readers a reasonable approximation of how it works, using a box-lock-and-key analogy.

  1. Imagine that a secret message is locked in an indestructible box which has only one key that can lock and unlock it. The box is sent to the receiver through the postal service.
  2. The only way for anyone else to read the message is to steal the key, make a copy of it, and open the box at the post office whenever it passes through. This, of course, presents a problem for both the sender and the receiver.
  3. What if they come up with a scheme where, every time a message is sent in the box, a new lock-and-key combination is used? But how would the other party get the new key?
  4. An elegant way to solve the new-key problem is to write the instructions for creating the next lock and key inside the current secret message itself. If boxes are never lost in transit, every message will travel in a box with a fresh lock-and-key combination.
  5. Further, both parties can destroy the old keys, thus avoiding the key-stealing problem forever.

The Signal Protocol used by WhatsApp applies the idea described above: the encryption keys used by sender and receiver are continuously discarded, and new keys are generated for the next message once the previous one has been sent and received successfully. This one-way, irreversible process is called a “ratchet”.
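The ratchet idea can be sketched in a few lines of Python. This is a deliberately simplified toy, not the real double ratchet: the hypothetical `next_key` step stands in for Signal’s key-derivation chain, and the XOR “cipher” is for illustration only.

```python
import hashlib
import hmac
import secrets

def next_key(key: bytes) -> bytes:
    """Derive the next key in the chain; the step cannot be reversed."""
    return hmac.new(key, b"ratchet-step", hashlib.sha256).digest()

def keystream_encrypt(key: bytes, message: bytes) -> bytes:
    """Toy XOR cipher keyed by the current chain key (illustration only)."""
    stream = hashlib.sha256(key + b"keystream").digest()
    while len(stream) < len(message):          # extend stream to cover message
        stream += hashlib.sha256(stream).digest()
    return bytes(m ^ s for m, s in zip(message, stream))

# Both parties start from the same shared secret (step 4 of the analogy).
shared = secrets.token_bytes(32)
alice_key, bob_key = shared, shared

for text in [b"first message", b"second message"]:
    box = keystream_encrypt(alice_key, text)        # lock the box
    assert keystream_encrypt(bob_key, box) == text  # same key unlocks it
    # Both sides ratchet forward and discard the old key (step 5).
    alice_key, bob_key = next_key(alice_key), next_key(bob_key)
```

Because `next_key` is a one-way hash, holding the current key tells an eavesdropper nothing about the keys that protected earlier messages once they have been discarded.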

Since the intermediary never sees or stores these encryption keys, a government agency cannot demand the unencrypted messages exchanged between a sender and a receiver. This technical barrier, created by advancements in encryption technology, makes it futile for government agencies to pressure intermediaries into handing over user data through the force of law: there is nothing to hand over.

As Moxie describes in his blog, the intermediary has nothing to share (“When we receive a subpoena for user data and have nothing to send back but a blank sheet of paper”). 

The Rubin Problem

The powerlessness that Mr Rubin felt when his reputation was tarnished by derogatory posts on Facebook was amplified by Facebook’s failure to act. It led him to file a PIL in the Madras High Court demanding a remedy. The court expanded the scope of Mr Rubin’s original petition against Facebook to include all social media services, including WhatsApp. By doing so, it dived headlong into the hard problem of end-to-end encryption and the feasibility of adding traceability into the mix.

A full reading of the court proceedings, as reported by MediaNama, indicates that both the court and Mr Rubin have proceeded on the assumption that it is possible to add traceability (the phone number of the original sender) to any message, which can then be revealed later through a court-monitored process.

You know terrorists can also probably use it. Isn’t it a national concern then? So terrorists communicating using the same platform as army men, our uniformed services are not able to track these guys, isn’t that making the country vulnerable in that case? Am I right or wrong? Because that’s the flip side then. (Source)

You know, encryption is the whole deal. So tomorrow if the government says you will be banned if you don’t decrypt, or at least, give the genesis of the forward basically, they will definitely fall in [line]. (Source)

Whatever I have heard from experts from IIT Madras, you know people who are very tech savvy, I am not very tech savvy, they say that it is possible. It is just a structural difference to embed the number in WhatsApp. (Source)

A couple of observations on the above:

  1. Mr Rubin’s concerns about encryption are exactly the same as those of the Director of the FBI, 22 years ago. Both argue for a balance between functionality and surveillance, allowing restricted access to decryption keys based on a vague classification of users, and both ignore all the precedents in the cyber domain showing that such classifications are fictitious.
  2. There is reliance on outside experts from IIT Madras (specifically Prof. Kamakoti), who are of the opinion that it is possible to achieve traceability without breaking encryption.

The Kamakoti Solution

The affidavit filed by Prof. Kamakoti and his interview with MediaNama indicate that his proposed solution for traceability consists of the following:

  1. Adding originator information to every message.
  2. A permission-based system that lets users mark a message as forwardable or not forwardable.

To understand why (1) is infeasible, a deeper explanation of the Signal Protocol is required. One of its primary goals is cryptographic deniability. In lay terms: if Nikhil received a message from Anand, he can be absolutely sure that only Anand could have sent it, but he cannot prove to anyone else that it was indeed Anand who sent it.

This mind-bending guarantee is possible because each pair of participants (Nikhil and Anand) uses a shared secret known only to them. This secret is used to create an encryption key, which in turn encrypts the messages they send each other. Since the key is known only to the sender and the recipient, a third party can never be sure which of the two produced a given message: it could just as easily have been constructed by the recipient.
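The essence of this deniability can be sketched with symmetric message authentication (Python standard library). This is a simplification of the actual scheme, which uses authenticated encryption keyed by Diffie-Hellman shared secrets; the point it illustrates is that anything one party can produce with the shared key, the other party can produce too.

```python
import hashlib
import hmac
import secrets

# Anand and Nikhil share a secret known only to the two of them.
shared_key = secrets.token_bytes(32)

def tag(message: bytes) -> bytes:
    """Authenticate a message with the shared key."""
    return hmac.new(shared_key, message, hashlib.sha256).digest()

msg = b"meet at noon"
t = tag(msg)

# Nikhil verifies the tag: only a holder of shared_key could have made it,
# so he knows the message came from Anand (he did not create it himself).
assert hmac.compare_digest(t, tag(msg))

# But Nikhil holds the same key, so he could have forged an identical tag.
# To a judge, a valid tag proves nothing about WHO produced the message.
forged = hmac.new(shared_key, b"meet at noon", hashlib.sha256).digest()
assert forged == t
```

The receiver gains certainty for himself, yet that certainty is non-transferable: a valid tag is equally consistent with either party having written the message.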

The Signal Protocol added yet another technical innovation that takes cryptographic deniability even further. It got rid of key signing and instead uses unsigned one-time keys to generate the shared secrets that encrypt messages. The cryptographic effect of this modification is stunning. As the Signal blog explains:

Since there are no signatures involved, anyone could take A’s public key, make up an ephemeral keypair for A (“a” in the diagram above), combine that with their own identity key and ephemeral key (“C” and “c”), and produce an entire forged transcript – even if they’ve never had a conversation with “A” before. Now anyone is capable of easily producing a forged message from anyone else, whether they’ve actually had a conversation with them before or not.

In layperson’s terms, the Signal Protocol mimics whispering in someone’s ear through a complex cryptographic scheme: the receiver of the whisper knows what was said and by whom, but can never prove the identity of the originator to anyone else.

Given this reality, any originator information added to a message, whether an ID or a phone number, would not constitute proof in a court of law that the message was indeed sent by that originator: deniability is baked into the protocol’s encryption architecture and cannot be worked around technically. Hence the claim that originator information can be added while WhatsApp’s encryption remains unchanged is unsupported by the protocol’s design.

The proposal by Prof. Kamakoti also contains other elementary errors. It assumes that messages are encrypted and decrypted only via public-key cryptography, which was invented in 1976. But as the discussion above illustrates, the Signal Protocol used by WhatsApp is far more modern: it combines symmetric and public-key cryptography with a double ratchet that discards each encryption key after a single use.

Encryption Bans Are a Non-Starter

A typical response from policymakers when they learn that encryption cannot have hidden backdoors is incredulity, followed by attempts to ban cryptography itself. For example:

  • Senators Feinstein and Burr published a draft bill in 2016 that would have banned strong encryption of any form within the United States.
  • The Five Eyes countries (Australia, New Zealand, the US, the UK and Canada) issued a joint statement in 2018 declaring encryption a serious problem that, in their view, ought not to be used at all.
  • The Australian government went even further and passed legislation requiring companies to hand over encrypted data to it, thereby giving the “hidden backdoor” approach the force of law.

Unlike a gun or any other physical weapon, encryption is just a set of mathematical operations that can be executed on any modern computer, and hence cannot be banned by appropriating or impounding anything.

With ready-made libraries like Signal’s available for instant download in every major programming language, implementing encryption takes only a few lines of code for any application developer. Hence a ban on WhatsApp, or a move to add hidden backdoors, even if it were technically feasible (it is not), will not work: it is very easy to switch to an alternative without the backdoors, or even to use open messengers with PGP encryption.
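To see how little code is involved, here is a toy authenticated encryption scheme using only Python’s standard library. This is an illustration of the point, not production cryptography; real applications should use a vetted library such as libsodium or Signal’s own.

```python
import hashlib
import hmac
import secrets

def _stream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Derive n bytes of keystream from key and nonce (toy construction)."""
    out = b""
    while len(out) < n:
        out += hashlib.sha256(key + nonce + len(out).to_bytes(8, "big")).digest()
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)                      # fresh per message
    ct = bytes(p ^ s for p, s in zip(plaintext, _stream(key, nonce, len(plaintext))))
    mac = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + mac

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct, mac = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(mac, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("message tampered with")
    return bytes(c ^ s for c, s in zip(ct, _stream(key, nonce, len(ct))))

key = secrets.token_bytes(32)
assert decrypt(key, encrypt(key, b"hello")) == b"hello"
```

If a single platform is banned, nothing stops the next application from shipping the same few dozen lines, which is why bans chase the tool rather than the idea.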

Conclusion

Encryption works, and the only way to bring criminals to justice is to invest in offensive capabilities that combine signals and human intelligence. Neither banning encryption nor pretending that cryptographers can conjure magical solutions offering a hidden backdoor usable only by law enforcement agencies will help.

A clear example of how a hidden backdoor built for law enforcement can be turned against its creators is the case of Chinese hackers who gained access to the Gmail accounts of US government officials using the lawful-intercept backdoors built for the US government. This is why the Kamakoti proposal is worrying: it presents an impossible solution as a feasible one, unsupported by logic, reason, or facts.

As legal scholars have already noted, PIL jurisdiction suffers from not applying the adversarial procedure of hearing all sides. When technical proposals that have been neither peer reviewed nor vetted in the public domain are taken as gospel truth by a court that has no means to verify their feasibility, it becomes even easier to issue orders devoid of reasoning.

One thing, however, is clear no matter what the court rules: end-to-end encryption is a genie that is out of the bottle. It cannot be put back, because it is an idea expressible as a neat mathematical formulation (one that also bears the invisible hand of Ramanujan in simplifying the calculations), which anyone can execute at near-zero cost.

And that idea is the freedom to talk to others without worrying about being snooped upon.

*

Anand Venkatanarayanan is a cybersecurity expert. Views expressed here are personal and do not reflect the views of his employer or of MediaNama

Note: This post is published under the CC-BY 4.0 licence. You may republish this post with credit to the author (Anand Venkatanarayanan) and a link-back to MediaNama.