
How does end-to-end encryption impact human rights? The good and the bad

A Meta-commissioned report explores the human rights pros and cons of end-to-end encrypted messaging platforms.

“End-to-end encryption of messaging directly enables the right to privacy, which in turn enables other rights such as freedom of expression, association, opinion, religion, movement, and bodily security,” Business for Social Responsibility (BSR) said in its report on the impact of end-to-end encryption (E2EE) published on April 4.

While end-to-end encryption is promoted by companies and civil society groups for its privacy benefits, governments and law enforcement agencies have opposed it because E2EE makes it harder for them to track down criminals. The BSR report lays out the arguments on both sides of this debate.

Meta (formerly Facebook) commissioned BSR in October 2019 to undertake a human rights impact assessment of extending E2EE across its messaging services (WhatsApp, Messenger, and Instagram DM), based on the UN Guiding Principles on Business and Human Rights.




What is end-to-end encryption?

Without getting into the technical aspects of it, end-to-end encryption can be seen as a way to scramble messages such that only the sender and the recipient can decipher them. Messages are encrypted using complex cryptographic methods on the device of the sender and decrypted on the device of the recipient, and even the company providing the messaging service cannot view the contents of messages.
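The idea can be illustrated with a toy Python sketch. The XOR cipher below is purely illustrative and not secure; real E2EE systems such as the Signal protocol use authenticated key exchange and modern authenticated ciphers. The point is only the trust model: encryption and decryption happen on the endpoints, and the relaying server never holds the key.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher: XOR each byte with the key (illustration only, NOT secure).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Only the sender and recipient hold the key; the server never sees it.
shared_key = secrets.token_bytes(32)

plaintext = b"meet at noon"
ciphertext = xor_cipher(shared_key, plaintext)   # encrypted on the sender's device

# The messaging server relays only ciphertext, which is opaque without the key.
assert ciphertext != plaintext

decrypted = xor_cipher(shared_key, ciphertext)   # decrypted on the recipient's device
assert decrypted == plaintext
```

In real deployments the per-conversation key is established between the two devices with a key-exchange protocol (such as Diffie-Hellman), so the service provider never learns it.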


Currently, messaging services like WhatsApp and Signal have E2EE enabled by default; Facebook Messenger offers a voluntary opt-in for each message thread; and Instagram DM is testing E2EE.

In order for third parties to gain access to the contents of a message sent using E2EE, they must go directly to a party in the conversation, have physical access to the device, or have hacked into the device itself via spyware or other means, the report explained.

Why is end-to-end encryption important for human rights?

  1. Rising authoritarianism: “We are living through an age of rising authoritarianism by governments, who are placing increased restrictions on the civic space available for citizens to enjoy their rights. The 2021 Freedom House Freedom in the World report found that 2020 was the 15th consecutive year of decline in global freedom,” the report stated. Because authoritarian strategies and tactics increasingly play out online through surveillance, spyware, and other methods, the privacy protection offered by end-to-end encryption is increasingly relevant.
  2. Defence against cyberattacks: E2EE is part of the “natural evolution of digital security to address increasingly technically sophisticated threats,” the report stated. “Our social infrastructure—everything from utilities to banks and healthcare services—is increasingly vulnerable to cyberattacks by bad actors. Catastrophic failures of digital systems would have a significant impact on our human rights, and widespread encryption (of both data in transit and data at rest) is one of the key strategies to prevent that failure from happening,” the report explained.
  3. Growth in sensitive communications: “We are witnessing a growth of sensitive communications taking place online, a trend that has only accelerated with COVID-19. Whether it is telemedicine, working remotely, or simply staying in touch with friends and families spread around the world, more of our private communications than ever before are taking place over platforms, apps, and services that rely on encryption to keep them secure,” the report stated.

How does end-to-end encryption increase the realisation of human rights?

  1. The knock-on benefits of privacy: The direct benefit of E2EE is privacy, but privacy, in turn, brings about other human rights benefits. “By ensuring the privacy of communications, end-to-end encrypted messaging enables people to freely form opinions, express themselves, share information, associate, and assemble without fear of retribution,” the report stated. For example, it allows community members to maintain cultural ties in contexts where their culture is socially or legally repressed. Another example is that it enables and protects labour union communication and activity in places and contexts where labour rights are restricted.
  2. Physical safety: E2EE is vital to physical safety because it keeps human rights defenders, journalists, and political dissidents safe from authoritarian governments, women safe from spying partners or family, and members of the LGBTQIA+ community safe from adversarial governments or citizens. “The consequences of malicious actors intercepting the communications of a human rights activist or a journalist investigating corruption could be arbitrary detention, bodily harm, torture or other cruel or inhumane treatment, or even death,” the report explains.
  3. Child rights: E2EE gives children increased privacy, greater opportunities for freedom of opinion and expression, and physical safety. But there are significant harms to children as well (more below).
  4. Access to remedy: The privacy protections of E2EE increase “the likelihood and security of whistleblowing, reporting, and exposing human rights violations, which thereby increases the likelihood of remedy,” the report states.
  5. Participation in government: The privacy protection allows citizens to “more freely and safely discuss and facilitate participation in government in situations where there are attempts to interfere with free and fair elections,” the report states.

What are the human rights risks that might arise from end-to-end encryption?

The report also examined possible human rights risks arising from harmful activity that end-to-end encryption makes difficult to detect:

  1. Child sexual abuse and exploitation: E2EE inhibits the ability of platforms to effectively detect, remove, and report Child Sexual Abuse Material (CSAM), as well as content or accounts related to grooming, sexual extortion of children, child sex tourism, child prostitution, and trafficking of children, among other harms.
  2. Virality of hate speech and misinformation: E2EE has the potential to amplify and spread hate speech and mis/disinformation in a way that leads to, or exacerbates, human rights harm because viral instances of content may be challenging to detect. For example, content that intends to harass users, based on characteristics such as gender, religion, ethnicity, LGBTQIA+ status, or political views, might be shared on end-to-end encrypted messaging platforms, but not reported or removed.
  3. Malicious coordinated behaviour: Malicious coordinated behaviour, both authentic (i.e., by real people using real accounts) and inauthentic (i.e., by people using fake accounts), might be more difficult to detect and address in an E2EE environment and can undermine the integrity of social media platforms and messaging services.
  4. Illegal goods sales: E2EE can make illicit sales of weapons, drugs, or cyber-fraud services hard to detect.
  5. Human trafficking: E2EE may be used to facilitate sex trafficking, labour trafficking, organ trafficking, and child marriage. “Constantly switching between different open and closed-communications messaging platforms is a technique that traffickers use to facilitate illegal advertising, recruitment, control, punishment, and coercion of victims,” the report states.
  6. Terrorism, violent extremism, and hate groups: “Violent extremist and terrorist groups have proven to be tech-savvy and have increasingly used end-to-end encrypted messaging platforms to communicate with followers, disseminate propaganda, incite violence, and coordinate terrorist attacks that result in loss of life and bodily harm,” the report states.
  7. Violate others’ privacy: E2EE might be used to share content that violates people’s privacy, such as non-consensual intimate images.

The dilemma of balancing opportunities and risks

“This debate sets two opposing groups against each other in the name of two potentially competing human rights—privacy and security. In this debate, a “privacy side” makes the case that end-to-end encryption provides vital protections to users in an age of mass surveillance and pushes law enforcement toward more targeted and rights-respecting intelligence and evidence gathering; meanwhile, a “security side” argues that end-to-end encryption provides a safe haven for criminals, terrorists, traffickers, and child abusers, and makes it more difficult to bring these groups to justice,” the report stated.

  1. Risks largely associated with bad actors: “In contrast to the opportunities, the human rights risks of Meta’s expansion of end-to-end encryption are largely associated with the actions of bad actors using an end-to-end encrypted environment to disregard terms of service, violate the law, and adversely impact the rights of others. […]  It is also important to note that compared to the opportunities, which extend to all users of Meta’s messaging platforms, the risks of end-to-end encrypted messaging are relatively targeted,” the report stated.
  2. Sophisticated bad actors will use other E2EE platforms: If Meta decided not to implement end-to-end encryption, the most sophisticated bad actors would likely choose other end-to-end encrypted messaging platforms. “For this reason, choosing not to provide end-to-end encryption would likely not result in an improved ability to help law enforcement identify the most sophisticated and motivated bad actors,” the report explained.

“BSR’s assessment is that in and of itself, end-to-end encryption does not “cause” or “contribute” to (i.e., enable, facilitate, incentivize, or motivate) harm because nearly all the adverse human rights impacts that could be attributed to end-to-end encryption already occur in non-end-to-end encrypted messaging.” – BSR report.

The “slippery slope” risk of CSAM detection

One of the most challenging debates related to end-to-end encryption is whether companies should scan messages to detect and report child sexual abuse material, the report stated.

  • What is happening currently on unencrypted platforms: “Currently, Meta scans messages to detect and report known CSAM on its unencrypted messaging platforms as part of an industry-wide effort in collaboration with government authorities and civil society. It also scans unencrypted content in WhatsApp such as profile photos, group names and descriptions, and user reports, which it will still be able to do with the expansion of end-to-end encryption. While scanning and removal of known CSAM is not a catch-all solution for preventing child sexual abuse online, it is a mitigation against the revictimization of pictured victims and assists in the identification of those distributing CSAM,” the report stated.
  • Scanning can continue in E2EE environments using hash-based solutions: To continue to scan message content in an end-to-end encrypted messaging context, platforms would need to use one of several nascent hash-based solutions often collectively referred to as “client-side scanning.”
  • But, such solutions can be misused: “Government regulation of online content has grown enormously in recent years, both in the legitimate pursuit of safe and rights-respecting online spaces and the illegitimate pursuit of censorship and oppression. Therefore, even if cryptographic integrity-maintaining client-side scanning for CSAM in end-to-end encrypted messaging were technically feasible, there is a risk that this capability could be abused by governments to require Meta to block and report legitimate content that a government dislikes,” the report explained.
  • Hash-based systems do not deal well with nuanced content: Since hash-based systems rely on having an exact or near-exact copy of the content that has been hashed, this makes dealing with nuanced content very difficult. “As a result, seeking to moderate content such as hate speech or harmful dis/misinformation, would likely result in the removal of too much legitimate content, constituting an undue burden on freedom of expression,” the report stated.
  • Only clearly violating content should be proactively moderated: “There is currently no consensus on where to draw the line on content moderation in a messaging context. With the existence of large group messages, messaging platforms can sometimes seem like a quasi-public space and face many of the same content issues seen in open social media platforms. However, messaging is still largely a private space, and moderation of anything other than content that always and clearly constitutes a human rights violation (such as CSAM) would be an unnecessary and disproportionate infringement on privacy and freedom of expression,” the report stated.
  • More research should be done in this area: The report concluded that “Meta should continue investigating client-side scanning techniques to detect CSAM on end-to-end encrypted messaging platforms, in search of methods that can achieve child rights goals in a manner that maintains the cryptographic integrity of end-to-end encryption and is consistent with the principles of necessity, proportionality, and nondiscrimination.” Homomorphic encryption may potentially meet these requirements because it allows for the processing of data in its encrypted state, but this is not technically feasible to implement in messaging at scale, the report stated.
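The mechanics of hash-based client-side scanning can be sketched in a few lines of Python. This is a simplified illustration with a hypothetical blocklist and illustrative byte strings; real systems such as PhotoDNA use perceptual hashes that tolerate small modifications, not the exact cryptographic hash used here.

```python
import hashlib

# Hypothetical blocklist of hashes of known prohibited files (illustrative values only).
BLOCKED_HASHES = {hashlib.sha256(b"known-prohibited-file-bytes").hexdigest()}

def client_side_scan(attachment: bytes) -> bool:
    """Hash the outgoing attachment on the sender's device, before encryption,
    and check it against the blocklist of known content."""
    return hashlib.sha256(attachment).hexdigest() in BLOCKED_HASHES

# A matching file is flagged on-device before it is ever encrypted and sent.
assert client_side_scan(b"known-prohibited-file-bytes") is True

# Any change to the bytes produces a different hash, so altered or nuanced
# content is not caught by exact-hash matching.
assert client_side_scan(b"known-prohibited-file-bytes!") is False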

This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.

MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
