American businesses that sell surveillance products and services could refuse to serve the Indian government if they choose to follow the due diligence guidance released by the US State Department and undertake the human rights review it recommends. Of the 22 laws and policies that the State Department has highlighted as red flags that make a surveillance product susceptible to human rights abuse, India has enacted at least 14. These include mandatory SIM registration, curbs on free media, persecution of dissidents, tracking of people’s online activities, and a nationwide facial recognition programme, among others.

However, the key point to remember is that the document, officially titled the Guidance on Implementing the UN Guiding Principles for Transactions Linked to Foreign Government End-Users for Products or Services with Surveillance Capabilities, is not a legal or regulatory requirement under American law. It is meant to guide businesses that do not require the US government’s authorisation for export but still want to carry out a human rights review.

What products are included in the guidance?

It covers all products or services with intended or unintended surveillance capabilities, including the ability “to detect, monitor, intercept, collect, exploit, preserve, protect, transmit, and/or retain sensitive data, identifying information, or communications concerning individuals or groups”.

It includes sensors; biometric identification; data analytics (social media analytics software, predictive policing systems); internet surveillance tools (spyware, penetration testing tools, jailbreaking tools, etc.); non-cooperative location tracking (products used to track people’s locations without their knowledge and consent, cell site simulators, automatic licence plate readers); and recording devices (body-worn or drone-based, CCTV cameras, recording devices that can remotely transmit or be remotely accessed).

What is the aim of the guidance then?

It is to help US businesses decide whether or not they should continue to do business with foreign governments that “will likely misuse the product or service to carry out human rights violations or abuses”. It is meant to help businesses identify products/services that can be misused to commit such violations or abuses. It covers US businesses that work with or design/manufacture surveillance products.

What qualifies as misuse?

Misuse could include subjecting entire populations to arbitrary or unlawful surveillance, violating or abusing the right to be free from arbitrary or unlawful interference with privacy as set out in the Universal Declaration of Human Rights (UDHR) and the International Covenant on Civil and Political Rights (ICCPR).

Misuse could also include steps taken to stifle dissent, harass human rights defenders, intimidate minority communities, discourage whistleblowers, chill free expression, and target political opponents, journalists and lawyers. The guidance argues that violation of the right to privacy could result in abrogation of other human rights, such as freedom of expression, the right to hold opinions without interference, freedom of association and peaceful assembly, and freedom of religion or belief.

Why could businesses cease doing business with the government of India?

The document lists a litany of laws, regulations and policies that exemplify red flags from a human rights perspective. These include criminal punishments for “political/anti-government” and “anti-national” online content, curbs on the independent press, use of city-wide or national surveillance or data collection technology without appropriate safeguards, requiring cyber cafes to track clients’ online activities, broad definitions of terrorism and extremism to legitimise arbitrary and unlawful interception of platform users, mandatory SIM card registration, national/regional facial recognition programmes that target individuals for exercising their human rights, restrictions/limits/bans on foreign funding of NGOs, data localisation requirements without safeguards against abusive government searches and seizures, etc.

Using the Unlawful Activities (Prevention) Act and by citing national security, the Indian government has jailed journalists over the last few years for their tweets. The government is in the process of building a nationwide Automated Facial Recognition System, a centralised, searchable database of images of people’s faces. If that were not enough, Telangana used facial recognition to deal with impersonation in its civic elections.

India’s Cyber Café Rules 2011 make it mandatory for cyber café owners to maintain a record of all users who access their services and to keep a log of their online activities, including a history of the websites accessed and the proxy servers installed. To get a new SIM in the country, subscribers have to furnish certain identity documents.

As it is, foreign funding to NGOs is routed through the Home Ministry. Under the Foreign Contribution (Regulation) Amendment Act, 2020, recently passed by both houses of Parliament, the government has granted itself unfettered power to investigate organisations that receive foreign funds. Multiple activists and lawyers see the amendments as a means to chill opposition to and criticism of the government. The amended Act has already claimed a casualty: Amnesty India.

Featured in the draft e-commerce policy, the Personal Data Protection Bill, the report on non-personal data and the RBI guidelines for payment service providers, amongst a host of other policies, data localisation has emerged as the Indian government’s pet project. The ostensible aim is to protect citizens’ data and to give law enforcement agencies easier access to it.

And to date, the Indian government has not denied purchasing Pegasus to spy on lawyers, activists and journalists in India.

The guidance document also red-flags policies that prohibit anonymous profiles on online messenger applications and social media platforms. While there is no Indian law that prohibits anonymous accounts, the Indian government is pushing to introduce traceability on end-to-end encrypted communication platforms, which could effectively eliminate anonymity.

What would due diligence include?

In line with the UN Guiding Principles, it would include:

  • Assessing and addressing risk: the amount and depth of due diligence would be determined by the severity and likelihood of an adverse impact, depending on the type of product/service and the end user’s operating context.
  • Ongoing monitoring and evaluation: iterative, responsive and adaptable processes that monitor, evaluate and incorporate feedback to check the effectiveness of existing mechanisms.
  • Stakeholder engagement: ongoing communication with stakeholders whose human rights could be affected by the business’s activities. For this, the guidance lists NGOs and international organisations such as Access Now, Citizen Lab, Amnesty International and Human Rights Watch.
  • Public communication: at least annual communication of the business’s commitment to a rigorous internal and external review of human rights risks and to adequate measures to address those risks.
  • Grievance mechanism: secure, accessible, and responsive communication channels for internal and external reporting of possible misuse of a product or service.
  • Alignment with human rights instruments such as the UDHR, the ICCPR, the OECD Guidelines and the UN Guiding Principles.

How will businesses evaluate the human rights impact of a potential transaction?

They could consider factors such as whether the primary purpose or inherent capability of the product/service is to collect sensitive data that can be linked to an individual, or to analyse datasets to derive sensitive insights about identified or identifiable individuals. Companies should also consider whether the product/service can do this without modification, irrespective of its design or intended use, and whether the capability is unique or custom-built or widely available from other suppliers.

This guidance raises ethical questions around the use of open-source intelligence (OSINT), a service offered by multiple companies around the world that analyses openly available data, such as social media accounts, to create digital dossiers of people.

What should be red flags for businesses?

The document identifies certain red flags that can come up during basic due diligence and would warrant further assessment or due diligence. Depending on the severity of the information found, the business can consider ending its dealings with the foreign government. Red flags include:

  • History of the buying government: This includes the government’s human rights record in general and instances where a similar product or service has been used by the buying government to commit human rights violations/abuses. It also includes the purchasing foreign government agency having misused the product/service for something other than a legitimate law enforcement purpose. Here, the proximity between the agency that actually purchases the product/service and the agency that violated human rights in the past is an important consideration. A history of the foreign government exporting products/services to other countries that violate human rights is also a red flag.
  • Legal and regulatory framework: Laws, regulations or policies of the foreign government that unduly restrict civic space or target members of groups solely on the basis of race, sex, language, religion, political opinion, national origin, or any other grounds inconsistent with international human rights law. Laws that give disproportionate access to information and communications technology company data without reasonable safeguards and appropriate oversight are also a red flag, as are policies and laws, including national security and counterterrorism related laws, that “appear to unduly” restrict freedom of expression or privacy. So is the lack of legislation governing government access to communications.
  • Ongoing conflict in the region where the transaction is supposed to occur.
  • Persecution of individuals and groups: Ongoing abuse or arbitrary detention of members of minority groups, civil society members, or journalists, as well as a history of unlawful data collection against dissident groups, is a red flag.
  • Lack of independent judicial oversight or rule of law.
  • Data localisation requirements.
  • Transaction includes products/services that can be used to build or customise a system that is known to be used to commit human rights violations.

What if the business sells surveillance products despite risks?

The document advises tailoring the product/service “to minimize the likelihood” that it will be misused. This includes:

Feature constraints, such as integrating safety-, privacy- and security-by-design features (for instance, mechanisms for individuals to report misuse of the product), removing certain capabilities before sale, preventing interconnected products from being misused, limiting use to the authorised purpose, limiting upgrades, software updates and direct support that enhance or provide new surveillance features, and providing for data minimisation.

Contractual and procedural safeguards such as:

  • Placing conditions on intellectual property associated with the use of the product/service.
  • Including human rights safeguards language in contracts that is specific to the risks identified or associated with the product/service.
  • When the ultimate end user may not be known, businesses should require an end-user licence agreement with human rights safeguards language and require resellers to conduct their own human rights due diligence in case of resale.
  • Including end-use limitations, clauses requiring end users to agree to comply with applicable US export control laws and regulations, and limitations on how the product or service can/cannot be used.
  • Restricting how and by whom collected data is analysed, stored, protected, and shared.

Retained seller’s rights and features, such as:

  • Right to terminate access to technology, to deny software updates, training, and other services, and to unilaterally terminate the contract if the seller uncovers evidence that the technology is being misused.
  • Features and contractual provisions that allow the seller to terminate access to the product if required. These could include cloud-based access to the product/service instead of on-premises installation, periodic renewal of licence keys, and routine human rights due diligence training for all employees involved in the transaction.

Grievance mechanisms that allow individuals to confidentially and/or anonymously report misuse of products. The business should ensure that the reporter’s safety is not compromised and regularly review the communication channel to ensure its effectiveness. All complaints of misuse must be thoroughly investigated.

Public reporting on sales practices and on credible complaints, incidents and resolutions, without jeopardising the safety and security of the whistleblower. This report can be published on a website or in the public annual report.