The involvement of the private sector in the deployment of facial recognition technology in governmental processes is raising questions related to privacy, particularly regarding the ‘indiscriminate use of various datasets’, according to a study carried out by the Vidhi Centre for Legal Policy, a New Delhi-based think tank.
There has been a major rise in the use of artificial intelligence and the deployment of facial recognition in India by local law enforcement agencies. Since many of these facial recognition projects are being undertaken by private agencies, there is a need to examine the issues that arise from this involvement.
Are surveillance activities being outsourced to private corporations?
In the working paper titled ‘Procurement of Facial Recognition Technology for Law Enforcement in India: Legal and Societal Implications of the Private Sector’s Involvement’, authors Ameen Jauhar and Jai Vipra, both senior resident fellows at Vidhi, pointed out that there is no transparency regarding the roles of private corporations that work with governments to deploy FRT.
Private companies should not be empowered to surveil citizens: “The opacity which shrouds the current engagements of state police forces or governments in India with limited private entities, their roles and scope of engagement, and the access they arguably can continue to have over the underlying algorithm, warrants serious questions on the plausible and dangerous merger of state functions with a private entity,” said the working paper.
What about breach of data?: The working paper pointed out that, in the absence of legislation such as the Personal Data Protection Bill, and given the lack of public scrutiny of the private sector’s operations in this area, there is a concern about how the datasets being used are safeguarded. “In fact, in India, there have been reported instances of data leaks from FRT applications being used by local police agencies,” the paper read.
Who is legally liable for FRT, the State or private sector?
From a legal liability perspective there are three main concerns – first, the liability of the private corporation or developer of the FRT algorithm; second, the liability of the state for deploying a flawed algorithm; and third, potential of holding the algorithm liable, per se — Working Paper
Secrecy offers immunity from legal liability for private sector: The authors of the working paper pointed out that secrecy regarding the operations of facial recognition technology gives ‘de facto immunity from legal liability as it is nearly impossible to build a proper case’. They also took cognisance of the challenge they faced while trying to piece together information on how state governments engaged with private corporations.
Arbitrary ecosystem: “What results from this is an arbitrary ecosystem where despite high risks of surveillance, privacy infringement, transgressions against due process, and a real threat to constitutional and legal rights, there is no meaningful recourse,” the working paper read.
Private goals are driving public properties
Profit and proliferation: The authors said that often these technologies are offered to law enforcement agencies for free trials and with cheap licenses. “While there may be nothing wrong with offering discounts to public agencies, this practice in a legal vacuum and in the absence of public deliberation reflects a mispricing of the technology,” the paper read.
In effect, not only was the technology [Clearview AI, a US-based FRT vendor] deployed without consultation with the public or without a social welfare assessment, it was also deployed at an artificially low price. This is because there is a private interest in proliferation of this technology, which motivates the low price — Working Paper
Mispricing leads to overuse of FRT: The authors said that there has been an increase in policing activities because of the artificially low prices of these technologies. “Such over-policing can lead to over-criminalisation of society,” the paper read.
Conflicts of interest: “Serious concerns about conflicts of interest are raised when the same people or entities are invested in FRT for law enforcement as well as data gathering for non law-enforcement purposes. In the Indian context, the private entities that are currently, or may potentially design such algorithms for law enforcement agencies, will trigger similar questions,” it said.
Increase transparency and other recommendations
In the working paper, the authors have recommended several measures that governments should follow when working with the private sector to deploy facial recognition. They are —
- Transparency of agreements: The authors recommended that agreements between the public and private sector regarding FRT should be in the public domain and open to public scrutiny.
- Algorithm: There should be legal standards on algorithmic transparency and performance, said the authors, adding that this will help avoid manipulation of such algorithms.
- Restrictions on surveillance: The authors said that there should be a more robust mechanism to restrict surveillance and, in turn, balance the interests of the State against the rights and liberties of citizens.
- Involving public in decision making: “A broader public discussion about such technology provision is required before the entrenchment of this provision in India’s law enforcement systems,” the paper read.