Tech companies involved in facial recognition for policing raise privacy concerns: Report

Here’s a look at the lack of transparency and other dangers posed by the private sector’s role in public facial recognition projects.

The involvement of the private sector in the deployment of facial recognition technology (FRT) in governmental processes is raising questions related to privacy, particularly regarding the ‘indiscriminate use of various datasets’, according to a study carried out by the Vidhi Centre for Legal Policy, a New Delhi-based think tank.

There has been a major rise in the use of artificial intelligence and the deployment of facial recognition by local law enforcement agencies in India. Since many of these facial recognition projects are being undertaken by private agencies, the issues that arise from this arrangement need to be examined.

Are surveillance activities being outsourced to private corporations?

In the working paper titled ‘Procurement of Facial Recognition Technology for Law Enforcement in India: Legal and Societal Implications of the Private Sector’s Involvement’, authors Ameen Jauhar and Jai Vipra, both senior resident fellows at Vidhi, pointed out that there is no transparency regarding the roles of the private corporations that work with governments to deploy FRT.

Private companies should not be empowered to surveil citizens: “The opacity which shrouds the current engagements of state police forces or governments in India with limited private entities, their roles and scope of engagement, and the access they arguably can continue to have over the underlying algorithm, warrants serious questions on the plausible and dangerous merger of state functions with a private entity,” said the working paper.

What about data breaches?: The working paper noted that in the absence of legislation such as the Personal Data Protection Bill, and due to the lack of public scrutiny of the private sector’s operations in this area, there are concerns about how the datasets being used are safeguarded. “In fact, in India, there have been reported instances of data leaks from FRT applications being used by local police agencies,” the paper read.

Who is legally liable for FRT, the State or private sector?

From a legal liability perspective there are three main concerns – first, the liability of the private corporation or developer of the FRT algorithm; second, the liability of the state for deploying a flawed algorithm; and third, potential of holding the algorithm liable, per se — Working Paper

Secrecy offers immunity from legal liability for the private sector: The authors of the working paper pointed out that secrecy regarding the operations of facial recognition technology gives ‘de facto immunity from legal liability as it is nearly impossible to build a proper case’. They also took cognisance of the challenge they faced while trying to piece together information on how state governments engaged with private corporations.

Arbitrary ecosystem: “What results from this is an arbitrary ecosystem where despite high risks of surveillance, privacy infringement, transgressions against due process, and a real threat to constitutional and legal rights, there is no meaningful recourse,” the working paper read.

Private goals are driving public priorities

Profit and proliferation: The authors said that these technologies are often offered to law enforcement agencies through free trials and cheap licenses. “While there may be nothing wrong with offering discounts to public agencies, this practice in a legal vacuum and in the absence of public deliberation reflects a mispricing of the technology,” the paper read.

In effect, not only was the technology [Clearview AI, a US-based FRT vendor] deployed without consultation with the public or without a social welfare assessment, it was also deployed at an artificially low price. This is because there is a private interest in proliferation of this technology, which motivates the low price — Working Paper

Mispricing leads to overuse of FRT: The authors said that the artificially low prices of these technologies have led to an increase in policing activities. “Such over-policing can lead to over-criminalisation of society,” the paper read.

Conflicts of interest: “Serious concerns about conflicts of interest are raised when the same people or entities are invested in FRT for law enforcement as well as data gathering for non law-enforcement purposes. In the Indian context, the private entities that are currently, or may potentially design such algorithms for law enforcement agencies, will trigger similar questions,” it said.

Increase transparency and other recommendations

In the working paper, the authors recommended several measures that governments should follow when working with the private sector for the deployment of facial recognition. They are:

  • Transparency of agreements: The authors recommended that agreements between the public and private sectors regarding FRT should be in the public domain and open to public scrutiny.
  • Algorithm: There should be legal standards on algorithmic transparency and performance, said the authors, adding that this would help avoid manipulation of such algorithms.
  • Restrictions on surveillance: The authors said that there should be a more robust mechanism to restrict surveillance and in turn balance the interests of the State, citizens and their liberties.
  • Involving public in decision making: “A broader public discussion about such technology provision is required before the entrenchment of this provision in India’s law enforcement systems,” the paper read.
