What’s the news: The Delhi police consider an 80 percent accuracy score from facial recognition technology (FRT) “positive” enough to target individuals accused of rioting, although those with lower scores are no better off, the Internet Freedom Foundation (IFF), a digital rights group, found through RTI requests, the Indian Express reported.
As per the records shared under the two RTI requests, the city police conduct an “empirical investigation” on individuals if an FRT match has an accuracy of over 80 percent. However, even if a match scores lower, the police treat it as a “false positive result”, which is still subject to “due verification with other corroborative evidence”.
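The two-tier workflow described in the RTI replies can be sketched as a simple threshold check. This is purely illustrative: the function name, the 0.80 cut-off representation, and the returned labels are hypothetical stand-ins for the process the police described, not any actual system.

```python
def triage_match(confidence: float, threshold: float = 0.80) -> str:
    """Illustrative sketch of the triage described in the RTI replies.

    All names here are hypothetical; only the two-tier logic comes
    from the reported workflow.
    """
    if confidence >= threshold:
        # At or above the cut-off: treated as a "positive" ID and sent
        # for "empirical investigation".
        return "empirical investigation"
    # Below the cut-off: labelled a "false positive result", yet still
    # subject to "due verification with other corroborative evidence" --
    # the individual is not cleared.
    return "verification with corroborative evidence"

print(triage_match(0.85))  # empirical investigation
print(triage_match(0.40))  # verification with corroborative evidence
```

Note that neither branch ends the inquiry: whichever side of the threshold a match falls on, the person stays under scrutiny, which is the crux of the IFF’s criticism.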
This means that even people with lower-accuracy matches remain on the police’s list of suspects. Speaking to the newspaper, Anushka Jain, IFF associate counsel, said that people with even the slightest facial similarity can be targeted as a result, which in turn could lead to the targeting of historically marginalized communities.
This concern of religious bias in the Delhi police’s FRT has been a point of contention since 2021. Earlier, Jai Vipra, Senior Resident Fellow at the Vidhi Centre for Legal Policy, wrote for MediaNama that Muslims are more likely to be targeted by the Delhi police if FRT is used.
Why it matters: In its reply to the IFF, the Delhi police said that it is using FRT to investigate major riot incidents in the city. In recent times, many of these incidents have been related to religious conflicts like the Northeast Delhi riots. So, investigations based on even slight similarities could increase the problem of religious bias in policing. The news also comes after the passing of the Criminal Procedure (Identification) Bill that gives the police the right to collect biometrics and other data of criminal offenders. In the absence of a data protection law, this again raises the question of the impact of such surveillance on people’s privacy.
80 percent accuracy shows improved performance: While a 2018 test by the American Civil Liberties Union (ACLU) showed that 80 percent accuracy is not a satisfactory threshold for FRT, the report said the Delhi police have worked with far lower FRT confidence scores in the past.
In 2018, the Delhi High Court suggested the police use facial recognition software to find women who had gone missing from an illegal placement agency. However, the police informed the court that the software returned only a two percent match, which was “not good”.
It is important to weigh such performance against the Delhi police’s eagerness to utilise the technology. In 2020, MediaNama exclusively reported on the city police’s intention to equip the police control room with FRT. Even then, the firm providing the technology said it could not comment on FRT’s accuracy in real-life scenarios.
So far, the Delhi police have used FRT to identify suspects in the 2020 Delhi riots case, the clashes at the Red Fort during the 2021 farmers’ protest, and the Jahangirpuri riots shortly after Ram Navami this year. However, the police declined to answer questions about the number of arrests made using the technology. This echoes 2020, when the city police refused to answer IFF’s questions about FRT-based arrests and investigations, FRT accuracy rates, privacy impact assessments, and more.
The police have not explained why an 80 percent accuracy score is considered a “positive” FRT identification. The Telangana State Technology Services (TSTS), which provided FRT to the State Election Commission for voter authentication, took a similar position: its accuracy was also only 80 percent, but the TSTS told the IFF it did not intend to improve the figure since “it had found no bugs in the system”.
No privacy impact assessment so far: In its RTI reply, the Delhi police said it has not yet analysed how FRT can impact privacy. Such a privacy impact assessment is not mandated in India, although the European Union’s General Data Protection Regulation (GDPR) requires such assessments for high-risk data processing.
The IFF argued in the report that the impact assessments are important because the use of such new and untested technologies by law enforcement agencies can have irreversible effects and cause damage to a person.
With even the national capital’s airport, among others, using FRT, there is growing concern about the police using such technology for surveillance. In Odisha’s Rourkela, facial recognition systems are already being used to track people and detect protest patterns in crowds.
There is also the matter of the private sector’s role in providing FRT to government bodies. In January of this year, MediaNama looked at a Vidhi Centre for Legal Policy study that pointed out the lack of transparency regarding the role of such private companies. Granting private entities such access in the absence of the Data Protection Bill (DPB) raises concerns about the usage of, and safeguards for, these datasets.
This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
Also Read:
- How The Debate On Criminal Procedure (Identification) Bill Unfolded In Lok Sabha
- Deep Dive: Nagaland To Use Facial Recognition For Teacher’s Attendance, But What Are The Issues At Stake?
- Major Airports Will Have Facial Recognition-Based Boarding By Next Year, Says Civil Aviation Ministry
- Exclusive: Facial Recognition In Rourkela To Track People, Detect Protest Patterns
- Delhi Police Deploys Facial Recognition Systems, CCTV Cameras Ahead Of Republic Day: Report
I'm interested in the shaping and strengthening of rights in the digital space. I cover cybersecurity, platform regulation, and the gig worker economy. In my free time, I'm either binge-watching an anime or off on a hike.
