Tamil Nadu Police nab criminal using face recognition app, questions of safeguards and accountability remain

On a chilly November night, Balaram was riding a two-wheeler when he was stopped at a vehicle checkpoint by cops in Thiruvallur district. The cops felt something was amiss; their intuition told them Balaram was a potential criminal. They clicked a picture of his face and fed it into a facial recognition app available to them. Sure enough, Balaram was present in the criminal database, having been booked on charges of theft and robbery. He was arrested.

“We caught him [Balaram] during the night earlier in the month while the beat cops under the jurisdiction of Arani Police Station were conducting vehicle checking. Balaram was driving a two wheeler, and the police stopped him, scanned his face and the app returned with his previous criminal history which was present in our database,” Aravindhan P, Superintendent of Police of Thiruvallur district in Tamil Nadu, told MediaNama.

The facial recognition app used by the police is called FaceTagr. Several other police jurisdictions in Tamil Nadu also have access to it. Most notably, the Chennai police has been using it for several years, and caused controversy earlier this year when it was seen scanning the faces of people protesting against the controversial Citizenship Amendment Act. This is in line with how police departments across the country are using facial recognition systems to identify potential criminals (here’s how the Delhi Police and Telangana Police go about it).

Aravindhan told us that police in his district have been using the app for over a month now. For the Thiruvallur district, of which Aravindhan is in charge, the police have a facial dataset of criminals comprising more than 60,000 images. The entire database is hosted on an Amazon Web Services (AWS) server, Aravindhan told us.

There are 29 police stations in the Thiruvallur district, and each station has three beat cops, all of whom have access to the FaceTagr app, Aravindhan told us. “In all, I’d say roughly about 200 police officials, including inspectors and sub-inspectors, also have access to the app.”


‘We don’t need a warrant to scan anyone’s face’

When we asked him how many face scans are carried out by the police in a given day, Aravindhan didn’t directly respond to our question; instead, he said the police only uses the app when it “feels someone might be a potential criminal”. “The app is capable of carrying out one match in every 10 seconds, so if a person is doing vehicle checking in the night, you can imagine the number of scans they would be able to carry out, without slowing down the traffic too much. However, we don’t insist on scanning everyone’s face, we only do when we feel someone might be a potential criminal”, Aravindhan said.
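For scale, the stated rate of one match every 10 seconds can be turned into a back-of-the-envelope upper bound. This is our own arithmetic, not a figure from the police, and it assumes continuous, uninterrupted scanning:

```python
# Back-of-the-envelope arithmetic from the stated rate of one face
# match every 10 seconds. Continuous scanning is assumed, so these
# figures are an upper bound, not a claim about actual usage.
seconds_per_match = 10
matches_per_minute = 60 // seconds_per_match    # 6
matches_per_hour = 3600 // seconds_per_match    # 360
print(matches_per_minute, matches_per_hour)     # 6 360
```

In other words, a single officer at a checkpoint could in principle run several hundred scans an hour, which is why the discretion over whom to scan matters so much.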

How does a cop know who is a “criminal”? Aravindhan’s response signals the amount of discretion available to on-ground police personnel: essentially, whether the police choose to deploy facial recognition on someone depends entirely on how that person looks to the officer.

“That is based purely on a police official’s experience, and the various inputs he might have received during his investigation,” Aravindhan told us when we asked him how the police determines who might be an actual criminal. “That is the only way we can square off a potential criminal”.

The police also does not need a warrant before seeking a person’s facial data. “No, we don’t require any warrant. We are only taking the picture of a person, and running it against a criminal database, and not anything more. If the app returns with results of a criminal history for a face match, we arrest that person. But if the app doesn’t show a match, then it doesn’t store the picture of the innocent person either in the police’s smartphone, or in our database,” Aravindhan claimed.

But what if someone refuses to give consent? When asked what happens when a person refuses to undergo the facial scan, Aravindhan informed us that in that case, the police takes their fingerprints, address, phone number and any ID proof as per Section 41 of the CrPC.

“The app shows results for a particular face scan in terms of percentage match. So, if the picture we’ve input using the app has a percentage match of less than 80%, we deem the person to be innocent. However, for anything more than 80%, we consider questioning the person,” Aravindhan added.
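The 80% cut-off Aravindhan describes amounts to a simple decision rule. A minimal sketch of it, assuming the app reports a percentage score; the function name and structure are illustrative, not FaceTagr’s actual code:

```python
# Hypothetical sketch of the threshold rule described by the SP.
# Names and structure are invented; FaceTagr's internals are not public.

QUESTIONING_THRESHOLD = 80.0  # percent, as stated by the Thiruvallur SP

def triage(match_percent: float) -> str:
    """Map a face-match percentage to the action described in the article."""
    if match_percent < QUESTIONING_THRESHOLD:
        return "deemed innocent"      # photo reportedly not stored
    return "consider questioning"     # 80% or above

print(triage(72.5))  # deemed innocent
print(triage(91.0))  # consider questioning
```

Note that a hard cut-off like this concentrates all the risk at the boundary: a person scoring 80.1% is treated very differently from one scoring 79.9%, even though the underlying model cannot meaningfully distinguish them.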

Here’s a video of the FaceTagr app in action.

‘There is a possibility of abuse, but we have enough oversight’

Aravindhan conceded that in the hands of a rogue police official on the ground, the facial recognition app can indeed be abused. However, to make sure that doesn’t happen, he claimed, “we constantly monitor how police officials are using the app at a central control centre”. The control centre is located at the SP office in Thiruvallur.


“We can see how many photographs a particular police official has taken, and whose photographs they have taken. We also have a rough average of the number of pictures each police official takes daily. So, for instance, if a certain police official is averaging three pictures a day, and all of a sudden one day he takes 10 pictures, he will be questioned on why he felt the need to take those additional photographs,” Aravindhan said when asked about how the police ensures there’s no abuse of the app.

All this data about how the app is being used is accessible via a dashboard at the control centre, and only three people (the SP and two ADSPs) have login access to it, he told us, adding: “so far, we have not spotted a single instance where a police official might have used the app to target an innocent person”.
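The anomaly check Aravindhan describes (an officer who averages three photos a day suddenly taking ten) can be sketched as a simple outlier rule. This is purely an illustration under our own assumptions; the article does not say what multiple of the average triggers a review, so the factor of 2 below is invented:

```python
from statistics import mean

def flag_unusual_usage(daily_counts: list[int], today: int,
                       factor: float = 2.0) -> bool:
    """Flag an officer for review if today's photo count exceeds
    `factor` times their historical daily average. The factor is our
    assumption; the article only says a sudden jump (e.g. 3 to 10)
    leads to questioning."""
    if not daily_counts:
        return False  # no history yet, nothing to compare against
    return today > factor * mean(daily_counts)

# An officer averaging ~3 photos a day who suddenly takes 10 is flagged:
print(flag_unusual_usage([3, 2, 4, 3], 10))  # True
print(flag_unusual_usage([3, 2, 4, 3], 4))   # False
```

A rule of this kind catches sudden spikes but, by construction, would miss an officer who abuses the app at a steady rate from day one, since their own average becomes the baseline.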

He also said that the officials using the app have been properly trained to use it. “We have conducted a joint training which included police officials and representatives of FaceTagr,” he said.

“The people from FaceTagr explained the technical aspect of the app. We have explained them to be sensitive while using the app, and not to use it on women and children. That is also because most of the people in our database are men, so there is no point in scanning women and children’s faces. We have also explained them how to use the app on the ground, what type of lighting to use, to always take a mugshot etc.” — Aravindhan P, Thiruvallur SP

‘Exploring how we can use facial match on the app as evidence in court’

So far, the police is only using the app as part of investigations. However, Aravindhan feels there’s a need to allow facial matches to be admissible in courts as evidence. To that end, he said:

“[…] we are exploring how we can use the Indian Evidence Act in a way that will allow us to use facial matches from the app as evidence admissible in court.”

However, “the Evidence Act permits expert evidence to be submitted by forensic experts, including on methods like DNA sampling or fingerprinting. There is no established forensic verification technique for the veracity of FRT, so it’s unclear how this will play out in the context of a trial,” Divij Joshi, an independent lawyer, researcher, and tech policy fellow at Mozilla told MediaNama. “In my opinion, FRT matches may be admissible as fact but its relevance in identification should depend on established scientific and forensic practice, which I do not believe FRT currently possesses in India,” he added.

Concerns of inadequate safeguards and a lack of accountability

Facial recognition systems globally are known to exhibit biases, especially against people from underrepresented communities. When a Twitter user told Aravindhan, citing a BBC report, that facial recognition systems have shown biases against Asian people, Aravindhan responded by saying that “we are using a made in India technology which is customized for Indian faces”.


With regard to the police’s discretionary use of the technology, Joshi of the Mozilla Foundation said, “Policing intuition is important in practice, but depending entirely on their ‘knack’ is a recipe for systematic discrimination and abuse”. He argued that police personnel have to give justifications for why they stop people and conduct these ‘digital searches’. “We have seen, like in Hyderabad, how the absence of these rules leads to indiscriminate use of these technologies which violates the dignity of people, and is especially targeted at marginalised groups”, Joshi added.

Instead, he argued that there should be “clear procedures under the police manuals about the procedure to be followed when collecting and matching photographs for FRT. There should be audits of the technology to ensure its scientific validity in live contexts, apart from reliance on standards. Further, training police in the use and limitations of this technology should be imperative, if at all it is considered for use”. It is telling that in this particular instance, a lot of these safeguards aren’t in place.

When we asked him whether the police is legally required to obtain a warrant before collecting a person’s facial data, Joshi said: “Under the Identification of Prisoners Act, 1920, the collection of bodily measurements, such as fingerprints or photographs, of persons not convicted or arrested in connection with a criminal offence, requires a specific judicial order from a magistrate, if it is for the purpose of an investigation. It is unclear whether AFRS falls within this domain, although there are efforts to explicitly include it within this law”.

“Unlike the collection of biometrics like fingerprints, which are regulated under police laws, there is no explicit law about the use and collection of biometric information without a warrant,” he added.


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
