
Mass Surveillance on Wheels? India’s Railway Coaches to Get Face-Detecting Cameras

Indian Railways has floated a tender to install two to four cameras at the entry and exit doors of its coaches, equipped with a face image cropping tool to capture the faces of adults and children alike. MediaNama speaks to experts about the privacy concerns surrounding this mammoth project.

Indian Railways plans to install face-detecting CCTV cameras across 44,038 coaches in the country. The cameras are to be installed across the zones of Indian Railways (Central Railways, Western Railways, Eastern Railways, and 12 other railway lines), which are further sub-divided into groups of four, five, six, and eight cameras per coach. Notably, these cameras will capture the faces of both adults and children, raising privacy concerns as well as the possibility of a social credit system in India.

Here’s how cameras are going to be distributed across railway lines: Overall, there will be 38,255 coaches with eight cameras, 2,744 coaches with five cameras, 2,079 coaches with four cameras, and 960 coaches with six cameras. Of these, Central Railways will have 3,018 CCTV coaches, Western Railways will have 3,408 CCTV coaches, and East Central Railways will have 2,533 CCTV coaches. The tender notes that these figures are estimates and may change during implementation: “Holding of coaches and ownership by a particular Depot is totally tentative and may change at any time at sole discretion of Railways.”

Passengers’ faces will be scanned at the entry and exit of every coach

As per the tender, the two to four cameras installed at the entry and exit doors of each coach will be equipped with a face image cropping tool. This feature will help officers identify faces of people in the live feed and then crop them as required. The cropped image can then be run through a facial recognition system and later sent to the cloud along with its metadata. The system assumes a minimum of 100 new faces per hour per coach. Further, the software is expected to face-match individuals with or without a mask or other accessories such as sunglasses and scarves, as long as at least 50 percent of the face is visible.
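To put the tender’s throughput figure in perspective, here is a rough back-of-the-envelope sketch. The coach count and the faces-per-hour minimum come from the tender as reported above; the 18 hours of daily operation is our own illustrative assumption, not a figure from the tender.

```python
# Back-of-the-envelope estimate of the face-capture volume implied by the
# tender's figures. COACHES and FACES_PER_HOUR are from the tender as
# reported; HOURS_ACTIVE is an illustrative assumption of ours.
COACHES = 44_038        # total coaches slated for cameras
FACES_PER_HOUR = 100    # assumed minimum new faces per hour per coach
HOURS_ACTIVE = 18       # assumed operating hours per day (illustrative)

faces_per_day = COACHES * FACES_PER_HOUR * HOURS_ACTIVE
print(f"{faces_per_day:,} face captures per day")  # 79,268,400 per day
```

Even under these conservative assumptions, the system would generate on the order of 79 million face captures every day, each with accompanying metadata sent to the cloud.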

Faces to be matched against data of criminals: Another feature of the FRS-enabled camera is to face-match against pictures of people an officer may be looking for. For example, the tender talks about matching faces caught on camera against the criminal database. However, as the images in the database may be many years old, the system must work with reference images that are as much as 5-10 years old. This feature will be available on the web application as well as on mobile apps for Android and iOS. Further, the FRS must be able to work with face images at multiple angles and search the criminal database by name, date range, and criminal ID/number. Lastly, the tender calls for a 95 percent accuracy rate for faces with masks and a 99 percent accuracy rate for faces without a mask.

What’s the problem with this? No facial recognition system can match faces with such high accuracy. Even the Delhi Police, in response to a query from the Internet Freedom Foundation (IFF), said that its facial recognition system for detecting rioters works with 80 percent accuracy at best. The government’s own ASTR system, which uses facial recognition to address the issue of duplicate SIMs, cannot match two photos of the same person taken 15 years apart. Outside India, too, there have been cases of racial bias in facial recognition systems.

Not to mention, even the 99 percent face-match would compare a front-facing photo of a criminal from a government database against a real-time image of a railway commuter captured from any angle, showing as little as 50 percent of their face. Such a comparison does not guarantee that the person in the coach is the person the officer is hunting for.
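The underlying base-rate problem can be sketched with hypothetical numbers: when genuine watchlist hits are rare in the scanned crowd, even a 99-percent-accurate matcher produces mostly false alerts. The prevalence and scan counts below are our own illustrative assumptions, not figures from the tender.

```python
# Base-rate sketch: why "99 percent accuracy" still floods officers with
# false alarms. All numbers are hypothetical, for illustration only.
scans = 1_000_000                 # face scans performed (assumed)
on_watchlist = scans // 100_000   # assume 1 in 100,000 is a genuine match
tpr = 0.99                        # true-positive rate (tender's unmasked target)
fpr = 0.01                        # false-positive rate implied by "99% accuracy"

true_hits = on_watchlist * tpr               # ~9.9 correct alerts
false_hits = (scans - on_watchlist) * fpr    # ~10,000 wrong alerts
precision = true_hits / (true_hits + false_hits)
print(f"Share of alerts that are genuine: {precision:.2%}")  # 0.10%
```

Under these assumptions, only about one in a thousand alerts would point at the right person; the rest would be ordinary commuters flagged in error.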

Railways expressed plans to collect children’s facial data as well

Aside from adults, the tender states that the cameras will also capture photos of children in railway coaches. Under the Digital Personal Data Protection Act (DPDPA), 2023, the collection and processing of children’s personal data requires parental consent. However, there isn’t enough clarity on what happens when the data is collected in a public place like a railway coach.

According to Nidhi Sudhan, co-founder of the Citizen Digital Foundation, the collection of children’s facial data from trains would most likely be put under the Act’s provision for legitimate use.

“Under the DPDPA, this is where it gets a little blurred and murky. So even if it is children’s data, the government can always get away saying that it is for national security purposes because they are doing it from a crime perspective. That is the unfortunate bit about it, because in the absence of a security system for how and where this facial recognition data is stored, and in the absence of a transparency protocol showing how the data is processed after that and how long it is stored, because we don’t know whether there are necessity and proportionality clauses being introduced or aspects being taken care of with regard to the data. Once your data is given, it’s given, and we don’t know where it’s going,” said Sudhan.

The collection of facial data under the legitimate use provisions also exempts the government from such legal obligations. However, Sudhan argued that this runs completely counter to the purpose of the DPDPA itself, which is to protect children’s data.

Facial scanning may help curb crimes via trains: On the other side of the coin, Ritesh Bhatia, a cybersecurity expert, had another take on the collection of children’s facial data. He argued that scanning children’s faces may be unavoidable if the intention is to address crimes committed via trains. He gave the example of child trafficking, wherein young children, who may have been reported missing in other regions, are transported via trains.

“If we are looking from the scanning point of view of the children, I can’t say that we shouldn’t be doing it [using facial recognition and collecting facial data]… when we are looking from [the perspective of] child trafficking in the moment, let’s say the system pops up, we’ve found this particular child, and then it gives us some indications [as to where the child was spotted by the system] that’s a great thing. But again, we need to be extremely careful about the false positive,” he said.

The danger of a false positive in this case would stem not simply from a low accuracy rate but from the fact that children’s faces are often harder to differentiate. He warned that this can create false alarms among the public, especially if the child and the people with her are made to deboard the train.

“Just imagine the trauma the child has to go through or the mother or father. What if the child is just travelling with his uncle?” he said.

No clarity on how to erase children’s data: “The right to remove your data has to be initiated by the subject, the data principal itself, which in today’s India, we don’t have. End users don’t have enough awareness of their rights to have their data removed. So especially in the case of minors, if you’re having a minor’s face scanned when they’re entering a train, the parent has to at least be aware that they can ask for a removal of the child’s data [once the purpose is served]. But I doubt many parents know of this or have the bandwidth to even apply for a removal of data. Well, that is secondary. But primarily, we don’t know that there’s a request-for-removal option at all. If at all there is a request-for-removal option, where are the processes? How do we go about it?” said Sudhan.

Even during MediaNama’s PrivacyNama 2023 event, where Sudhan was a panelist, experts talked at length about the collection of children’s data in public spaces but couldn’t come up with a conclusion beyond ‘if a parent fails to provide the required data, they will have failed their user duty and face the resulting penalty.’ This is because under the DPDPA, any decision on behalf of a minor under 18 years of age has to be made by the parent, whether it is providing consent for the use of a platform or withdrawing that consent.

In terms of protecting children’s data, Sonali Patankar of Responsible Netism pointed out that the government needs to consider whether its systems are equipped to hold such personal data of children. She gave the example of school children’s data, which is already with the government: while there are mechanisms to control the data, there is no awareness of what needs to be done.

Will social credit system become a reality in India?

The collection of such vast data by Indian Railways, and its accessibility to the government, also raises the question of a potential social credit system in India. Popularly associated with China, a social credit system is one where a person’s or entity’s behaviour within society earns a “social credit score.” For example, obeying railway platform rules earns you five points, while boarding a train without a ticket deducts five points from your credit. Such a system would require the government in power to run a vast surveillance apparatus to monitor citizens and entities.

According to Radhika Roy, Litigation Counsel at Internet Freedom Foundation, such a system can definitely become a possibility and “cannot be discounted.” She said, “Social credit systems are already the norm in China. With the rampant implementation of Aadhaar and having it be connected with every form of identification possible, we are already heading down the path of using data to discriminate. Machines do not have a brain of their own and are fed datasets by people – the biases that people inhabit are replicated in the system. For minorities and marginalised communities, this spells trouble and can lead to them being restricted from accessing basic amenities, including being allowed to board a train on the basis of the community to which they belong.”

On the other hand, Shivangi Narayan, an independent researcher with expertise in facial recognition technology and its use in policing in India, was sceptical about such a system becoming a reality. She pointed out that while India does have enough information to make a social credit system a reality, China has far greater infrastructural capabilities.

“My hunch is that it [creation of a social credit system] won’t happen because a lot of people who vote for these governments will not agree to it because they won’t have that kind of infrastructure available or they don’t want to get into a system and be treated like some sort of a criminal, but we don’t know. China has a lot more infrastructural capability for having such systems, a lot more people are connected to the Internet, electricity, all sorts of systems through which they can be electronically monitored,” she said.

In the context of children, Sudhan also pointed out that such a system could lead to the profiling of people from a very early stage in life. “In authoritarian situations, it could have profiling and surveillance consequences, especially for young voters. Children who are going to become voters in the coming years can be profiled and targeted through personalised content. I’m not just talking about advertising; personalised content can be served to young children to influence their ideologies and thinking towards voter behaviour. So these are all probabilities,” she said, arguing that the consolidation of facial and age-related data is already underway.

How much facial recognition surveillance is required for railway security?

In August 2023, the Railways Department announced that it was working on FRS at all major stations of the East Central Railway to “enhance security measures”, while linking data collected through FRS to an existing database of criminal activities in and around railway station premises. At the time, the department had identified at least 200 stations for a complete security overhaul, of which a few fall under the ECR’s jurisdiction.

Similarly, in November 2022, the Western Railways boasted of its new FRS-backed CCTV surveillance network, which it claimed had helped identify missing persons and detect crimes against passengers. However, MediaNama filed an RTI asking about:

  • police access to the data for crime investigation
  • the tender requiring the supply, installation, and implementation of the FRS CCTV system
  • details on data storage
  • the data retention period
  • the number of thefts prevented due to FRS CCTV cameras
  • the number of thefts resolved due to FRS CCTV cameras
  • the number of missing people found between July 2022 and November 2022 due to FRS CCTV cameras

The department replied that it does not maintain such information. So how did it have the confidence to make claims about reduced crime? And where is the confidence now to further implement such surveillance in other parts of India?

Central agencies may now have eyes on railway users

Data collected by Indian Railways is also connected to NATGRID, or the National Intelligence Grid, which links multiple public and private databases and makes the data available to intelligence agencies. This means that the facial data collected via these cameras may also become accessible under NATGRID.

Roy said that the usage of such datasets by NATGRID, and the unregulated access to them, raises concerns about India’s foray into ‘state-sponsored mass surveillance.’

“Being the largest daily traffic carrier, data from Indian Railways will essentially aid the government to keep a tab on the movement of millions of people. This is a gross invasion of privacy without any accompanying procedural safeguards. It entails viewing every citizen with suspicion and inhibits one’s right to movement. Apart from these, some other concerns include misuse and misplacement of the data. The DPDPA does not envisage any safeguards whatsoever and in fact encourages sharing and usage of such data for law enforcement purposes. Additionally, FRS is not error-free, as has been proven in a countless number of research papers. Relying on this data can lead to incorrect conclusions,” said Roy.

Building on this concern, Narayan pointed out that such a form of surveillance can also hinder civil protests as peaceful protesters may be barred from boarding a certain train.

“They can obviously stop, censor [a protester]. Anything can happen… If you have read anything about public places with CCTV cameras and how they impinge on your personal safety and security, all of that applies. But what I have started noticing nowadays is a big collusion between the private and the public sector. What they’re doing right now is pushing these systems because it gives them a lot of data, and obviously because the government gets so much information for surveillance (and this government is a very surveillance-happy government), it kind of works for both of them,” said Narayan.

When asked whether certain safeguards could be established to protect people’s data, Narayan said, “No, just remove this. There are no safeguards or regulations. Once facial recognition camera systems are in place, they [the technology] will be problematic because that’s how the structure is… You’re in a system that does not work for the benefit of the marginalised, the poor. You are in an extractive kind of apparatus and you’re going to work according to the apparatus. You can’t regulate, comply or audit your way out of it. These systems have to go. We don’t need them.”


Written By

I'm interested in the shaping and strengthening of rights in the digital space. I cover cybersecurity, platform regulation, gig worker economy. In my free time, I'm either binge-watching an anime or off on a hike.


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
