This is the first of a four-part series. You can read the other stories here.
You are reading it here first: In Lucknow, Uttar Pradesh, if you are ‘moving near public toilets’, playing cards, or smoking in a public place with ‘women-heavy footfall’, or if you are someone who is ‘lurking’ at public spaces, you should expect to be tracked using 700 artificial intelligence-based CCTV cameras.
These are among the roughly 40, often vaguely worded, scenarios listed by the Uttar Pradesh government that will be detected as ‘suspicious activities’ by AI-based models deployed as part of the upcoming Lucknow Safe City Project. If that was not alarming enough, the UP government will also be installing 100 facial recognition cameras and setting up a system that is capable of —
- Storing facial data of 20,000 individuals in real time
- Supporting manual searches of a database comprising 1 lakh faces
- Labelling faces as ‘blacklisted’ and generating alerts based on those labels
Essentially, these proposals, which are a part of a tender for appointing a system integrator for the Safe City Project, mark the first step in the transformation of Lucknow into a police city (as an extension of a Police State) by the UP government, in the garb of “fighting crime against women”.
With queries in this regard, MediaNama reached out to the Home Secretary of the Uttar Pradesh government, who, according to the Union Ministry of Home Affairs website, is the nodal officer for the project in Lucknow. On August 16, we also reached out to the UP Police PRO and Neera Rawat, the Additional Director General of Police in the Women Power Line 1090 (WPL1090) section of the UP Police, who is spearheading the project in the police department. We were later informed by her office that she would not speak to us. [We have attached our queries regarding this at the end of the report]
Background: The Ministry of Women and Child Development, in collaboration with the Ministry of Home Affairs, launched Safe City projects in eight pilot cities, claiming that they would promote the safety and security of women. The Women and Child Development Ministry approved this initiative for Lucknow, the tender said. The implementation of the Safe City Project in Lucknow is centrally sponsored, with 60:40 cost sharing between the Government of India and the Government of Uttar Pradesh, at a cost of Rs 194 crore. In October 2020, UP governor Anandiben Patel inaugurated the Safe City Project by flagging off 100 pink scooties and 10 four-wheeler police vehicles.
Why it matters: Overall, these applications of artificial intelligence and facial recognition paint a grim picture of the present state of surveillance in India and where it is headed in the absence of robust data privacy laws. However, even if the current form of the Personal Data Protection Bill is adopted as law in the future, its various clauses exempt government agencies, in this case the UP government, from its provisions. Thus, the various checks and balances in the PDP Bill that regulate the usage of “sensitive personal data” such as facial recognition will not be applicable here.
UP government lists out 40, some ridiculous, scenarios for usage of AI-based CCTV cameras
The tender gives around 40 situations that the 700 AI-equipped CCTV cameras would detect. We are publishing these use cases with minimal edits to reflect the vagueness of some of the examples given.
- Detecting violence against women
- Identifying a woman’s call for help such as waving hand towards a CCTV camera
- Record number plate of vehicles which are involved in chain, purse or ornament snatching
- Identifying group of smokers at public places such as areas with heavy footfall of women
- Identifying a group of drinkers in public places and outside wine shops
- Detect if “Alcohol bottle is visible”
- “Lane cutting zigzag on two-wheeler/ four wheeler”
- Stalking/following women in public places/isolated areas
- Stalking/following women in isolated areas during the night
- The camera will keep an eye on traffic violations that sometimes lead to crimes such as kidnapping*
- Detect those who frequent a particular location such as near school, colleges etc
- Identifying gambling spots in public areas/ gambling streets
- Detect those who have playing cards visible*
- Pushing women who are standing in a queue at bus stops
- “Detect men moving near ladies public toilets”*
- Face recognition of registered sexual offenders from national database
- Alerts of sexual offenders who are outside jail at bail etc
- Detecting behavior of boys and men outside garment shops for women*
- “Identifying slapping/ hitting a woman in public places/ isolated areas”
- “Identifying throwing ink and blackening face of a women”
- “Identifying hair pulling”
- Detecting a stranded girl, or “abnormal behavior” of a girl child at bus station/railway station
- “Traffic signal”
- “Face recognition with missing person database”
- Identifying pickpockets, purse snatching and theft at the exit of a mandir and entry points
- Detection of accidents
- Detecting weapons like firearms, knives etc
- Identifying “unconscious woman/ stranded body” during the night
- Pushing/ inappropriate touching/stalking women in parks and gardens
- “Behavioural analysis of boys/ men at shadow areas”
- Identifying open defecation
- Identifying road blocks by vehicles or by group of people
- Identifying people “lurking” at public places/ Isolated areas*
- Identifying “tendency of suicide attempt at bridges, pulls, railway tracks”
- “Identifying kidnapping by waving hands with fast movements from auto/ four-wheeler”*
- “Faster/ abnormal running motion behind woman”*
- Identifying foeticide/infanticide by detecting unattended bag or suspicious object near nalas, rivers or hospitals
- Detecting people at women-footfall-heavy areas like rallies
- “Group of bikers”*
Although the UP government attempts to be specific by listing various situations, it ends up being vague in many cases. For instance,
- “Behavioural analysis of boys/men at shadow areas”: Will every man and boy in “shadows” be surveilled by the UP government? It is also important to note that the distinction drawn between boys and men suggests that the UP Police intends to surveil both adults and minors.
- Identifying people lurking at public places/ Isolated areas: How does the UP government define lurking? Unlike stalking, lurking is not a punishable offence in the country.
- “Identifying kidnapping by waving hands with fast movements from auto/ four-wheeler” – A person may wave their hands from an auto/four-wheeler with the intention of bidding goodbye, or may well just be trying to catch an acquaintance’s attention. Is that also going to be presumed to be kidnapping by the AI-equipped CCTV?
Lawyer and researcher Vidushi Marda added that the vague phrasing of the scenarios could lead to mass surveillance going well beyond the ambit of stopping crimes against women.
While the tender may be issued with the aim of protecting women, the impact of it will be a large scale, mass surveillance exercise that involves open ended data collection with little to no purpose limitation. For instance, “male movement near ladies’ toilets” and “Boys/ Men behaviour outside ladies market like Garments shop” seem to me like they are loosely defined to the point of losing all meaning. How will that work in practice? – Vidushi Marda, lawyer and researcher
Secondly, as Shweta Mohandas, a policy officer at Centre for Internet and Society points out, “One of the causes for concern of using video analytics and facial recognition for the instances mentioned is that there is no documented proof that is publicly available that shows that the use of this technology can help with the specific issues of safety in the Indian context.”
Facial recognition system to generate alerts based on labels assigned to faces
A major component of the Safe City Project is facial recognition. The idea of it was first mentioned in January this year, by UP’s additional director general of police (law and order) Prashant Kumar, who was quoted by The Wire as saying, “cameras will be able to detect any change in the facial expressions of a woman being subjected to stalking, threats or harassment on the streets, and an alert will be sent to the police control room.”
Although the tender does not mention any such feature, it does propose that labels “such as ‘employee’, ‘blacklisted’ etc” be assigned to faces. The tender adds that the system should also generate alerts based on these labels: “(The system) should generate an alert if somebody is detected in an area where he/she is not permitted,” it said.
This requirement by the Uttar Pradesh government to label faces is highly concerning, and experts pointed out that it can be misused.
What is the need for this labelling? What is the criteria behind the labeling? If an individual is labelled as a ‘terrorist’ what happens then? Is the labelling permanent? Can a person approach the government to remove such a label? How long will that label be there? — Pallavi Bedi, senior policy officer, Centre for Internet and Society
Marda pointed out that this was not the first time police in India had ‘labelled’ or ‘classified’ people using facial recognition. She said that this was done by the Chennai Police in 2018, and its application can also be found in the Ministry of Home Affairs’ claim that Delhi Police’s usage of facial recognition helped the police catch over 1,000 rioters.
When technology is used to create new categories of people, not based on legal standards but on unilateral categorisations that are made in the absence of accountability mechanisms, the scope for misuse is tremendous. The act of creating categories or putting together lists is fundamentally problematic. Further, accuracy of face recognition systems in real-world applications are abysmally low, which leads to another host of complications. Finally, who decides which category people belong to? What if people belong to more than one category (as more often than not, we all do!) – Vidushi Marda, lawyer and researcher
These are the other demands made by the UP government of the prospective bidder in terms of features needed for the facial recognition system:
- Adding new faces for recognition: The tender said that the system should support the uploading of images of an individual’s face. “A new face may also be added by selecting images of the individual’s face from the list of faces detected by the system. The system should allow for the addition of photographs of criminals obtained from newspapers, etc.,” it said, adding that the system should support recognition of faces that are partially identifiable.
- Matching faces from pre-recorded feeds/database: The tender demanded that the system should be able to match a “suspected criminal’s face” from pre-recorded private or public feeds. It should also be able to identify faces from videos stored in .avi, .mp4, and other formats.
- Assigning labels to faces: The tender also talks about assigning labels to faces “such as ‘employee’, ‘blacklisted’ etc.” An alert system based on these labels should be made functional, the tender added: “(The system) should generate an alert if somebody is detected in an area where he/she is not permitted.”
- Mobile app: The tender said that the selected vendor will have to develop a mobile app that supports iOS and Android. The mobile application will be used in capturing the “face of suspect in field and sending them back to FRS server for matching”. The results of the match will be returned to the mobile app, it added.
- Searching a face in the database: The tender said that a face should be searchable
  - From the main library and ‘the blacklist’ library
  - By time period
  - By structural information such as age, gender, etc.
- The results of a search should throw up the original pictures of the person along with the person’s details. The cops should also be able to search for a face by doing an image search.
It is important to note that after alerts are raised based on AI-based video analytics and facial recognition, an incident management operator at the command and control center of the Safe City Project will have to validate them by verifying the image and video associated with the alert, the tender said. Only then can the matter, if any, be escalated.
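The alert-then-validate workflow the tender describes, where the system flags a labelled face in a zone it is not permitted in and a human operator must confirm before escalation, can be sketched as follows. This is an illustrative outline only; every name, label, and zone here is invented for the example and does not come from the tender.

```python
# Illustrative sketch of the tender's described workflow: automated alert on a
# labelled face in a restricted zone, followed by mandatory human validation.
# All identifiers below are hypothetical.

RESTRICTED_ZONES = {
    "blacklisted": {"zone_a", "zone_b"},  # zones a 'blacklisted' face may not enter
    "employee": set(),                    # 'employee' label triggers no zone alerts
}

def generate_alert(label, zone):
    """Return a pending alert if the label is not permitted in the zone."""
    if zone in RESTRICTED_ZONES.get(label, set()):
        return {"label": label, "zone": zone, "status": "pending_validation"}
    return None

def validate_alert(alert, operator_confirms):
    """A human operator reviews the associated image/video; only a confirmed
    alert is escalated, mirroring the tender's validation step."""
    if alert is None:
        return None
    alert["status"] = "escalated" if operator_confirms else "dismissed"
    return alert

alert = generate_alert("blacklisted", "zone_a")
print(validate_alert(alert, operator_confirms=True)["status"])  # escalated
```

Even in this toy form, the design question the experts raise is visible: everything hinges on who populates the label-to-zone table and on what basis.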
Prasanth Sugathan, Legal Director at Software Freedom Law Center said, “In the absence of any procedural safeguards, this will become highly problematic. Past research undertaken on efficacy of AI software have showed that they are prone to failure, and that it often detected incorrect emotions and identified wrong people.”
Reasonable expectation of privacy means that one can expect privacy in reasonable places. For instance, one can expect privacy more at their home than in a public place. Despite the videos being recorded from a place where one cannot expect privacy, one needs to take into consideration that this data which is being collected can be used in various ways, such as profiling based on religious identity — Nikhil Narendran, a partner with the law firm Trilegal
Not possible to monitor each CCTV manually: UP government
The UP government said that artificial intelligence was being introduced in Lucknow’s CCTVs because there are too many surveillance cameras and it is not possible to monitor each one throughout the day.
Since events are more likely to occur while the operator is not watching, many significant events go undetected, even when they are recorded. Operators can’t be expected to trace through hours of video footage, especially if they’re not sure what they’re looking for. To address that, the solution shall have a AI based video analytics system, coupled with alert engine, which will utilize deep learning algorithms to provide the necessary analytics and proactive alerts to the stakeholders — Lucknow Safe City Project tender
How will the AI work?: The UP government seems convinced that the application of artificial intelligence and machine learning/deep learning will help in detecting the above-mentioned situations.
- The tender says that self-learning algorithms that use datasets to “identify suspicious people/objects based on their behaviours” will be deployed. The UP government does not mention what kind of datasets will be used for the same.
- The system will also have a “continuous learning capability” which would help in updating the existing data models and creating new data models for use in video analytics/event detection, the tender says.
- The algorithm should also be able to estimate a person’s gender and age, perform “real-time identification”, and so on.
How will the facial recognition system work? The tender said that the facial recognition algorithm quoted by the bidder should have taken part in the National Institute of Standards and Technology’s Face Recognition Vendor Test (FRVT) in the United States of America. The algorithm will be used in around 100 CCTV cameras, and will also be integrated with government databases used by law enforcement such as CCTNS, prisons, courts, and so on. By integrating these databases, the algorithm will mine data, and, the tender said, it would —
- Match the photograph of a criminal/suspect or any person involved in crime against women with these databases.
- “Match a suspect’s face with video feeds of specific camera locations or with the feed received from private or other public organization’s video feeds,” the tender added. (emphasis supplied)
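Face-matching systems of the kind the tender describes typically reduce each face to a numerical “embedding” vector and compare the suspect’s embedding against embeddings computed from database photos or video frames, flagging any pair whose similarity crosses a threshold. The tender does not specify the algorithm; the sketch below illustrates only the comparison step, with toy vectors and a threshold invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def match_suspect(suspect_embedding, frame_embeddings, threshold=0.9):
    """Return indices of frames whose embedding crosses the threshold.
    A real FRS would derive embeddings from a trained neural network; these
    hand-written vectors only stand in for that step."""
    return [i for i, emb in enumerate(frame_embeddings)
            if cosine_similarity(suspect_embedding, emb) >= threshold]

suspect = [0.9, 0.1, 0.4]
frames = [[0.91, 0.09, 0.41],   # near-identical embedding: should match
          [0.1, 0.9, 0.2]]      # dissimilar embedding: should not
print(match_suspect(suspect, frames))  # [0]
```

The threshold choice is exactly where the accuracy concerns quoted above bite: set it low and innocent faces are flagged; set it high and the system silently misses matches.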
What we asked UP government regarding usage of AI video analytics and facial recognition
1. A few of the over 40 scenarios given in the tender for the usage of AI-based video analytics, such as
- “Behavioural analysis of boys/men at shadow areas”
- Identifying people lurking at public places/ Isolated areas
and so on paint a very vague picture of where and why it is going to be used. For instance, ‘lurking’ unlike ‘stalking’ is not a punishable offence. In that context, do you think that the ambit of surveillance that is proposed may well go beyond the aspect of protecting women?
2. A major aspect of the Safe City Project in Lucknow is that of facial recognition. One of the requirements made in the tender is that of assigning labels to faces, and that the system should generate alerts based on the labels. The examples of labels given are ‘employee’, ‘blacklisted’ etc.
- What are the parameters of assigning labels to the face of an individual?
- Will an individual be informed that their face has been labelled in the Safe City system?
- Is there any provision for the concerned person to approach the government for deleting the said label?
- How do you address concerns of the possible misuse of such labeling?
3. Please also provide information regarding whether any privacy impact assessment was done before going ahead with the proposal of using facial recognition technology in the Safe City Project.
- Exclusive: Major entrance examinations such as JEE, NEET, UGC-NET to come under facial recognition surveillance
- 1 lakh CCTVs at 4,000 centers with facial recognition: National Testing Agency expands surveillance of JEE, NEET exams
- MoHFW proposes facial recognition for verification of candidates sitting for exams conducted by AIIMS
Have something to add? Subscribe to MediaNama here and post your comment.