
Facial Recognition Cameras Active At Bengaluru’s RCB Bar & Café: The Harms of India’s Tryst with FRT

Experts say facial recognition tech should be ring-fenced and limited to specific cases, adding that its human rights impacts should be considered

At the entrance of the Royal Challengers Bangalore Bar & Café (RCB Bar & Café)—located at the foot of Bengaluru’s plush Church Street—hangs a small Dahua infrared security camera. Every person pausing at the manager’s podium before entering the bar is automatically scanned by this relic of pandemic surveillance. 

The monitor immediately displays an image of them, alongside their estimated age bracket, gender, and whether they’re wearing a mask. The other side of the monitor shows a live video stream of the customers approaching the podium, capturing their body temperature through what looks like a thermal map. 

(L) the white Dahua camera system hanging by a rod from the ceiling; (R) The monitor displaying the results of the camera’s findings. | By Aarathi Ganesan.

Nowadays, around 100 patrons step in and out of this Indian Premier League team’s flagship bar every day, Operations Manager Ashish told me when I visited the bar last month. All of them are scanned by the Chinese camera.

As Ashish tells me this, I recall attending an event at the RCB Bar & Café a few months ago. I had unthinkingly watched the camera scan me then. As I speak with him in November, leaning against the podium after a long day, it scans me once again.

Why it matters: Facial recognition technologies (FRT) like this one collect sensitive data on customers—such as gender, age range, and facial images. These systems are also known for faulty readings—which, if relied on by law enforcement agencies, can wrongly place people behind bars. Laws ensuring that companies use this tech in a limited, responsible, and privacy-preserving way are needed. But India doesn’t have laws that protect privacy or regulate how FRT is used. The incoming draft privacy law does not adequately protect users from these harms, as it also emboldens government access to privately collected data, experts informed MediaNama. So, in the meantime, customers are simply left dependent on the many companies using FRT—like the RCB Bar & Café—to secure their data and prevent life-altering harms from taking place. An immediate way to protect privacy in the wake of expansive laws could simply be to use less invasive technology wherever possible.

Is FRT popular in India?: The RCB Bar & Café is one among many Indian companies and organisations utilising FRT for ‘security’ processes, among others. Chaayos outlets in Delhi used it to register and identify customers dropping in for a cup of tea. Kanpur University uses the technology to mark attendance. The Indian government is using it to power Digi Yatra, its contactless system for boarding flights across the country. Police forces across the country lean on FRT to identify “criminals”. Indian governments use it in the hope of making cities “smart” and safe. 


The technology is used so ubiquitously because there exists no law to actually regulate it, experts unanimously told MediaNama.

“Think of it from a company’s perspective,” says Namrata Maheshwari, Asia Pacific Policy Counsel at Access Now. “Unless they face real prospects of consequences [for privacy harms through their processing], they’re unlikely to be deterred from using this technology. So, company justifications to use FRT widely are currently made in the absence of a framework that requires them to think about whether their use of these technologies is necessary or proportionate [to their stated aims].”


Why is this system used—and how well does the RCB Bar & Café protect the data it collects?

“We introduced this at the RCB Bar & Café when the pandemic hit, and temperature and mask checks were paramount to all entries into the public premises by the state authority,” explains Rajesh Menon, Vice President and Head of Royal Challengers Bangalore in conversation with MediaNama. “It was important for us to cross check the temperature data monitored by handheld sensors because those were not measuring the most accurate and precise reading.”

“We adhere to all the Government Laws [sic] in India,” added Menon when probed on specific privacy-protecting laws, rules, and regulations the RCB Bar abides by to ensure the customer’s data is secured. 


However, a cornerstone of privacy law is obtaining a customer’s informed and explicit consent to intrude on their privacy. This is when a person is first briefed on how their information will be collected by a company—then, on the basis of this information, they accept the company’s request to collect information on them. 

Customer consent to being recorded by the Dahua system doesn’t appear to be explicit, says Pallavi Bedi, Senior Policy Officer at the Centre for Internet and Society. “A customer may enter the restaurant without being aware that their picture is being taken, or that it is being stored. They may not know how it’ll be used,” she says. 

Maheshwari agrees, adding that “consent that is truly privacy respecting is informed, voluntary, free, and explicit. In this case, it’s not clear if customers are informed about the system, or given a real choice to consent to this sort of data collection, processing or storage.”

According to Ameen Jauhar, Senior Resident Fellow at the Vidhi Centre for Legal Policy, this dilution of obtaining consent is a larger byproduct of the widespread use of AI systems like FRT. 


“Consent is diluted especially when you bring in such systems, say a smart doorbell that scans every guest coming into your house without her consent, or a face scan in a restaurant determining whether a guest is masked or not. Every person may not have the opportunity to give meaningful consent,” he says. “Instead in a lot of these cases, accessing such a space is deemed as consent, which is contentious. The counterpoint is that there might be a legitimate security interest to use these technologies. Whether that interest exists or not will have to be determined on a case by case basis, taking into consideration what kinds of data are collected, how it is processed, where it is stored, and for how long.”

These factual questions can help decide whether the company’s interest in using FRT is proportionate to the privacy violations taking place. 

The collected data—on masks and temperature—is only stored on a physical server at the restaurant premises for three weeks, after which it is automatically deleted, clarifies RCB’s Menon. This information is not used for any other purposes—like profiling—nor is it cross-linked against other personal information collected by the restaurant’s staff, he claims. 

“Only name and number is requested [by the staff separately] at the reception, if at all, to share event and promotion communication so the guests are updated with all things happening at RCB Bar & Café,” says Menon, adding that customer consent is obtained in such cases. “If a guest doesn’t want to share the same, we completely respect the choice.” 

“The Dahua system only checks temperature, so there is no purpose that’s linked with data collection,” he concludes.

Menon believes that as the RCB Bar & Café does not collect any personally identifiable data without its guests’ consent, it has not received any complaints so far. Ashish agrees, adding that, as far as he can recall, customers don’t refuse to be scanned by the system. They don’t seem to particularly care about its use, he observes. Now that the pandemic is subsiding, the system isn’t used as much for temperature checking, Ashish tells me. It is simply the system the RCB Bar & Café is sticking with for now.


How do India’s data protection laws treat ‘implicit’ consent—and how does this impact privacy?

India’s own recently-released draft data protection law may crystallise this ‘implicit’ customer consent to personal data collection. 

The Bill contains provisions for “deemed consent”—which is when a person is deemed to have consented to personal data processing if they’ve voluntarily disclosed their personal data to a company, and it is “reasonably expected” that they would anyway provide this information to the company. Somewhat fatefully, the law explains this provision using the example of a restaurant: 

“‘A’ shares her name and mobile number with a Data Fiduciary for the purpose of reserving a table at a restaurant. ‘A’ shall be deemed to have given her consent to the collection of her name and mobile number by the Data Fiduciary for the purpose of confirming the reservation.” — the draft Digital Personal Data Protection Bill, 2022.

“If a customer enters a restaurant [whether RCB’s or otherwise], and the camera is within sight but they don’t really understand the real scope of how it is used, is that deemed consent under the draft law?” asks Maheshwari. “Under the new law, one might argue that it could be—which is problematic because the provision on deemed consent in the new law is too broad and vague, and it also does not impose added obligations or responsibilities on companies to collect sensitive data, like biometric information, in a specific [privacy-protecting] way.”

What if the RCB Bar & Cafe’s data is requested by law enforcement agencies?

Over 28 CP Plus CCTV cameras are also installed across the RCB Bar & Café premises, Menon says. Ashish observes that the Dahua system further aids the Bar’s security interests. He hypothesises that if a scuffle breaks out, the system can help identify who participated in it. It may also be helpful for the police while investigating such issues, he adds, considering another imagined scenario. The Bar also hosts VIP events for the Royal Challengers Bangalore IPL franchise, so protecting attendees’ security throughout is important. 


“Till date, no law enforcement agency has needed to request data from our premises,” clarifies Menon. “However, should a situation arise where the data is required as evidence in a court of law, our data can be produced but only if it is transferred on the same day of incidence as the data is periodically auto deleted from the system in 3 weeks’ time.”
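The “periodic auto-deletion” Menon describes amounts to a standard data-retention policy: anything older than the retention window is purged on a schedule. As a rough illustration only—RCB’s actual system is a commercial camera setup that enforces this in firmware, and the file-based layout and function below are assumptions—such a purge might look like:

```python
# Illustrative sketch of a 21-day retention purge, assuming recordings
# are stored as files in a directory. Not RCB's actual implementation.
import time
from pathlib import Path

RETENTION_DAYS = 21  # the "3 weeks" window Menon describes

def purge_old_recordings(directory, now=None):
    """Delete files whose modification time falls outside the retention
    window, and return the names of the files that were removed."""
    now = now if now is not None else time.time()
    cutoff = now - RETENTION_DAYS * 24 * 60 * 60
    removed = []
    for path in Path(directory).iterdir():
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return removed
```

The privacy-relevant point is the window itself: once the cutoff passes, the footage is gone, which is why Menon says law enforcement would need to request data “on the same day of incidence.”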

Law enforcement agencies can already access specific data for investigations under provisions of the Code of Criminal Procedure, and rules under the Information Technology Act, 2000, experts informed MediaNama. However, the draft law also widens the government’s powers to access the data collected by private entities—broadening the scope of surveillance, and the use of FRT data, beyond just a restaurant or company’s limited premises. 

For example, it grants the Indian government and its agencies many exceptions where it need not follow the privacy law’s provisions. These exemptions can be justified on broad grounds, such as “in the interests of sovereignty and integrity of India, security of the State, friendly relations with foreign States, maintenance of public order or preventing incitement to any cognizable offence relating to any of these”. The government can also exempt a type of “data fiduciary”, or company, from following the law’s privacy-protecting clauses. 

“The draft law grants exemptions to both government agencies and private entities,” explains Maheshwari. “The government’s powers are now so broad that it can potentially access any kind of data on wide grounds that can be invoked in practically most scenarios. There is no independent oversight mechanism in the law to ensure that the provisions aren’t being invoked too liberally.”

The bottom line is that the law already grants law enforcement agencies fairly wide powers to access data from private entities. “Under the draft data protection law, these powers are only likely to be reinforced instead of limited,” Jauhar concludes. For FRT-related data specifically, this opens up a pandora’s box of harms.

When and how does FRT harm ordinary people?


“The Dahua system doesn’t always capture the gender correctly,” Ashish casually tells me during our conversation. During my two visits to the RCB Bar & Café, staff were present at the registration desk to welcome and view the customers in-person. 

So, in this limited case, incorrect gender readings need not matter very much—as the RCB Bar & Café claims that the system is only used to check customer temperature before allowing entry into the premises. 

But, in other situations, poor readings can have serious consequences. If erroneous information is derived by the system and stored in a database, it can trigger a domino effect of incorrect information being used by government agencies. 

“FRT systems have been critiqued for not analysing images correctly,” Bedi explains. This can be because of a training bias—the algorithms are fed with millions of images which they use to learn how to recognise different physical attributes. Sometimes, they’re trained largely using images of white faces, instead of black or brown faces. As a result, they can’t interpret faces of colour correctly and arrive at biased, incorrect conclusions that harm these groups, especially during legal investigations. Sometimes it’s a question of how the picture is taken—the lighting, angle, and background also help determine accurate inferences. 
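The training bias Bedi describes can be made concrete with a simple audit: compare a model’s error rate across demographic groups. The figures below are entirely invented for illustration—real audits of commercial FRT systems follow the same shape, tallying errors per group and comparing the rates:

```python
# Hedged illustration of a per-group error-rate audit. The audit data
# is made up; the point is the disparity the calculation exposes.
from collections import defaultdict

def per_group_error_rate(records):
    """records: list of (group, predicted_correctly) pairs.
    Returns each group's fraction of incorrect predictions."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Invented audit: the model errs six times as often on darker-skinned faces.
audit = ([("lighter_skin", True)] * 95 + [("lighter_skin", False)] * 5 +
         [("darker_skin", True)] * 70 + [("darker_skin", False)] * 30)

print(per_group_error_rate(audit))
# → {'lighter_skin': 0.05, 'darker_skin': 0.3}
```

A model can look accurate “on average” while failing badly for one group—which is exactly the harm that surfaces when such systems feed legal investigations.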

The harms of inaccurate outcomes particularly arise if this information is used in law enforcement contexts, says Jauhar. “There are two possible results: false negatives and false positives. An example of a false negative is when the machine doesn’t recognise that someone is wearing a mask, even though they are. A false positive is when a system identifies a person as someone they’re not. They could be matched with a person of interest, a criminal, or terrorist [and then fall under a law enforcement agency’s scanner]. Both cases can impact someone’s individual life, liberty, and freedom in varying degrees.”
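Jauhar’s false positives and false negatives fall directly out of how most face-matching systems work: a probe face is reduced to a numeric embedding and compared against a watchlist using a similarity threshold. The toy sketch below—invented three-dimensional “embeddings” standing in for real face vectors, and a hypothetical threshold—shows how a similar-looking innocent person can clear the threshold while a poor photo of an actual match falls below it:

```python
# Toy sketch of threshold-based face matching. Embeddings, names, and
# the 0.8 threshold are all invented for illustration.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match(probe, watchlist, threshold=0.8):
    """Return watchlist names whose embedding is 'close enough' to the probe."""
    return [name for name, emb in watchlist.items()
            if cosine_similarity(probe, emb) >= threshold]

watchlist = {"person_of_interest": [0.9, 0.1, 0.2]}

innocent_but_similar = [0.88, 0.12, 0.21]  # clears the threshold: false positive
actual_match_poor_photo = [0.2, 0.9, 0.4]  # falls below it: false negative

print(match(innocent_but_similar, watchlist))     # → ['person_of_interest']
print(match(actual_match_poor_photo, watchlist))  # → []
```

The threshold is a policy choice, not a technical given: lowering it produces more false positives, raising it more false negatives—and either kind of error carries the consequences Jauhar describes.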

For Maheshwari, the very existence of these digital eyes across India may cause people to instinctively err on the side of caution. 


“At the deployment level, most people don’t know when they’re being surveilled,” says Maheshwari. “But, they’re also reading constantly that these technologies are being deployed across public spaces. They may end up assuming that they are being surveilled—which automatically impacts their privacy, and their ability to speak and express themselves freely. A kind of self-censorship in public spaces is produced because of how pervasive these technologies are.”

Aside from individual concerns, the lack of a privacy law imposing responsibilities to collect data sensibly also raises larger social harms. “The harms [of insufficiently protected data] can ultimately go up to the level of affecting national security,” Maheshwari surmises. 

Are there national security concerns to using Chinese surveillance equipment?

The RCB Bar & Café isn’t the only entity in India using a Chinese surveillance system. Dahua, and many other Chinese-owned and manufactured surveillance systems companies, appear to enjoy a sizable customer base in the country. 

The government has also purchased around 10 lakh Chinese CCTV cameras, which are installed across India’s government institutions.

However, over in the United States, future imports of Dahua’s video surveillance equipment were recently banned. According to the Department of Commerce, its gear poses “unacceptable risks” to the United States’ national security, as does surveillance equipment manufactured by other Chinese companies like Huawei, Hikvision, Hytera, and ZTE. The United Kingdom also announced similar measures recently, which prevent government authorities from installing technology produced by companies subjected to China’s National Intelligence Law.


“China’s National Intelligence Law, passed in 2017, essentially authorises the government to demand information from private security on national security grounds,” explains Manoj Kewalramani, Fellow-China Studies and the Chairperson of the Indo-Pacific Studies Programme at the Takshashila Institution. “Countries [buying Chinese surveillance equipment] end up deciding to keep it out of their sensitive domains as a result of this.” 

The Indian government is aware of these concerns—with ministers acknowledging in Parliament the vulnerabilities of video data captured by CCTV cameras being transferred to foreign servers. The government also introduced restrictions on the use of Chinese tech in the telecom sector a few years ago following Indo-Chinese border skirmishes. 

However, in the absence of a national cybersecurity framework that can deal with these concerns, the government’s larger approach to the national security concerns raised by China has been to ban specific Chinese apps and services. 

Banning Chinese surveillance gear is “meaningless” though as India has financial limitations too, argues Kewalramani. Security risks also depend on where and how the data is stored. “I think these systems could be kept out of sensitive [Central government] areas. But for that, you first have to define parameters for critical or sensitive areas,” he says. “We also need a law that provides clarity on data storage, access, and security, so the company can actually implement these practices. That can actually place a responsibility on companies [to use these systems carefully]. It may also open the doors for a new data audit industry.”

Is this a Church Street-specific problem—or a larger issue with how India approaches emerging technologies?

Given these many harms, the RCB Bar & Café could and should consider using a less intrusive, camera-less temperature-checking system—like a simple handheld thermometer. If it continues to rely on the FRT system, then best practice demands that customers be informed beforehand about the system’s harms, why they are being recorded by the Dahua system, how the data will be used, and where it will be stored. Only when they consent to these conditions should they have to appear before the Dahua camera. Less invasive solutions are especially needed given that COVID-19 cases are declining—and that the restaurant already has over 28 cameras watching for security issues. 


In the meantime, developing regulation that guards against the wanton use of FRT across India is the need of the hour.

Jauhar doesn’t believe that a law regulating FRT—or the artificial intelligence empowering it—should necessarily be grouped with privacy concerns. “There are overlaps between surveillance and privacy. But, an algorithm’s biases when predicting something have nothing to do with privacy,” he says. “Instead, a data protection law can address how sensitive data like biometrics is to be collected and processed, but other design flaws or accuracy risks affiliated with an AI system must be addressed in a separate law or governance framework.”

“A law ring-fencing FRT should ideally limit its use to very specific cases,” suggests Bedi. “It can’t be used for mass targeting.” 

Before we get to this stage, Maheshwari recommends fuller investigations of the pros and cons of FRT systems. “We need to look at their human rights impacts. Once we shift the lens here, we’ll get a better idea of how to mitigate these harms and rights violations,” she says. “We also need an independent regulator who ensures that proper checks and balances are in place against the public or private use of surveillance technology.” 

The proportionality test also needs to be defined, argues Jauhar. “We need a precise benchmark that both sets out the scope of usage of FRT, and ensures that that standard is not breached,” he says. “That scope creep, in my opinion, is one of the reasons there is a trust deficit when it comes to FRT.” As much as possible, FRT also needs to be deployed voluntarily, he adds. “Human alternatives performing the same functions should be available. They should be legislatively prescribed to do this work competently and diligently, and to avoid replicating the system’s biases. Effective redressal systems for aggrieved Indians [at the receiving end of FRT] also need to be set up. If things go wrong, people shouldn’t be left helpless.”

This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original. Note: this article was updated on 19/12/2022 at 1:45 pm to address clarity issues. 


Written By

I'm interested in stories that explore how countries use the law to govern technology—and what this tells us about how they perceive tech and its impacts on society. To chat, for feedback, or to leave a tip: aarathi@medianama.com
