Deep Dive: A plea against “coerced” FRT attendance, and the harms of workplace surveillance

The proliferation of facial recognition tech (FRT) without oversight or guardrails has major implications for privacy, freedoms, and dignity itself

Imagine that your workplace suddenly announces that staff attendance will be recorded using facial recognition technology (FRT). Not only does the administration mandate this form of attendance, it also declares that salaries will be determined on the basis of this system. Would you protest? For Professor Suvijna Awasthi of Chhatrapati Shahu Ji Maharaj University, the answer was “yes.”

Dr. Awasthi’s opposition to such use of her biometric data first came to light in October 2022, when the Allahabad High Court heard her plea. She accused the new system of violating individuals’ right to privacy under Article 21 (Right to Protection of Life and Liberty), and argued that there was no legal provision allowing the university to use biometric records for attendance. Of course, neither Dr. Awasthi nor her legal counsel could have anticipated the new provisions introduced in the Digital Personal Data Protection Bill (DPDP), 2022 on November 18. However, Dr. Awasthi’s pointed rejection of the university order raises the question of whether FRT is a suitable system for educational institutions, or for any workplace.

Why it matters: For better or worse, FRT and similar surveillance technologies are becoming popular with government and law enforcement authorities. Even before this university, schools in Andhra Pradesh adopted FRT to mark teacher attendance, and there too, the move faced friction from teachers. However, this is the first plea to refuse FRT-based attendance on the grounds of the right to privacy. Not only does this make it a unique case, it also indicates growing public awareness of data protection and privacy.


University switches from paper to FRT for attendance

On April 18, the university administration asked its staff to register their attendance using biometric instruments that were to be installed “in sufficient numbers” in public places. The instruments were to use FRT and thumb recognition to scan and store employee data on a centralized server. As per an order, the salary of all teaching and non-teaching staff would be based on this attendance.

“Sufficient security attendance [to] be ensured for the safety of the biometrics attendance instrument and it should be further ensured that the instrument and the software was properly functioning and should be subject to regular maintenance on day-to-day basis,” said the order.

This was among the last orders issued by the university before the professor approached the High Court.

Awasthi says biometric attendance violates personal freedom: In her plea, Dr. Awasthi said she had filed a representation against the biometric attendance registration on March 31. She argued that she was recording her attendance in this manner “with strong dissent” and called the practice a “violation of personal liberty and freedom.”

She also expressed apprehensions about threats to her biometric data, the detrimental effects of biometric attendance, the agency hired for the purpose, the storage of such data, and the privacy and security of employees.

The Vice-Chancellor responded to these concerns by saying that the biometric details would be secured using the latest technology. He also said that the chances of the data being misused ‘did not exist’ and that all central and state government offices and Public Sector Corporations use such means of attendance.

Awasthi’s plea in court for the right to privacy 

University does not have the power to use FRT: Despite the Vice-Chancellor’s response, Dr. Awasthi approached the High Court in July and argued that the university orders were passed in “an arbitrary and discriminatory exercise of powers”, violating Article 14 (Right to Equality) of the Constitution. Further, the plea claimed that the orders were passed without jurisdiction and, as such, were not referable to any provision of the UP State Universities Act, 1973.

Since the university had not received any directive to use biometrics from either the Ministry of Human Resource Development’s Department of Education or the UGC, Awasthi argued that neither the Chancellor nor the Vice-Chancellor had the power to implement such a rule.

“The respondent University neither has the means nor capacity for running the biometrics device at its own level nor is it sufficiently equipped for maintaining and protecting the data so collected,” said the petition.

Staff clueless about service provider and data storage: As per Awasthi’s plea, a private agency was given the responsibility of marking attendance through the biometric device, “so that all the data collected remain in custody of such private agency.” However, despite Awasthi’s best efforts, the university did not share details of the private agency engaged for this purpose. The petition claimed that this violated the right to privacy under Article 21 as well as the freedom of movement enshrined under Article 19(1)(d) of the Constitution.

“Collection of personal information and maintenance thereof [in] a central data base poses risk to individual privacy… use of biometrics technology amounts to a wholesale violation of the right to privacy and there does not exist any compelling State interest to justify the same,” said the plea.

Concerns about sharing of sensitive data: The plea noted that specific features like the image of a fingerprint, iris or retina are collected to build a biometric template. In the petition, Awasthi called this procedure “demeaning as [it has] distinct criminal overtones and is highly intrusive.”

Biometric data can also be considered sensitive data since it may reveal information about racial or ethnic origin, said the plea. It pointed out that the biometric ID project gathers a vast amount of correlated personal data, individual habits and behaviour, which become increasingly transparent as people continue to be monitored.

“The aforesaid is more worrisome due to potential for sharing personal data with other organizations, such as, business partners, Corporations and Governments,” said the plea.

Awasthi is not alone in voicing such concerns. Following the April order, the All India Federation of Universities and College Teachers organization also filed a representation before the Chancellor, praying for revocation of the order. However, the requests have so far been ignored.

Why is FRT being introduced in universities?

Speaking to MediaNama, Karan Tripathi, a lawyer and researcher on the criminal justice system at Oxford University, pointed out that universities in India are considered political spaces. For this reason, he said, restricting the purpose for which such technologies are used within these highly political spaces will be very difficult. There are also classic issues with FRT, such as the improper detection of people from marginalized communities or of individuals who were not part of the institution when such tools were trained.

“Let’s say someone’s wearing a hijab or someone’s wearing a burqa. Would the use of FRT then prevent such people from wearing the hijab/burqa in universities?” Tripathi argued.

Similarly, transgender and non-binary people are another group prone to misdetection by FRT systems. Tripathi pointed out that these systems are not trained on data from such communities.

“They will face recurrent issues with misdetection, especially if detection is linked with incentives that will automatically make them vulnerable to lesser or inadequate delivery of incentives, which becomes a discriminatory issue, which is an indirect discrimination,” he said.

This means that FRT and similar technologies will have a disparate impact by “disproportionately disadvantaging” these communities. Tripathi also mentioned that jurisprudence on Article 14 now also incorporates indirect discrimination and disparate impact.

FRT in university is a political move: The researcher said that the use of FRT in a university cannot be seen as a post-political use of technology. Instead, such usage is “very much political” because universities in India have always been under scrutiny by the government.

According to Tripathi, the very use of FRT in these situations indicates mistrust towards teachers. He called it “a structural mistrust towards educators that they will not turn up for classes.” Such mistrust, he stressed, leads to constant vigilance over classroom decisions.

“How does that mistrust then affect the functioning of teachers and educators and their institutions who are there? Especially in a university setup where there are a lot of students who are taught to have liberated thinking, free thinking, critical thinking. How is constant surveillance of teachers in those setups then going to affect what we imagine university pedagogy to be, which is liberal, critical, open thinking?” he asked.

Will deemed consent allow surveillance in workplaces?

In her petition, Dr. Awasthi said that she never gave her consent to the administration to record/register her biometrics. The plea said, “all such measures had been unilaterally forcing under coercion. The data so collected was without consent of the petitioner.”

Typically, an individual’s consent for the processing of their data is considered crucial. Previous versions of India’s data protection Bill upheld this principle and, in doing so, gave some leeway to Awasthi’s argument. However, data processing for employment purposes has been put under the ambit of deemed consent in the Digital Personal Data Protection Bill, 2022 (DPDP Bill), the latest iteration of the Bill.

What is deemed consent? As per the DPDP Bill, deemed consent is when a Data Principal is “deemed” to have consented to the processing of their personal data.

As per Clause 7(7) of the DPDP Bill, this applies “for the purposes related to employment, including prevention of corporate espionage, maintenance of confidentiality of trade secrets, intellectual property, classified information, recruitment, termination of employment, provision of any service or benefit sought by a Data Principal who is an employee, verification of attendance and assessment of performance.”

Expert raises concerns about function creep: Speaking to MediaNama, Professor Anupam Guha from the Ashank Desai Centre for Policy Studies, IIT Bombay, particularly criticised the “assessment of performance” mentioned in the clause. He pointed out that the unclear phrasing could mean anything.

“You could make a case in the future that putting a tracker on the phone of the teacher teaching in a school is also a valid use case. So, that is the first problem that once you start facial recognition in a workplace, you are sort of enabling a culture of surveillance without any safeguards,” said Guha.

The professor also talked about the possibility of ‘function creep’, where information collected for one purpose is then used for other purposes as well.

“So for example, if they are doing FRT in this [university] for attendance, how do we know that that data is secure and is not going to be used for any other purpose in the future?” he asked.

Even stakeholders attending MediaNama’s ‘Reworking the Data Protection Bill’ event worried about how this clause could lead to a power asymmetry in the employer-employee relationship. While some pointed out that consent can never be free in an employment situation, most agreed that the purposes mentioned in the DPDP Bill are far more open-ended than those in previous versions.

Tripathi too drew parallels between the criticism of deemed consent and the use of FRT for employment purposes in educational institutions. He said that in a structurally hierarchical society with large power asymmetries, deemed consent further widens those inequalities and increases the vulnerability of people who find themselves at the lower rungs of those power structures. The same can be said of FRT in workplaces.

FRT can hinder right to dignified life: Speaking on the Kanpur university case, Guha agreed that there are no legal provisions to prevent the administration from using FRT. However, he claimed the move violates the Puttaswamy judgment. He said that a case can be made that monitoring of people by FRT is fundamentally violative of the right to life with dignity.

Similarly, Tripathi said that the use of FRT and other technologies amplifies the magnitude and nature of data collection by universities in the absence of any legal regulation. As such, the technology will be used in a “legally ambiguous space.”

Surveillance does not ensure productivity: Meanwhile, Guha talked about how surveillance within workplaces can impact the economic rights of citizens. He pointed out that there is no study showing that workplace surveillance increases productivity. Instead, such surveillance is said to have a chilling effect on employees that dampens their agency in the workplace, making it easier to exploit and harass them.

“If you combine that with the recent dilution of labour laws and labour code during the pandemic, that creates an overall ecosystem where workers are basically harassed more, observed, surveilled, and their behaviour is unconsciously moulded by their workplace,” said Guha.

He warned that this can lead to a depression of wages and the normalization of an invasive workplace culture. Stating that this is generally bad for India’s overall economy, he stressed that FRT at workplaces should not be viewed in isolation.

FRT presence grows in public spaces

Already, FRT is used at airports under the Digi Yatra scheme. Niti Aayog, the government’s think tank, released a report on the ‘responsible use’ of FRT, indicating a growing government interest in such surveillance technology. Guha pointed out that many places are already using FRT inside and outside workplaces, offices, etc. Law enforcement agencies are also taking an interest in FRT.

The most recent example was seen on December 8, when Chennai police allegedly stopped innocent passers-by at night and photographed them. Queries on Twitter revealed that the police were running people’s faces through an FRT system. Even unassuming spaces like a Royal Challengers Bangalore (RCB) Bar and Café have started using FRT to monitor customers. Worse still, those aware of the FRT did not seem bothered about the collection and storage of such vast data.

“This is bad because once you have this ecosystem of FRT you are sort of normalizing large collections of personal data and normalizing cross use. Once you collect all this facial data it will be used at other places,” said Guha.

He pointed out that while such moves may not seem malicious on the surface, such surveillance eventually harms employees. He therefore discouraged the idea of workplace surveillance and said that policies encouraging FRT for small use cases are bound to leak into a much wider culture of workplace surveillance.


This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
