Despite the UN Human Rights Office recently calling for a moratorium on the deployment of facial recognition technology in public spaces, Moscow has rolled out a facial recognition-based payment system for its metro. The system, called ‘Face Pay’, went live across 240 metro stations in the Russian capital on October 15, according to The Guardian.
Government officials claimed that the system is the ‘largest use of FRT’ in the world, even as Russian privacy campaigners raised surveillance concerns. While the government says the photographs collected will be encrypted, activists believe the data could nonetheless end up in the hands of Russian security services.
Read: Data Leaks – Trading Internal Control For External Vulnerability: Russian Edition
How will this system work?
According to the report, commuters on the metro will have their faces scanned by cameras installed above the turnstiles at the station. To use Face Pay, commuters must link their metro cards to their photograph and bank account on Mosmetro’s mobile app.
Moscow’s department of information technology says that the photos collected through the FRT system will be stored in encrypted form in the GIS ETSHD system (Moscow’s Unified Data Storage and Processing Center) and will not be shared with police authorities. The system is not mandatory; Moscow authorities expect a 10-15% adoption rate among metro riders in the next two to three years.
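For readers curious about the mechanics, the enrollment-and-charge flow described above can be sketched in a few lines of Python. Mosmetro has not published its implementation, so everything here is hypothetical: the class and method names, the plain-string “face template” standing in for a biometric match, and the in-memory store standing in for the encrypted GIS ETSHD backend.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class FacePayRegistry:
    """Toy registry linking a face template to a metro card and bank account."""
    accounts: Dict[str, dict] = field(default_factory=dict)

    def enroll(self, face_template: str, metro_card: str, bank_account: str) -> None:
        # The real system stores photos encrypted server-side; this sketch
        # simply keys records on the template string.
        self.accounts[face_template] = {
            "metro_card": metro_card,
            "bank_account": bank_account,
        }

    def charge_at_turnstile(self, scanned_template: str, fare: int) -> Optional[str]:
        """Return the debited bank account, or None if the rider isn't enrolled."""
        record = self.accounts.get(scanned_template)
        if record is None:
            return None  # rider falls back to a card or ticket
        # A real deployment would debit via a payment gateway;
        # here we just report which account would be charged.
        return record["bank_account"]


registry = FacePayRegistry()
registry.enroll("template-anna", metro_card="MC-1001", bank_account="BA-9001")
print(registry.charge_at_turnstile("template-anna", fare=46))     # → BA-9001
print(registry.charge_at_turnstile("template-unknown", fare=46))  # → None
```

The sketch also shows why the system is opt-in by design: a rider who never enrolls simply produces no match at the turnstile and pays as before.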
What’s the problem with FRT?
Moscow already has a network of over 1,00,000 CCTV cameras across the city, and according to reports, it has previously used such systems to crack down on protestors. “This is a dangerous new step in Russia’s push for control over its population,” Stanislav Shakirov, the founder of Russia-based digital rights group Roskomsvoboda, told The Guardian.
In September, a UN report had flagged the use of AI for remote biometric recognition, saying that it interferes with the fundamental rights to privacy and the freedoms of movement and expression.
“The power of AI to serve people is undeniable, but so is AI’s ability to feed human rights violations at an enormous scale with virtually no visibility. Action is needed now to put human rights guardrails on the use of AI, for the good of all of us,” UN High Commissioner for Human Rights Michelle Bachelet had said while releasing the report.
At PrivacyNama 2021, Jhalak Kakkar of CCG-NLU Delhi had said that FRT systems can perpetuate bias and discrimination. “We use biased training data sets, which overall under-represent minority communities, and then AI systems make decisions based on their training on these data sets which sort of perpetuate and really embed historical bias and discrimination into our society,” Kakkar said.
Expressing concern that AI would reinforce debunked ways of thinking, Mark Andrejevic, a professor at Monash University, said, “I think it’s going to be very important to think about these inferential uses and how they can be used for new forms of social sorting and discrimination.”
Facial recognition at Indian airports and railway stations
Use at airports: In 2018, the Indian government launched DigiYatra, an FRT-based system for automatic processing of passengers’ identity at airport check points. In June, the Ministry of Civil Aviation revealed in the Lok Sabha that the system was under trial at six airports in the country: Bengaluru, Hyderabad, Kolkata, Pune, Varanasi, and Vijayawada. Successful implementation there would lead to DigiYatra’s expansion to all airports across the country, the MoCA had said then.
Use at railways: 310 railway stations in India have FRT-enabled CCTV cameras, according to the Ministry of Railways’ response in Parliament in August. The ministry plans to expand this network to 938 railway stations across the country.
Also read:
- Facial Recognition-equipped CCTVs installed at 310 railway stations: IT Minister
- More than 2,600 passengers have signed up for getting their faces scanned at Delhi airport
- Air Asia starts facial verification of passengers at Bengaluru International Airport
I cover health technology for MediaNama but, really, love all things tech policy. Always willing to chat with a reader! Reach me at anushka@medianama.com
