Eurostar passengers will soon be able to go through a face verification process that lets them walk through a “facial biometric corridor” to board a train without showing a passport or other travel documents. The solution is being developed by British technology company iProov in partnership with Eurostar and Canadian travel company WorldReach Software, and will first go live at London’s St Pancras International station by the end of March 2021. The Financial Times, which first reported the development, said the system will be opt-in.

Before travelling, passengers will have to scan their identity documents using the Eurostar app and, according to the FT, upload an image of their face to verify their identity. The facial biometric check uses “controlled illumination” to authenticate a person’s identity, iProov said; the illumination also ensures that the image contains a real person rather than a photo, video, mask, or even a deepfake. After authentication, passengers will receive a confirmation that their identity document has been secured, after which they won’t need a physical copy of their ticket or passport until they reach their destination. Passengers who don’t have a smartphone can carry out a similar process at a kiosk in the station.

Several aspects of iProov’s facial verification algorithm remain unclear, such as its accuracy rate and the dataset on which its facial recognition model was trained.

The problem with facial recognition tech: The announcement comes as facial recognition technology has come under fire, especially in the US, for being biased against people of colour and other underrepresented communities. The tool gives unprecedented power to law enforcement agencies and can potentially lead to racial profiling and targeting. Following the backlash against the technology in the US after the murder of George Floyd, IBM announced that it would stop selling its “general purpose” facial recognition system, while Amazon and Microsoft announced moratoriums on selling their respective systems to US police, calling for federal regulation. Apart from that, there are also questions about the accuracy of such systems, which have often been found to malfunction. A case in point: Brussels Airport scrapped its facial authentication system because it frequently misidentified people.

Meanwhile in India, facial recognition systems are already active at several major airports, including Delhi airport, where more than 2,600 people had opted to go through the system as of last year. These systems have been installed under the DigiYatra initiative at access points across airports, including terminal entry, security check entry, and boarding gate entry.

  • The National Crime Records Bureau (NCRB) is inviting bids to create a national-level Automated Facial Recognition System (AFRS), which is expected to be the foundation for “a national level searchable platform of facial images”. The department is planning to deploy such a system even though India does not yet have a data protection law.
  • Earlier this year, we reported that the Indian Railways is in the process of installing Video Surveillance Systems (VSS), equipped with a facial recognition system, in 983 railway stations across the country. In fact, South Western Railway is planning to implement this system at its railway stations from February 2020.
  • Telangana’s election commission piloted a facial recognition app in its civic elections on January 22, claiming that it could address the issue of voter impersonation. The All India Majlis-E-Ittehadul Muslimeen (AIMIM) had urged the state’s election commissioner to withdraw the system, to no avail.
  • Certain police forces in India are already using facial recognition systems to scan people at large gatherings, with the most recent instance of its deployment being at Prime Minister Narendra Modi’s rally in Delhi, in December 2019.
  • Police in Telangana have been asking “suspects” for their fingerprints and facial data to match against a database of criminals, although, as we had earlier reported, these cases often don’t involve an executive order or explicit consent from the person whose biometric data is being demanded.