London’s Metropolitan Police Service (Met) said on January 24 that it will start using live facial recognition at “specific locations” in the city. The police said the system will verify individuals’ identities against a “bespoke watchlist” of people on its radar. This watchlist would include people wanted for serious offences, including knife and gun crime, as well as those with outstanding warrants, the police claimed. The police further claimed that each deployment of the system will have its own legitimate purpose, legal basis, and justification covering necessity and proportionality.

Japan’s NEC Corporation has helped the London police develop the technology.

The Met’s announcement comes a week after the European Commission announced its plans to ban facial recognition technology for five years. It is also worth mentioning that an independent assessment of the Met’s facial recognition system in July last year found that the system got only 8 of 42 matches right, an accuracy rate of 19%. The assessment also said that the criteria for including people in the “watchlist” were not clearly defined, and that there was significant ambiguity over the categories of people the system sought to identify. In September 2019, the High Court in Cardiff, UK, ruled that it was lawful for the South Wales Police to use facial recognition technology to search for people in crowds, as it did not breach human rights or data protection laws.

Normalisation of facial recognition? The Met’s Assistant Commissioner Nick Ephgrave said, “We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point.” He added that similar technology is already widely used across the UK in the private sector. He also claimed that the Met’s “careful and considered” deployment of the facial recognition systems would ensure that people’s privacy and human rights are protected.

Explicit consent, or inferred consent? The facial recognition system will be deployed with “clear signage” to alert members of the public that it is in operation, the Met said. The police will publicise details of deployments online in advance, both on its corporate website and through local borough communications channels. It further said that when the surveillance system generates an “alert,” it would be at the discretion of a police officer whether to engage with the member of the public concerned. However, the Met did not clarify whether it will seek explicit consent from the public at large before scanning their faces, or whether the “clear signage” amounts to inferred consent.

  • The police claimed that these facial recognition systems would be standalone systems, not linked to any other imaging system such as CCTV, body-worn video, or ANPR (automatic number-plate recognition).

Retention of biometric data: If a person does not match the Met’s watchlist, their biometric data will be immediately and automatically deleted. When the system generates an alert, the biometric data of that person will be retained for evidential purposes. If a match does not lead to a prosecution, or “there is no legitimate policing purpose for retention,” the biometric data and corresponding CCTV footage will be retained for 31 days, the police said. This retention would allow the Met to carry out a post-deployment assessment of the system’s effectiveness, it said.

  • Should information be required to investigate a complaint, the Met would hold any data relevant to that complaint as per its complaints procedure, following which that data will be deleted, it said.
  • For any longer-term police investigation or judicial processes, biometric data and CCTV footage will be retained as evidence for investigative purposes, and will be deleted after the conclusion of the investigation or judicial proceedings.
  • The Information Commissioner’s Office in the UK said that an appropriately governed, targeted and intelligence-led deployment of LFR (Live Facial Recognition) may meet the threshold of strict necessity for law enforcement purposes. It also said that the UK government should introduce a statutory and binding code of practice for LFR as a matter of priority.

Silkie Carlo, director of Big Brother Watch, a UK-based human rights advocacy group, said that the Met’s use of facial recognition systems is an “enormous expansion of the surveillance state” and poses a serious threat to civil liberties in the UK.

Certain cities in the USA have banned facial recognition systems: While the use of facial recognition systems continues to gain prominence, a handful of cities in the USA have banned their use, including San Francisco, Oakland, Cambridge, Berkeley, and Somerville.

Police in India are already using facial recognition systems: Certain police forces in India are already using facial recognition systems to scan people at large gatherings; the most recent instance of such deployment was at Prime Minister Narendra Modi’s rally in Delhi in December 2019. The very legality of the government’s use of facial recognition systems has also been questioned in India.

  • Police in Telangana have been asking “suspects” for their fingerprints and facial data to match against a database of criminals, although, as we had earlier reported, these cases often don’t involve an executive order or explicit consent from the person whose biometric data is being demanded.