In a landmark judgment, the UK Court of Appeal on Tuesday ruled that the use of facial recognition technology by South Wales Police (SWP) was unlawful, as it breached privacy rights, data protection law and equality law. SWP has confirmed to the court that it will not appeal against the judgment.

With this, the Court of Appeal overturned a September 2019 ruling by the High Court in Cardiff, which had held that SWP’s use of facial recognition technology was lawful and complied with both the Human Rights Act and the Data Protection Act 2018 (DPA 2018). The case was brought by Cardiff resident Ed Bridges, who was supported by the civil liberties organisation Liberty.

What the court held: The Court held that although the legal framework comprised primary legislation (the DPA 2018), secondary legislation (the Surveillance Camera Code of Practice) and local policies promulgated by SWP, there was no clear guidance on where AFR Locate, SWP’s automated facial recognition tool, could be deployed or on who could be put on a watchlist.

It also said that the High Court was wrong to hold that SWP had provided an adequate data protection impact assessment (DPIA), as required by section 64 of the DPA 2018. The court held that the DPIA submitted by the police was deficient.

SWP also failed to comply with the Public Sector Equality Duty (PSED). The court held that the purpose of the PSED was to ensure that public authorities give thought to whether a policy has a potentially discriminatory impact. “SWP erred by not taking reasonable steps to make enquiries about whether the AFR Locate software had bias on racial or sex grounds”, the court held, while noting that there was no clear evidence that the software was in fact biased on the grounds of race and/or sex.
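For a sense of what such an enquiry could look like in practice, the sketch below measures whether a matcher’s false match rate differs across demographic groups on a labelled test set. It is purely illustrative: the data is invented and the procedure is not SWP’s or NEC’s.

```python
# Hypothetical sketch: compare false match rates across demographic groups.
# Neither the data nor the procedure comes from SWP or NEC; this only
# illustrates the kind of enquiry the court said was never made.
from collections import defaultdict

def false_match_rate_by_group(trials):
    """trials: (group, alert_raised, actually_on_watchlist) tuples."""
    false_matches = defaultdict(int)
    negatives = defaultdict(int)
    for group, alert_raised, on_watchlist in trials:
        if not on_watchlist:          # only people NOT on the watchlist
            negatives[group] += 1
            if alert_raised:          # ...who nevertheless triggered an alert
                false_matches[group] += 1
    return {g: false_matches[g] / negatives[g] for g in negatives}

# Invented test data: a disparity like this would warrant investigation.
trials = (
    [("group_a", True, False)] * 2 + [("group_a", False, False)] * 98
    + [("group_b", True, False)] * 8 + [("group_b", False, False)] * 92
)
print(false_match_rate_by_group(trials))  # {'group_a': 0.02, 'group_b': 0.08}
```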

How the case started: The case concerned AFR Locate, developed by NEC Corporation, which, when deployed, takes “digital images of faces of members of the public” from live CCTV feeds and processes them in “real time to extract biometric information”. This information is then compared with the facial biometric information of people on a watchlist.
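To make that pipeline concrete, here is a minimal, illustrative sketch of the matching step in Python. The embedding function is a toy stand-in, since NEC’s actual algorithm is proprietary; the threshold and all names are invented for the example.

```python
# Illustrative sketch of live face matching against a watchlist.
# extract_template is a toy stand-in for a real face-embedding model;
# NEC's actual matching algorithm is proprietary.
import numpy as np

SIMILARITY_THRESHOLD = 0.6  # hypothetical operating threshold

def extract_template(face_image: np.ndarray) -> np.ndarray:
    # Toy "biometric template": centre and L2-normalise the pixels so
    # that dot products behave like cosine similarity.
    vec = face_image.astype(np.float64).ravel()
    vec -= vec.mean()
    return vec / (np.linalg.norm(vec) + 1e-12)

def match_against_watchlist(face_image, watchlist):
    """Return the best-matching watchlist ID, or None if below threshold."""
    probe = extract_template(face_image)
    best_id, best_score = None, SIMILARITY_THRESHOLD
    for person_id, template in watchlist.items():
        score = float(probe @ template)  # cosine similarity of unit vectors
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id  # None: no alert is raised for this face

# Usage: the same face matches; an unrelated face does not.
rng = np.random.default_rng(0)
known_face = rng.random((64, 64))
watchlist = {"suspect_1": extract_template(known_face)}
print(match_against_watchlist(known_face, watchlist))            # suspect_1
print(match_against_watchlist(rng.random((64, 64)), watchlist))  # None
```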

Bridges had brought the case after he believed his face was scanned by SWP on two occasions. He argued that:

  1. There were no proper legal safeguards governing the use of AFR technology, which compares individuals’ biometric data against a police database.
  2. Scanning people’s faces using facial recognition technology is akin to taking someone’s DNA or fingerprints without their consent.
  3. Deploying facial recognition technology could affect the human rights of thousands of people.

What they said: “This technology is an intrusive and discriminatory mass surveillance tool. For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance,” said Ed Bridges following the ruling by the Court of Appeal.

“The Court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties. Facial recognition discriminates against people of colour, and it is absolutely right that the Court found that South Wales Police had failed in their duty to investigate and avoid discrimination,” said Liberty’s lawyer Megan Goulding. “It is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned.”

UK and Australia’s joint investigation into Clearview AI: The privacy regulators of Australia and the UK opened a joint investigation in July into controversial facial recognition company Clearview AI. The Information Commissioners of the two countries will investigate Clearview AI’s “personal information handling practices”, focusing on its use of ‘scraped’ data and the biometrics of individuals.