UK Court rules Welsh Police’s use of facial recognition tech unlawful

In a landmark judgment, the UK’s Court of Appeal on Tuesday ruled that South Wales Police’s use of facial recognition technology was unlawful, as it breached privacy rights, data protection law and equality law. South Wales Police has confirmed to the court that it will not appeal against the judgment.

With this, the Court of Appeal overturned a September 2019 ruling by the High Court in Cardiff, which had held that South Wales Police’s (SWP) use of facial recognition technology was lawful and complied with both the Human Rights Act and the Data Protection Act 2018. The case was brought by Cardiff resident Ed Bridges, who was supported by civil liberties organisation Liberty.

What the court held: The Court held that although the legal framework comprised primary legislation (DPA 2018), secondary legislation (The Surveillance Camera Code of Practice), and local policies promulgated by SWP, there was no clear guidance on where AFR Locate could be used and who could be put on a watchlist.

It also said that the High Court was wrong to hold that SWP had provided an adequate data protection impact assessment (DPIA), as required by Section 64 of the DPA 2018; the court found the DPIA submitted by the police to be deficient.

South Wales Police also did not comply with the Public Sector Equality Duty (PSED). The court held that the purpose of the PSED is to ensure that public authorities give thought to whether a policy will have a potentially discriminatory impact. “SWP erred by not taking reasonable steps to make enquiries about whether the AFR Locate software had bias on racial or sex grounds”, the court held, while noting that there was no clear evidence that the AFR Locate software was in fact biased on the grounds of race and/or sex.


How the case started: This case concerned AFR Locate, developed by NEC Corporation, which, when deployed, takes “digital images of faces of members of the public” from live CCTV feeds and processes them in “real time to extract biometric information”. This information is then compared with the facial biometric information of people on a watchlist.
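NEC has not published the internals of AFR Locate, but the general pattern described above, extracting a biometric template from each detected face and comparing it against the templates of people on a watchlist, can be illustrated with a minimal sketch. Everything in it (the toy `embed_face` helper, the cosine-similarity match, the 0.6 threshold) is a hypothetical stand-in, not the actual system.

```python
import numpy as np

def embed_face(face_image: np.ndarray) -> np.ndarray:
    """Toy stand-in for a face-embedding model: flatten and unit-normalise the crop.
    A real deployment would use a trained neural network here."""
    vec = face_image.astype(np.float64).ravel()
    return vec / (np.linalg.norm(vec) + 1e-9)

def match_against_watchlist(face_image, watchlist, threshold=0.6):
    """Compare one detected face against each watchlist entry.

    watchlist: list of (person_id, template) pairs built from reference images.
    Returns the best-matching person_id if its similarity clears the threshold,
    otherwise None (the probe template would then be discarded).
    """
    probe = embed_face(face_image)
    best_id, best_score = None, -1.0
    for person_id, template in watchlist:
        score = float(np.dot(probe, template))  # cosine similarity of unit vectors
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

# Example with dummy data: two "watchlist" face crops and one probe frame.
watchlist = [("person-1", embed_face(np.random.rand(32, 32))),
             ("person-2", embed_face(np.random.rand(32, 32)))]
print(match_against_watchlist(np.random.rand(32, 32), watchlist))
```

In a live deployment, a comparison of this kind would run for every face detected in every frame of the CCTV feed, which is why Bridges argued the technology could affect the rights of thousands of people.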

Bridges brought the case after he believed SWP had scanned his face twice. He argued that:

  1. there were no proper legal safeguards governing the use of AFR technology, which compares individuals’ biometric data against a police database;
  2. scanning people’s faces using FRT is akin to taking someone’s DNA or fingerprints without their consent; and
  3. the deployment of facial recognition technology could affect the human rights of thousands of people.

What they said: “This technology is an intrusive and discriminatory mass surveillance tool. For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance,” said Ed Bridges following the ruling by the Court of Appeal.

“The Court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties. Facial recognition discriminates against people of colour, and it is absolutely right that the Court found that South Wales Police had failed in their duty to investigate and avoid discrimination,” said Liberty’s lawyer Megan Goulding. “It is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned.”

UK and Australia’s joint investigation into Clearview AI: The governments of Australia and the UK, in July, opened a joint investigation into controversial facial recognition company Clearview AI. The Information Commissioners of the two countries will investigate the “personal information handling practices” of Clearview AI, focusing on its use of ‘scraped’ data and biometrics of individuals.
