Amazon is halting police use of its controversial facial recognition system, Rekognition, for a year, the company announced in a brief blog post on June 10. The company did not reveal any details about its plans for Rekognition during the moratorium, but said the pause "might give Congress enough time to implement appropriate rules" and that it "stand[s] ready to help if requested". During the moratorium, Amazon will continue providing the software to rights organisations such as Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics.

Amazon’s announcement comes just two days after IBM said it would stop offering “general purpose facial recognition and analysis software”. However, unlike IBM, which cited concerns about mass surveillance and racial profiling, Amazon did not give an explicit reason for its decision to stop selling Rekognition to the police for a year.

What about law enforcement agencies that are not the police? Amazon did not explicitly say whether it will pause sales to other law enforcement agencies during the moratorium. This is important because Rekognition is already licensed to a number of law enforcement agencies in the US. Nor is it clear whether Amazon will stop developing the facial recognition system during the moratorium, or whether it will stop selling the tool to police outright during the year-long period, as opposed to merely pausing its use. It is also possible that the moratorium doesn’t mean much, since the company has never revealed how many police departments use the service; in fact, until last year, only one police department, in Oregon, was reportedly using it, and even that department has since stopped.

What happens to Ring’s facial recognition plans? Police departments across the US have been partnering with Ring, Amazon’s security doorbell camera business, and have even been endorsing the product. The company revealed last year that it had contemplated adding a facial recognition feature to its Ring doorbell cameras. It is not clear where that plan stands now, or whether the company will continue working on it during the moratorium period.

Amazon Rekognition has been shown to be inaccurate: Amazon came under fire for its facial recognition tool in 2018, when the American Civil Liberties Union (ACLU) showed that Rekognition had incorrectly matched 28 members of Congress to mugshots of people who had been arrested. Research has broadly shown that facial recognition tools are worse at detecting and identifying the faces of darker-skinned people, creating ample room for discrimination and persecution. The ACLU, while welcoming the move, said that Amazon must “commit to a full stop of its face recognition sales until the dangers can be fully addressed. It must also urge Congress and legislatures across the country to pause law enforcement use of the technology”.

Closer home in India, the Vadodara City Police is planning to use Clearview AI’s controversial facial recognition software in public places such as railway stations and bus depots, and to track “property offenders”.