Members of the United States Congress on Thursday introduced a bill that proposes to prohibit federal agencies and officials from acquiring, possessing, accessing, or using “any biometric surveillance system”, such as facial recognition technology. The bill also stops federal agencies from using information derived from such systems when they are operated by other entities. It further proposes to withhold federal funding through the Byrne grant program from state and local governments that use the technology.

Called the Facial Recognition and Biometric Technology Moratorium Act of 2020, the bill has been drafted by Senators Ed Markey and Jeff Merkley, and Representatives Ayanna Pressley and Pramila Jayapal. The bill follows an incident in Detroit where a black man was wrongfully arrested by the police based on an incorrect match produced by a facial recognition system. It also comes only days after the city of Boston banned the use of facial recognition technology by the city’s government and prohibited any city official from obtaining the technology through third parties. Senator Merkley has previously co-drafted a similar bill seeking a moratorium on the use of facial recognition technology.

“Introduced on the same day that the House is set to pass the George Floyd Justice in Policing Act, our legislation will not only protect civil liberties but it will aggressively fight back against racial injustice by stopping federal entities from using facial recognition tools and stripping support for state and local law enforcement departments that use biometric technology,” Congresswoman Jayapal said in a statement.

Provisions of the bill

Information from biometric systems can’t be used in judicial proceedings: The bill prohibits the use of information collected via biometric technology in violation of the Act in any judicial proceeding. It also gives citizens aggrieved by a violation of the bill the right to initiate legal proceedings against a federal official who used biometric surveillance on them. The bill further allows the chief law enforcement officer of a state to bring a civil action on behalf of residents affected by the use of biometric surveillance on them.

Penalties for violating the bill: Any federal official found to have violated the bill may be subject to retraining, suspension, termination, or any other penalty, as determined by an appropriate tribunal.

Bars law enforcement agencies from using federal funds to buy biometric surveillance systems: No federal funds may be used by a federal law enforcement agency for the purchase or use of biometric surveillance systems. Such agencies are also barred from using unallocated funds already appropriated to them.

Exceptions: The only time federal agencies can use facial recognition or other biometric surveillance tools is when they have been explicitly authorised to do so by an Act of Congress. Even then, any such Act will have to specify the entities permitted to use the biometric surveillance systems, the types of biometric data authorised, the purposes for such use, and any prohibited uses. It will also have to define standards for data retention, sharing, access, and audit trails, among other things.

  • The US’ National Institute of Standards and Technology can continue testing and researching biometric surveillance systems or other remote biometric recognition technologies that are in commercial use.

The protests in the US against racial discrimination, following the murder of George Floyd at the hands of police, have forced companies to take a stand against facial recognition, especially because the technology is known to be biased, particularly against people of colour and other underrepresented communities. Microsoft has said that it will not sell the technology to police in the US until there is a federal law regulating its use, while Amazon has committed to doing the same, albeit only for a year. IBM has said that it will stop offering “general-purpose facial recognition and analysis software” altogether. Amazon’s facial recognition tool Rekognition, for instance, once falsely matched 28 members of Congress with mugshots of people who had been arrested. Research has consistently shown that facial recognition tools are worse at detecting and identifying the faces of darker-skinned people, creating ample room for discrimination and persecution.