
How Can Surveillance Technologies Perpetuate Police Abuse of Powers? A report explores

The report builds on an abolitionist and progressive perspective to limit police use of surveillance tech and build greater public accountability

The citizen-assisted police surveillance of neighbourhoods through CCTV cameras under the Bhopal Eye project in Madhya Pradesh “illustrates the mundane abuse of power typical of policing,” argue No Tech For Tyrants (NT4T) researchers in their new report, “Surveillance Tech Perpetuates Police Abuse of Power”.

Across seven global case studies, the report examines how police forces use surveillance technology to prevent and solve crimes, “building on the myth that more police equals more safety”. In parallel, the researchers find that these technologies harm marginalised communities already at the receiving end of over-policing, while weakening privacy rights and democratic values.

Why it matters: The report does not view technology as the source of the problem—rather, drawing on critical race theory and abolitionist perspectives, it argues that technology amplifies historical discrimination perpetuated by police forces. “Understanding policing powers through the lens of technology-augmented abuse helps us counter impunity in the use of digital technology against protestors and marginalised communities,” add the researchers.

Surveillance technologies are often used beyond their initial purpose, introducing function creep, argue the researchers, based on trends observed across the case studies. In other cases, they are introduced secretively, for obfuscated purposes. A lack of accountability structures makes mounting legal challenges to police abuse of surveillance technology difficult, they add. This is made worse by existing laws and legal institutions, which often do not “adequately protect” citizens against surveillance abuses.

Police purchases from, and collaborations with, private entities also introduce a “profit motive to data collection” that benefits the surveillance industry. Data is increasingly collected by law enforcement agencies with no “clear purpose” in mind. However, the researchers are optimistic that resistance against surveillance is building, with legal challenges to these practices cropping up more and more.

No Tech For Tyrants (NT4T) is a UK-based collective interested in minimising technology’s role in oppressing marginalised groups. NT4T also works to dismantle “the nexus between universities, technology, and border enforcement”. The report was funded by the London School of Economics and Political Science’s “Justice and Equity Technology Table”, a collective working to “address the impacts of data-driven policing on racialised communities throughout Europe”.


Why does the Bhopal Eye point to enhanced police abuse of surveillance powers?

What is the project?: The “Bhopal Eye”—announced by the city’s police in November 2019—involves residents in detecting crime and assisting crime prevention efforts. Citizens can “easily share” live CCTV feeds from outside their homes or businesses with the police, who then purportedly use the visuals to solve and prevent crimes. Citizens can also report “suspicious” activities and individuals to the police through the app by sharing their geographical location, as well as the relevant CCTV details for the police to log in remotely and view the footage.

Bhopal follows in the footsteps of Surat, which was the first city to adopt such a system. The app was developed by the non-profit Citizen COP Foundation, whose stated goals include developing a “technological bridge between citizens and the police”.

How does it impact marginalised groups?: Citizen-led CCTV surveillance may turn the spotlight towards low-level offences and blue-collar groups, surmise the NT4T researchers. Madhya Pradesh particularly exhibits systemic policing biases against poor and marginalised groups, including Scheduled Castes, Scheduled Tribes, Other Backward Classes, and religious minorities, say the researchers, citing a Criminal Justice and Police Accountability Project report on policing during the pandemic.

Can laws and institutions protect citizens?: Back in Bhopal, citizens have little to no recourse against such surveillance, despite the Supreme Court’s 2017 verdict in Puttaswamy v Union of India upholding the Right to Privacy, particularly against government surveillance. “Law enforcement (and government surveillance more broadly) is frequently exonerated in court, even in the rare cases where its legality is challenged,” argue the researchers.

How have police in other countries abused surveillance technologies?

Using police systems within intimate relationships in the UK: Police officers can use their access to police databases to exacerbate abuse within intimate relationships. “Police officers can and do use this access for abuses such as stalking of current, former, and future partners as well as family members,” the researchers observe. For example, a married officer in Sussex used a police system to search for a woman he wanted to pursue a romantic relationship with. Using systems this way violates police procedure and, potentially, data protection laws too.

Using Pegasus to target Mexico’s civil society: Over 15,000 people were targeted by the Mexican government through NSO Group’s Pegasus spyware, including a minor, legislators, scientists, journalists, anti-corruption groups, and human rights lawyers. The chance of a fair investigation into the spyware’s use against citizens was slim—the law enforcement agency responsible for doing so was involved in procuring the spyware. “While [the] surveillance industry markets itself as providing governments with the means to investigate serious matters of crime and terrorism, their products have become a convenient tool for undermining public accountability,” argue the researchers.

Smart streetlights in San Diego: “Environmentally-friendly” smart streetlights were installed in California’s San Diego in 2017 to measure air quality and reduce electricity bills. By 2019, local law enforcement was requesting access to the street lights’ camera footage to address “serious crimes” like sexual assault, murders, and kidnappings, monitor vandalism and dumping, and surveil Black Lives Matter protestors. No formal policy or oversight mechanism was in place to govern the use of the technology for such “serious crimes”. Despite two 2020 ordinances to create a governing framework for surveillance technology purchases in San Diego, the city has continued purchasing “surveillance technology (in this case, a social media analytics service that tracks residents for ‘police’ and ‘political’ purposes) without public inquiry into its purpose or propriety”.

Facial Recognition Technology (FRT) in Rio de Janeiro: Since 2019, local police have been doubling down on facial recognition surveillance systems in the city’s low-income regions, or favelas—particularly the Jacarezinho favela. While the systems were introduced to “protect” communities, as per police reports, they also seek to “produce proofs that corroborate reality and strengthen the affirmation of police innocence in future judicial trials”. These technologies are also being deployed without consulting local communities, which are often poor, Black, marginalised, and criminalised. The police have also not discussed how the data will be stored or deleted. The racial biases of such systems can have “outsized consequences”, argue the researchers. In 2019, 90% of Brazilians arrested via FRT were Black.

Sharing surveillance footage with reality TV producers in the UK: In the 2000s, the Thames Valley Police shared surveillance footage with the “Road Wars” team—a reality show documenting police officers. Surveillance technology introduced for “public safety” was instead traded for entertainment, all the while disseminating “a police-controlled narrative of the events transpiring in the footage”. No legal remedies were in place to prohibit the police from selling the footage, lending the police more “power to shape and control the stories we consume about policing and criminality”, argue the researchers.

Palantir for personal-data processing in Denmark: The Danish government used Palantir Gotham software to run POL-INTEL—“a system to integrate and search through previously siloed [citizen and governance] databases” that can be used to make “intelligent data-based decisions”. With such a system comes the need to collect more data on citizens, which experts have warned can be used against marginalised groups. “In 2018, Denmark announced the controversially named ‘ghetto package’, which sought to intervene in areas with more non-western Danes than western Danes, because ‘[Western] Danes should not be a minority in any housing area’,” recall the researchers. “The ghetto package policies are enabled by massive databases such as the residence registrar, employment, taxation, education and criminal convictions [that eventually classify areas into ghettos].” Palantir was initially pitched as software to combat terrorism—its current usage in Denmark has not received much public resistance.

What calls for action have No Tech For Tyrants suggested?

Researchers should stop recommending intrusive and repressive policing solutions to solve “social problems”. “Aim instead for research towards a generative, abolitionist project for a world where surveillance technology isn’t necessary,” recommend NT4T.

Technologists should avoid developing surveillance technology and embed an abolitionist approach in their work. They should actively consider the long-term impacts of technologies while building them.

To reduce police power, civil society should campaign for “greater democratic oversight and community accountability measures on police purchase and use of surveillance technology”.

Important concepts undergirding the study

Abolitionism and policing: This school of thought focuses on abolishing prisons and the “Prison Industrial Complex”. On the one hand, reformists seek to perpetuate and expand policing by “reforming” the forces and training them better. On the other, abolitionists move to reduce the scope and harmful impacts of policing at large.

Critical race theory and surveillance: Racism isn’t an outcome of surveillance—rather, it is the precondition that leads to the use of such technologies. Emerging from racial theories developed in the West, this school argues that “Surveillance of blackness has been a norm, long before technology (…) Surveillance technologies, especially used by the institution of policing, are inseparable from this long-existing order that extracts value from and abuses marginalised populations.”

Technology-enabled coercive control: Deliberately using technologies and systems to scare, coerce, harass, or stalk someone.

Abuse: The police’s abuse of its powers is institutional, argue the researchers, adding that racism and the targeting of marginalised groups are a precondition for surveillance. Importantly, such surveillance need not be unlawful; it is nevertheless abusive, unethical, and oppressive, they note.


This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
