Apple says it will not allow governments to use its CSAM detection system for other images, but the assurance doesn’t go far enough

While Apple has said that it has previously refused government demands that degrade the privacy of users, the company’s track record in China suggests otherwise.

Apple will refuse demands from governments to use its child sexual abuse material (CSAM) detection system for non-CSAM images, the company said in a supporting document released this week.

Why it matters: Last Thursday, Apple announced a controversial plan to proactively scan iPhone users’ photos uploaded to iCloud for known CSAM and alert law enforcement agencies if a user’s iCloud Photos library contains high levels of CSAM content. Apple’s plans came under heavy criticism from privacy advocates, many of whom argued that governments could ask Apple to use this same technology to censor other kinds of content, including suppressing voices critical of the government. Apple’s response that it will not allow this to happen offers some assurance but doesn’t go far enough to guarantee it.

“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.” — WhatsApp CEO Will Cathcart

To learn more about how the CSAM detection technology works and what some of the common concerns and questions are, read here. To read about what industry leaders, technical experts, and civil society have said, read here.

What exactly did Apple say?

Could governments force Apple to add non-CSAM images to the hash list?

“Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC [National Center for Missing and Exploited Children] and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.” — Apple

Can non-CSAM images be “injected” into the system to flag accounts for things other than CSAM?

“Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design.” — Apple
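
To make that design point concrete, here is a minimal sketch, in Python, of matching uploads against a fixed hash list with a reporting threshold. This is not Apple’s implementation: Apple’s published design uses a perceptual hash (NeuralHash) together with private set intersection and threshold secret sharing, and all the names, hash values, and the threshold below are illustrative assumptions, not taken from Apple.

import hashlib
from typing import Iterable, Set

# Hypothetical stand-in for the fixed hash list that ships identically
# inside every copy of the operating system. In Apple's description the
# entries come from NCMEC and other child-safety organizations; these
# placeholder values are fake.
KNOWN_HASHES: Set[str] = {
    hashlib.sha256(b"placeholder-known-image-1").hexdigest(),
    hashlib.sha256(b"placeholder-known-image-2").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as Apple's NeuralHash.

    A real perceptual hash maps visually similar images to the same
    digest even after resizing or recompression; SHA-256 does not, and
    is used here only so the sketch is runnable.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(uploads: Iterable[bytes], known: Set[str]) -> int:
    """Count uploads whose hash appears in the fixed, device-wide list.

    Because every device carries the same list and clients cannot add
    to it, a per-user "injected" entry has nowhere to live -- the
    design property Apple's statement relies on.
    """
    return sum(1 for image in uploads if image_hash(image) in known)

def should_escalate_for_review(uploads: Iterable[bytes],
                               known: Set[str],
                               threshold: int = 30) -> bool:
    """Flag an account for human review only past a match threshold.

    The threshold value here is illustrative. Per Apple, a human
    reviews flagged matches before any report is made to NCMEC.
    """
    return count_matches(uploads, known) >= threshold

Apple’s actual protocol is stricter than this sketch suggests: with private set intersection the device does not learn which photos matched, and with threshold secret sharing Apple can decrypt the matching results only after the threshold is crossed, at which point human review happens before any report to NCMEC.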

Does this address the concerns?

Apple’s statements in an official document do offer some assurance that the technology will not be misused, but they do not go far enough, for one simple reason: the company has previously compromised on privacy to appease governments and law enforcement agencies.

Apple’s compromises in China: Despite portraying itself as a champion of privacy, Apple has risked its customers’ data and aided government censorship in China in a bid to appease the authorities there. Some of the compromises include:

  • Storing customer data on Chinese government servers
  • Using different encryption technology for customer data
  • Sharing customer data with the Chinese government
  • Proactively removing apps that might offend Chinese authorities
  • Approving almost all of the Chinese government’s app-takedown requests

Apple dropped plan to encrypt iCloud backups: In January 2020, Reuters reported that Apple had dropped plans to fully encrypt iCloud backups after the FBI complained that the move would harm investigations. “It shows how much Apple has been willing to help U.S. law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers’ information,” the report stated.

What if governments mandate it through law? Many of the compromises Apple has made in China stem partly from laws in that country, and other governments can legislate similar obligations. India’s new Information Technology Rules, 2021, for example, require platforms to develop tools to proactively remove content that the government deems illegal, which could merely be content critical of the government. Platforms have maintained that this will harm privacy and free speech, but if Apple can implement proactive scanning for CSAM, the Indian government can demand that the same technology be modified to accommodate its requests. The government can also ask other platforms, such as WhatsApp, to implement similar technology to find images and videos it deems illegal.
