After facing pressure from privacy advocates over its new child safety features, Apple said on September 3 that it would delay their rollout, but advocates are asking Apple to abandon the plan entirely. The features have drawn heavy criticism because many believe the same technology could be repurposed by governments for censorship, or by parents to undermine children's privacy and freedom of expression.
Nationwide protest planned in the US
Activists from the Electronic Frontier Foundation (EFF), Fight for the Future, and other civil liberties organizations have called for a nationwide protest in the US against Apple’s “plan to install dangerous mass surveillance software.” The protest is scheduled for September 13, a day before Apple’s big iPhone launch event.
“EFF is pleased Apple is now listening to the concerns of customers, researchers, civil liberties organizations, human rights activists, LGBTQ people, youth representatives, and other groups, about the dangers posed by its phone scanning tools. But the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely,” the organisation said.
“There is no safe way to do what Apple is proposing. Creating more vulnerabilities on our devices will never make any of us safer.” – Caitlin Seeley George, campaign director at Fight for the Future
On September 8, privacy advocates submitted nearly 60,000 petitions to Apple. Previously, on August 19, an international coalition of 90+ civil society organizations wrote an open letter to Apple, calling on the company to abandon its plans. “We are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter said.
What were Apple’s plans to combat CSAM?
Apple on August 5 announced new measures coming to its operating systems this month that aim to limit the spread of child sexual abuse material (CSAM) and protect children from predators. Among them is a feature that allows Apple to use advanced cryptographic methods to detect whether a user’s iCloud Photos library contains a high volume of CSAM content, and another in which the iPhone Messages app will warn children about sexually explicit content and allow parents to receive alerts if such content is sent or received.
How does the iCloud CSAM detection feature work? Before an image is stored in iCloud Photos, Apple converts it to a unique hash, and an on-device matching process checks this hash against a database of known CSAM image hashes provided by child safety organisations. The device uploads the match result along with the image to iCloud, but the result remains unreadable to Apple at that point. Only once an account crosses a predefined threshold of known CSAM content can Apple access the matched images; it will then manually review and confirm each match report before disabling the user’s account and notifying law enforcement authorities.
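The match-and-threshold logic described above can be sketched in simplified form. This is a hypothetical illustration only: Apple's actual system uses its NeuralHash perceptual hash and cryptographic techniques so that match results stay unreadable below the threshold, whereas here a plain SHA-256 and an in-memory set stand in for both, and the threshold value is made up.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image's contents.

    Apple's real system uses NeuralHash, which matches visually
    similar images; SHA-256 here only matches exact bytes.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images, known_hashes, threshold):
    """Count on-device matches against the known-hash database.

    The account is flagged for human review only once the number
    of matches crosses the threshold; below it, nothing is acted on.
    """
    match_count = sum(1 for img in images if image_hash(img) in known_hashes)
    return {
        "match_count": match_count,
        "flag_for_review": match_count >= threshold,
    }

# Hypothetical usage: a database of two known hashes, threshold of 2.
known = {image_hash(b"known-bad-1"), image_hash(b"known-bad-2")}
library = [b"vacation-photo", b"known-bad-1", b"cat-photo"]
result = scan_library(library, known, threshold=2)
# One match is below the threshold, so the account is not flagged.
```

The key design point the sketch mirrors is that flagging is gated on an aggregate threshold rather than on any single match, which is how Apple said it would keep false positives from triggering a review.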
To learn more about how the CSAM detection technology works and what some of the concerns and questions are, read here.
Why were Apple’s plans controversial?
Apple’s latest measures may appear well-intentioned and harmless, but the technology underpinning them could evolve to serve other, privacy-invasive purposes. It opens the door to all kinds of surveillance tools and content-removal requests from governments. For example, the Indian government could ask platforms like WhatsApp to use the same technology to proactively remove photos that are critical of it.
“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable,” WhatsApp CEO Will Cathcart tweeted.
As far as the safety measures in Messages go, civil rights organisations argued that the “system Apple has developed assumes that the ‘parent’ and ‘child’ accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk.”
To read more about what industry leaders, technical experts, and civil society have said, read here.
What has Apple said so far?
A few days after the new features were announced, Apple defended its plan and released a supporting document addressing some of the concerns, including a pledge to refuse demands from governments to use its CSAM detection system for non-CSAM images. A couple of days later, on August 13, Apple software chief Craig Federighi, in an interview with The Wall Street Journal, once again defended the company’s plans but said that the initial announcement contained two different tools that were “jumbled up” and hence caused confusion.
On September 3, when announcing that it is putting its plan on hold, Apple said that it has “decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
- How WhatsApp Deals With Child Sexual Abuse Material Without Breaking End To End Encryption
- Instagram Announces Three New Safety Measures For Young Users, Including Limiting Advertisers’ Reach
- India Leads In Generation Of Online Child Sexual Abuse Material