
Privacy advocates want Apple to abandon its new child safety features entirely, not just delay them

Apple has said that it will delay its implementation of measures to address child sexual abuse material (CSAM), but advocates are asking the company to abandon the plan entirely.

After facing pressure from privacy advocates over its new child safety features, Apple on September 3 said that it would delay their implementation, but advocates are asking the company to abandon the plan entirely. These features have come under heavy criticism because many believe that the same technologies could be used by governments for censorship, or by parents to undermine children’s privacy and freedom of expression.

Nationwide protest planned in the US

Activists from the Electronic Frontier Foundation (EFF), Fight for the Future, and other civil liberties organizations have called for a nationwide protest in the US against Apple’s “plan to install dangerous mass surveillance software.” The protest is scheduled for September 13, a day before Apple’s big iPhone launch event.

“EFF is pleased Apple is now listening to the concerns of customers, researchers, civil liberties organizations, human rights activists, LGBTQ people, youth representatives, and other groups, about the dangers posed by its phone scanning tools. But the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely,” the organisation said.

“There is no safe way to do what Apple is proposing. Creating more vulnerabilities on our devices will never make any of us safer,” said Caitlin Seeley George, campaign director at Fight for the Future.

On September 8, privacy advocates submitted nearly 60,000 petitions to Apple. Previously, on August 19, an international coalition of 90+ civil society organizations wrote an open letter to Apple, calling on the company to abandon its plans. “We are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter said.

What were Apple’s plans to combat CSAM?

Apple on August 5 announced new measures coming to its operating systems this month that aim to limit the spread of child sexual abuse material (CSAM) and protect children from predators. Among them is a feature that uses advanced cryptographic methods to detect whether a user’s iCloud Photos library contains high levels of CSAM content, and another through which the iPhone Messages app warns children about sexually explicit content and allows parents to receive alerts if such content is sent or received.
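To make the Messages feature concrete, here is a minimal sketch of the decision flow the article describes. The classifier, function names, and opt-in rule below are hypothetical stand-ins for illustration, not Apple’s actual implementation or API.

```python
# Hypothetical sketch of the Messages child-safety flow described above.
# The classifier and the parental-alert rule are illustrative assumptions.

def looks_sexually_explicit(image_bytes: bytes) -> bool:
    """Placeholder for Apple's on-device image classifier; always returns False here."""
    return False

def handle_incoming_image(image_bytes: bytes, child_opted_to_view: bool,
                          parental_alerts_enabled: bool) -> list[str]:
    """Return the actions a child's device would take for an incoming image."""
    actions = []
    if looks_sexually_explicit(image_bytes):
        actions.append("blur image and warn the child")
        # An alert goes to the parent account only if the family has enabled it.
        if child_opted_to_view and parental_alerts_enabled:
            actions.append("notify the parent account")
    return actions
```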


How does the iCloud CSAM detection feature work? Before an image is stored in iCloud Photos, the device converts it into a unique hash, and an on-device matching process checks this hash against a database of known CSAM image hashes provided by child safety organisations. The device uploads the match result along with the image to iCloud, but Apple cannot read this information yet. Only once an account crosses a predefined threshold of known CSAM content can Apple access the matched images; it then manually reviews and confirms each match before disabling the user’s account and notifying law enforcement authorities.
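For readers who want a concrete picture of the threshold-matching idea, here is a minimal sketch under simplifying assumptions: it uses a plain hash lookup and a simple counter, whereas Apple’s system is reported to use a perceptual “NeuralHash” and cryptographic threshold secret sharing so that match results stay unreadable until the threshold is crossed. The names, hash function, and threshold value below are illustrative stand-ins.

```python
import hashlib

# Illustrative stand-ins: the real system reportedly uses a perceptual hash
# ("NeuralHash") and threshold secret sharing, not a plain set lookup and counter.
KNOWN_CSAM_HASHES: set[str] = set()   # hash database supplied by child safety organisations
MATCH_THRESHOLD = 30                  # assumed threshold, for illustration only

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for the on-device hash computed before upload to iCloud Photos."""
    return hashlib.sha256(image_bytes).hexdigest()

def upload_photo(image_bytes: bytes, account_match_results: list[bool]) -> None:
    """Attach a match result to the upload; in the real design Apple cannot read it yet."""
    account_match_results.append(image_hash(image_bytes) in KNOWN_CSAM_HASHES)

def ready_for_human_review(account_match_results: list[bool]) -> bool:
    """Only after the account crosses the threshold can the matched images be reviewed."""
    return sum(account_match_results) >= MATCH_THRESHOLD
```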

To learn more about how the CSAM detection technology works and what some of the concerns and questions are, read here.

Why were Apple’s plans controversial?

Apple’s latest measures may appear laudable and harmless, but the technology used to implement them could evolve to serve other privacy-invasive purposes. It opens the door to all kinds of surveillance tools and content removal requests from governments. For example, the Indian government could use the same technology to ask platforms like WhatsApp to proactively remove photos that are critical of it.

“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable,” WhatsApp CEO Will Cathcart tweeted.

As far as the safety measures in Messages go, civil rights organisations argued that the “system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk.”

To read more about what industry leaders, technical experts, and civil society have said, read here.


What has Apple said so far?

A few days after the new features were announced, Apple defended its plan and released a supporting document to address some of the concerns, saying among other things that it will refuse demands from governments to use its CSAM detection system for non-CSAM images. A couple of days after this, on August 13, Apple software chief Craig Federighi, in an interview with The Wall Street Journal, once again defended the company’s plans but said that the initial announcement contained two different tools that were “jumbled up”, which caused confusion.

On September 3, when announcing that it was putting its plans on hold, Apple said that it has “decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
