Roundup: Industry leaders, civil society, and technical experts react to Apple’s CSAM filter plans

While child rights groups have welcomed Apple’s announcement, WhatsApp’s CEO has called it the wrong approach.

Last Thursday, Apple announced a controversial plan to proactively scan all iPhone users’ backed-up images for known child sexual abuse material (CSAM). The move is significant for the iPhone maker, which frequently touts its phones’ privacy as a selling point and has made end-to-end encryption integral to services like iMessage and FaceTime, making it impossible for anyone, including Apple, to intercept messages in transit. Apple also announced steps like notifying parents if minors view or send what its systems detect as sensitive content, and warning users who search for CSAM that interest in the subject is harmful.

With this scanner, Apple will essentially match hashes of users’ iCloud photos and videos against hashes of known CSAM, and will report an account to the authorities once the number of matches crosses a certain threshold. While nothing has been announced about this software being deployed in India, Apple’s reports in the United States will go to the National Center for Missing and Exploited Children (NCMEC), a children’s safety advocacy group that works with law enforcement agencies there.
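To illustrate the general idea, here is a minimal, hypothetical sketch of threshold-based hash matching: each photo’s hash is checked against a list of known-bad hashes, and an account is flagged only once the number of matches crosses a threshold. This is not Apple’s implementation; the real system uses the NeuralHash perceptual hash and cryptographic protocols (private set intersection, which Pinkas refers to further below, and threshold secret sharing) so that match results stay hidden below the threshold, and every name and value in the sketch is made up.

```python
import hashlib
from pathlib import Path

# Hypothetical, simplified sketch only. Apple's pipeline uses the NeuralHash
# perceptual hash and cryptographic blinding, not plain SHA-256; SHA-256 is
# used here purely to keep the example self-contained.

KNOWN_CSAM_HASHES: set[str] = set()  # supplied by NCMEC in the real system
REPORT_THRESHOLD = 30  # hypothetical value; Apple has only spoken of "a certain threshold"

def image_hash(path: Path) -> str:
    """Hash an image file. A real system would use a perceptual hash so that
    re-encoded or slightly edited copies of an image still match."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photo_paths: list[Path]) -> int:
    """Count how many of a user's photos match the known-bad hash list."""
    return sum(1 for p in photo_paths if image_hash(p) in KNOWN_CSAM_HASHES)

def should_flag_for_review(photo_paths: list[Path]) -> bool:
    """Only once the match count crosses the threshold is the account
    surfaced for human review and, if confirmed, reported."""
    return count_matches(photo_paths) >= REPORT_THRESHOLD
```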

How the tech industry is reacting

Will Cathcart, CEO, WhatsApp: The Facebook-owned messaging app’s CEO immediately expressed concern at Apple’s plans. “I think this is the wrong approach and a setback for people’s privacy all over the world,” Cathcart tweeted on Monday. “We reported more than 400,000 cases to NCMEC last year from WhatsApp, all without breaking encryption,” he added. (Read our coverage on how WhatsApp reports CSAM in India through NCMEC here.) He said Apple’s approach brings “something very concerning” into the world, noting that computing devices have existed for decades without such mandates, and called the move a form of surveillance.

Tim Sweeney, CEO, Epic Games: Sweeney is locked in lawsuits with Apple and Google seeking to reduce commissions on in-app purchases for Epic Games’ highly popular Fortnite franchise. “Apple’s dark patterns that turn iCloud uploads on by default, and flip it back on when moving to a new phone or switching accounts, exacerbate the problem. Further, in many contexts Apple has forced people to accumulate unwanted data, as with mandatory iCloud email accounts,” Sweeney argued in a thread in which he called the scanner “government spyware”. “The existential threat here is an unholy alliance between government [and] the monopolies who control online discourse and everyone’s devices, using the guise of private corporations to circumvent constitutional protections,” Sweeney said, framing the scanner as a threat to democratic freedoms.

How child rights groups are reacting

Marita Rodriguez, Executive Director, Strategic Partnerships, NCMEC: In a letter circulated to Apple’s employees and obtained by 9to5Mac, Rodriguez welcomed the scanner. “[E]veryone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection,” Rodriguez wrote to Apple. “We know that the days to come will be filled with the screeching voices of the minority. Our voices will be louder,” she added. “Thank you for finding a path forward for child protection while preserving privacy,” Rodriguez said.

Joanna Shields, Founder, WeProtect Global Alliance: “Apple’s new child safety features are proof that detecting CSAM and protecting users’ privacy can be compatible if we want them to be. A great example from one of the world’s major global technology leaders – I hope many others will now follow,” Shields tweeted. WeProtect is a public-private multistakeholder body whose mission is to “break down complex problems and develop policies and solutions to protect children from sexual abuse online.”


Ashton Kutcher, Co-Founder, Thorn (Digital Defenders of Children): The actor tweeted: “I believe in privacy – including for kids whose sexual abuse is documented and spread online without consent. These efforts announced by Apple are a major step forward in the fight to eliminate CSAM from the internet.”

How digital rights groups are reacting

Electronic Frontier Foundation: The US-based digital rights nonprofit said in a blog post written by Director of Federal Affairs India McKinney and Senior Staff Technologist Erica Portnoy that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” arguing that “All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts.”

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire. — Electronic Frontier Foundation

Greg Nojeim, Co-Director, Security & Surveillance Project, Center for Democracy & Technology: In a press release on CDT’s website, Nojeim said that “Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world.” He pointed out, “In particular, LGBTQ youth and children in abusive homes are especially vulnerable to injury and reprisals, including from their parents or guardians, and may inadvertently expose sensitive information about themselves or their friends to adults, with disastrous consequences.”

How experts are reacting

Alex Stamos, Adjunct Professor, Center for International Security and Cooperation, Stanford University: The computer scientist Stamos said in a Twitter thread that “I am both happy to see Apple finally take some responsibility for the impacts of their massive communication platform, and frustrated with the way they went about it. They both moved the ball forward technically while hurting the overall effort to find policy balance.” He added that “I have friends at both the EFF and NCMEC, and I am disappointed with both NGOs at the moment. Their public/leaked statements [see above] leave very little room for conversation, and Apple’s public move has pushed them to advocate for their equities to the extreme,” chiding the phone maker for not consulting other stakeholders before making this decision:

“I also don’t understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups. A reasonable target should be scanning shared iCloud albums, which could be implemented server-side.

In any case, coming out of the gate with non-consensual scanning of local photos, and creating client-side ML that won’t provide a lot of real harm prevention, means that Apple might have just poisoned the well against any use of client-side classifiers to protect users.” — Alex Stamos

Technical analyses*: In a series of technical assessments, Benny Pinkas, Deputy Director and Head of Scientific Committee, Department of Computer Science, Bar-Ilan University; David Forsyth, Fulton Watson Copp Chair in Computer Science at the University of Illinois Urbana-Champaign; and Mihir Bellare, Professor, Department of Computer Science and Engineering, University of California San Diego, all opined that Apple’s system was sound in terms of security. Pinkas wrote, “I believe that the Apple PSI [Private Set Intersection] system provides an excellent balance between privacy and utility, and will be extremely helpful in identifying CSAM content while maintaining a high level of user privacy and keeping false positives to a minimum.”


Forsyth wrote, “It is highly unlikely that harmless users will be inconvenienced or lose privacy because the false positive rate is low, and multiple matches are required to expose visual derivatives to Apple. Apple will review these potential reports and notify NCMEC if appropriate. Even if there is a false alert, this review will ensure that harmless users are not exposed to law enforcement actions.”

Bellare wrote, “Apple has found a way to detect and report CSAM offenders while respecting […] privacy constraints. When the number of user photos that are in the CSAM database exceeds the threshold, the system is able to detect and report this. Yet a user photo that is not in the CSAM database remains invisible to the system, and users do not learn the contents of the CSAM database.”
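Forsyth’s and Bellare’s points about the threshold can be sanity-checked with a back-of-the-envelope calculation. The sketch below estimates the probability that an account containing no CSAM at all crosses a hypothetical reporting threshold purely through false matches, assuming, purely for illustration, independent per-photo false-match probabilities; the figures used are hypothetical and are not Apple’s published numbers.

```python
def prob_false_report(n_photos: int, p_false: float, threshold: int) -> float:
    """P(at least `threshold` false matches) under a Binomial(n_photos, p_false)
    model, computed with the pmf recurrence to avoid huge intermediate integers.
    Independence of per-photo false matches is a simplifying assumption."""
    pmf = (1.0 - p_false) ** n_photos  # probability of exactly 0 matches
    tail = 0.0
    for k in range(n_photos + 1):
        if k >= threshold:
            tail += pmf
        if k < n_photos:
            # pmf(k+1) = pmf(k) * (n - k) / (k + 1) * p / (1 - p)
            pmf *= (n_photos - k) / (k + 1) * p_false / (1.0 - p_false)
    return tail

# Hypothetical figures: a library of 10,000 photos, a one-in-a-million
# per-photo false-match rate, and a threshold of 30 matches.
print(prob_false_report(10_000, 1e-6, 30))  # astronomically small, on the order of 1e-93
```

Under these assumptions an innocent account essentially never reaches the threshold, which is the intuition behind Forsyth’s “highly unlikely” remark; the caveat is that the per-photo false-match rate is an input, not something this arithmetic can establish.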

* Apple facilitated these technical assessments and has made copies available on its website.
