While Apple has said that it has previously refused government demands that degrade the privacy of users, the company's track record in China indicates the contrary.

Apple will refuse demands from governments to use its child sexual abuse material (CSAM) detection system for non-CSAM images, the company said in a supporting document released this week.

Why it matters?

Last Thursday, Apple announced a controversial plan to proactively scan iPhone users' photos uploaded to iCloud for known CSAM and alert law enforcement agencies if a user's iCloud Photos library contains high levels of CSAM content. The plan came under heavy criticism from privacy advocates, with many arguing that governments could ask Apple to use the same technology to censor other kinds of content, including suppressing voices critical of the government. Apple's response that it will not allow this to happen offers some assurance but doesn't go far enough to guarantee it.

"This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable." — WhatsApp CEO Will Cathcart

To learn more about how the CSAM detection technology works and what some of the common concerns and questions are, read here. To read about what industry leaders, technical experts, and civil society have said, read here.

What exactly did Apple say?

Could governments force Apple to add non-CSAM images to the…
