By implementing an on-device matching process, Apple wants to detect and flag child sexual abuse material. But can this technology be used by governments for other purposes?

Apple on Thursday announced three new measures coming to its operating systems this fall that aim to limit the spread of Child Sexual Abuse Material (CSAM) and protect children from predators:

- CSAM detection in iCloud Photos: Using advanced cryptographic methods, Apple will detect if a user's iCloud Photos library contains high levels of CSAM content and pass this information on to law enforcement agencies.
- Safety measures in Messages: The Messages app will warn children about sensitive content and allow parents to receive alerts if such content is sent or received.
- Safety measures in Siri and Search: Siri and Search will intervene when users try to search for CSAM-related topics, and will also provide parents and children with expanded information if they encounter unsafe situations.

Why it matters: Last year, it was reported that India leads in the online generation of CSAM. Against that backdrop, Apple's latest measures appear commendable and harmless, but the technology Apple uses to implement them could evolve to serve other, privacy-invasive purposes. There will now be pressure on Android to follow suit, and it opens the door to all kinds of surveillance tools and content removal requests from governments. For example, the Indian government could ask platforms like WhatsApp to use this same technology to proactively remove photos that are critical of it.

How does CSAM detection in iCloud…