What’s the news: Australia’s newly established eSafety Commissioner on August 30 issued legal notices to Apple, Meta (and WhatsApp), Microsoft (and Skype), Snap and Omegle, asking them to explain the measures they are taking to tackle the proliferation of child sexual abuse material (CSAM) on their platforms and services.
“Some of the most harmful material online today involves the sexual exploitation of children and, frighteningly, this activity is no longer confined to hidden corners of the dark web but is prevalent on the mainstream platforms we and our children use every day. As more companies move towards encrypted messaging services and deploy features like livestreaming, the fear is that this horrific material will spread unchecked on these platforms. Child sexual exploitation material that is reported now is just the tip of the iceberg – online child sexual abuse that isn’t being detected and remediated continues to be a huge concern.” — eSafety Commissioner Julie Inman Grant
Why does this matter: There is universal consensus that CSAM must be kept off platforms, but there is no consensus on how companies should go about doing so, in part because of privacy and wrongful-identification concerns. It will be interesting to see the different approaches the various tech platforms are currently taking and whether they are doing enough.
Why now: “We have seen a surge in reports about this horrific material since the start of the pandemic, as technology was weaponised to abuse children. The harm experienced by victim-survivors is perpetuated when platforms and services fail to detect and remove the content,” the commissioner said. “We know there are proven tools available to stop this horrific material being identified and recirculated, but many tech companies publish insufficient information about where or how these tools operate, and too often claim that certain safety measures are not technically feasible. Industry must be upfront on the steps they are taking, so that we can get the full picture of online harms occurring and collectively focus on the real challenges before all of us,” the commissioner added. The notices have been sent under the Australian Government’s new Basic Online Safety Expectations, which took effect in January this year.
Some notable platforms missing: Interestingly, some major platforms, such as Google (YouTube), Twitter, and TikTok, are missing from the list of companies that have been sent notices. The regulator, however, noted that it will issue notices to more platforms in due course.
How big is the problem of CSAM: “The spread of child sexual exploitation material online is a global scourge; last year 29.1 million reports were made to the National Center for Missing and Exploited Children,” eSafety noted. “Key safety risks for child sexual exploitation and abuse include the ability for adults to contact children on a platform, as well as features such as livestreaming, anonymity, and end-to-end encryption. This has lent itself to a range of proliferating harms against children including online grooming, sexual extortion and coerced, self-produced child sexual exploitation material,” the regulator added.
Penalty for non-response: Companies that do not respond to the notices within 28 days could face financial penalties of up to $555,000 a day.
This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
Also Read
- Google Disables Two Accounts For Wrongly Assumed Child Sexual Abuse Photos: What This Means
- Why EU’s New Rules To Combat Child Sexual Abuse Is Facing Strong Criticism
- Apple Says It Will Not Allow Governments To Use Its CSAM Detection System For Other Images, But Assurance Doesn’t Go Far Enough
- India Leads In Generation Of Online Child Sexual Abuse Material
