What's the news: "Some of the world’s biggest technology companies are not doing enough to tackle child sexual exploitation on their platforms," Australia’s eSafety Commissioner said in a press release dated December 15. The Commissioner pointed to "inadequate and inconsistent use of technology to detect child abuse material and grooming, and slow response times when this material is flagged by users." Earlier, in August, the Commissioner had issued legal notices under Australia’s new Online Safety Act to Apple, Meta (and WhatsApp), Microsoft (and Skype), Snap and Omegle, asking them to explain what measures they were taking to tackle the proliferation of child sexual abuse material (CSAM) on their platforms and services. Based on the responses submitted by the companies, the Commissioner has now published a report examining the various tools and processes each of them uses and how effective those are.

Why does this matter: There is universal consensus that CSAM is a serious problem and must be kept off platforms, but there is no consensus on how companies should go about ensuring this, given concerns around privacy and wrongful identification, among others. The report by the eSafety Commissioner sheds light on what different platforms are (or are not) doing.

"eSafety received responses from all the providers, giving valuable insights that were not previously available through voluntary initiatives, including providers’ own transparency reports. The responses showed a wide variation in the steps taken to address child sexual exploitation and abuse." — eSafety Commissioner

What does the report contain: The report outlines the various tools used by platforms…
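For context, the main class of technology referenced in reports like this for detecting known CSAM is hash-matching: an uploaded image is hashed and compared against a database of hashes of previously verified abuse material maintained by clearinghouses such as NCMEC. Below is a minimal Python sketch of the idea; the hash list and function names are hypothetical, and real deployments use perceptual hashes (e.g. Microsoft's proprietary PhotoDNA, or Meta's open-source PDQ) rather than the exact cryptographic hash used here, so that re-encoded or resized copies of an image still match.

```python
import hashlib

# Hypothetical hash list of known abuse imagery, as would be supplied by a
# clearinghouse. Production systems use perceptual hashes (PhotoDNA, PDQ);
# SHA-256 is used here only to keep the sketch self-contained.
# This entry is simply sha256(b"test"), so the demo below fires.
KNOWN_HASHES: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_material(upload: bytes) -> bool:
    """Return True if the upload byte-for-byte matches a known hash.

    Exact hashing only catches identical copies; perceptual hashing is
    what lets platforms catch altered copies of the same image.
    """
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

# Scanning an upload at ingestion time: a match is typically queued for
# human review and mandatory reporting rather than acted on automatically.
if is_known_material(b"test"):
    print("match found: escalate for review and reporting")
```

The wrongful-identification concerns mentioned above arise precisely because perceptual hashes are inexact by design, which is why matches generally trigger human review rather than automatic enforcement.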
