Australia’s online safety regulator, the eSafety Commissioner, has fined X (formerly Twitter) $385,000 (610,500 Australian dollars) for failing to address questions about how it tackles child sexual exploitation content on its platform. In February this year, the regulator issued legal notices to X, Google, TikTok, Twitch, and Discord under Australia’s Online Safety Act, which requires companies to answer questions about the measures they have in place to deal with the proliferation of child sexual exploitation content on their services.
The eSafety Commissioner said that of the companies sent notices, X and Google failed to adequately address the regulator’s questions. Google was issued a formal warning for giving generic answers and aggregated information when asked about specific services; X’s non-compliance was found to be more serious. The eSafety Commissioner said that X left certain sections of its questionnaire completely blank and, in other instances, gave incomplete or inaccurate answers. X has 28 days to request the withdrawal of the infringement notice or to pay the penalty.
Key findings from questionnaires sent to other platforms:
- YouTube, Twitch, and TikTok are attempting to detect child sexual exploitation in live streams, but Discord is not.
- The three platforms are also using technology to detect grooming (actions or behaviors used to establish an emotional connection with a minor), while X, Discord, and other Google services (Meet, Chat, Gmail, Messages) are not using any such technology.
- In the three months after X’s change in ownership in October 2022, the proactive detection of child sexual exploitation material fell from 90% to 75%. The company says that the detection rate has improved in 2023.
Global efforts to curb child sexual abuse content:
Regulators around the world have become more vigilant about curbing child sexual abuse material (CSAM) on online platforms. In India, for instance, the Ministry of Electronics and Information Technology (MeitY) issued notices to X, YouTube, and Telegram to remove CSAM from their platforms. India’s Union Minister of State for Electronics and Information Technology, Rajeev Chandrasekhar, has warned that if platforms fail to act swiftly, their safe harbor under Section 79 of the IT Act will be withdrawn and consequences under Indian law will follow. In the UK, platforms have been asked to develop and deploy software that scans phones for prohibited content. While CSAM is indeed a serious issue, these two approaches to curbing its spread can have grave implications for freedom of speech and the right to privacy, respectively.
Australia, however, has taken a different approach: it is holding platforms accountable by issuing fines, as it has done in the case of X. If X fails to pay the fine, the eSafety Commissioner can take the company to the federal court, which can fine it up to $780,000 a day, backdated to March this year. Speaking to the Guardian, eSafety Commissioner Julie Inman Grant said that the authority has periodic notice powers and wants to keep pressure on platforms to ensure they improve safety standards. How effective Australia’s approach will prove, compared to those taken elsewhere, remains to be seen.