UK’s communications regulator Ofcom has opened an investigation into TikTok for providing the authority with inaccurate information about its parental control system. On July 6, 2023, Ofcom sent TikTok an information request notice asking the company about the steps it is taking to prevent children from encountering harmful content; TikTok’s response contained inaccurate information about its parental controls. According to a report by the Guardian, TikTok said the inaccuracy was the result of a technical issue, and that it identified the error and informed Ofcom of it. Ofcom said it will publish an update on the investigation by February 2024.
Under the UK’s video-sharing platform (VSP) regime, UK-based video-sharing platforms are required to put measures in place to protect children from encountering videos that “may impair their physical, mental, or moral development.” Further, under the UK’s Communications Act 2003, video-sharing platforms like TikTok must comply with Ofcom’s requests for information. Similar notices have also been sent to Snap and Twitch.
How are platforms protecting children?
Age verification: Ofcom’s report notes that all three platforms are intended for users aged thirteen and above, but allow users to self-declare their age without any verification, making them susceptible to use by children under thirteen. It also found that while TikTok and Snap could not be used without signing in to the service, Twitch allowed open access. This unrestricted access could expose children to content that has otherwise been labelled as mature (meant for those over 18).
Identifying underage account holders: While TikTok uses keywords to identify which accounts could potentially be underage, it did not reveal any further details, citing confidentiality. Snap assesses users’ ages using user reports and signals, including an account holder’s Snapchat usage and friends’ ages. Twitch uses an AI tool (Spirit AI) and analyzes suspicious data traffic to detect underage users. It also relies on human moderators to identify underage account holders.
Parental controls: The regulator also reported that while TikTok and Snap had parental control measures in place, Twitch’s terms and conditions instead expect parents to supervise children while they use the service. Twitch says it received no requests from parents to remove unsupervised child accounts in the 12 months from August 2022 to August 2023.
Notably, Twitch recently changed its terms and conditions to allow users to post adult content on its service. While the company says it will not display content labelled as containing sexual themes on users’ homepages, allowing such content may require adding age-gating measures so that the service stays compliant with the video-sharing platform (VSP) regime and the Online Safety Act, which requires that children be protected from harmful social media content.
Also read:
- Snapchat’s ‘My AI’ Chatbot In Hot Water With UK’s Information And Communication Officer Over Privacy Concerns
- UK’s Communications Regulator Releases Consultation On Age-Gating Children’s Access To Online Pornography
- CEOs Of X, Discord And Snap To Testify In A Hearing About Online Child Sexual Exploitation