Ofcom is concerned that smaller UK-based adult sites do not have robust measures in place to prevent children accessing pornography, according to a report released by the regulator on October 20, 2022.
The report revealed that these platforms have age verification measures in place at sign-up, but users can circumvent them by simply self-declaring that they are over 18.
“It’s deeply concerning to see yet more examples of platforms putting profits before child safety. We have put UK adult sites on notice to set out what they will do to prevent children accessing them,” Ofcom’s Chief Executive Melanie Dawes said in a statement.
What is Ofcom: The Office of Communications is the UK’s communications regulator, overseeing broadband, home phone and mobile services, as well as TV and radio. It is an independent organisation funded by fees collected from the companies it regulates.
What do people feel: Ofcom also conducted a survey which found that 81 per cent of respondents do not mind proving their age online in general, with 78 per cent saying that they expect it for certain online activities.
- The survey also found that nearly 80 per cent of internet users said it should be mandatory to verify their age when accessing pornography online, especially on dedicated adult sites.
- Ofcom said that the survey covered a sample of UK internet users aged 18 or above, and that participants were selected based on whether they had accessed pornographic content online at any point in the past.
Ofcom’s warning: The regulator stated that adult sites will need a “clear roadmap to implementing robust age verification measures” within the next year, and will face enforcement action if they fail to meet certain thresholds.
Why it matters: Ofcom claimed that the report is the first of its kind in the world, offering insight into what video-sharing platforms (VSPs) in the UK are doing to protect users. It also highlights the gaps that allow children to access adult sites and what can be done to address them.
You can read the entire report here.
Key highlights from the report
The report covers the period from October 2021 to October 2022. Here are some of its findings:
Efficacy of safety measures: The report said that all platforms had safety measures in place, including rules on what kinds of video material are allowed. However, it added that VSPs provided limited evidence on the efficacy of these measures, making it difficult for Ofcom to ascertain whether they are working consistently and effectively.
Prioritise risk assessment: Ofcom said that platforms were not prioritising risk assessment processes, even though the regulator considers them fundamental to proactively identifying and mitigating risks to user safety. It found that most platforms did not understand what risk assessments are or how to perform them, and noted that risk assessments will become a requirement for all regulated services once the UK enforces the Online Safety Bill.
Reducing the risk of child sexual abuse material (CSAM): The report observed that self-generated content is an increasingly significant driver of child abuse images and videos. Adult sites reported having some CSAM prevention measures in place, including user rules that prohibit uploading illegal material such as CSAM, with clear sanctions for users who breach them.
Tackling hate and terror: Ofcom said that its survey found that 24 per cent of users said they had come across videos they perceived to be violent, abusive, or inappropriate between August and October.
- It said that terms and conditions were adequate, but there was room for improvement in what platforms prohibit and in how they communicate these rules to users.
- It added that some terms and conditions were dense and hard to follow, and that, in some cases, users never need to open them in order to watch content.
Protections for users under 18: The report said that users in this age group were significantly more likely than adults to say they had been exposed to content they perceived to be harmful online, including negative body image and eating disorder content, content glamourising unhealthy or abusive lifestyles, and the promotion of self-harm. It said that more work needs to be done to provide age-appropriate experiences to users under 18.
Age verification on adult sites: The regulator said that OnlyFans has adopted an age verification solution, but smaller adult sites had a “long way to go”. It said that the information received from these smaller platforms did not instil confidence in their access control measures to prevent children accessing pornography.
Reporting and flagging: Ofcom observed that reporting and flagging mechanisms are in place on all platforms, but their use and application varies, as does how integral they are to platforms’ detection and enforcement processes. Additionally, most large VSPs run some form of external ‘trusted flagger’ programme, partnering with organisations (such as civil society groups, government agencies, or other relevant bodies) that have specific expertise in online harm. These trusted flaggers can flag potentially violating content directly to the platforms, which then prioritise reviewing those reports.
International collaboration: The regulator wrote that international collaboration was “central” to its future ambitions, saying there was value in working with other regulators to highlight best practice, identify common risks, and support platforms. It added that Ofcom was in touch with regulators in France, Germany, and Cyprus over their shared regulatory approaches to age verification.
What were some of the specific findings on smaller adult sites?
Ofcom wrote that one platform informed it that it had considered implementing robust access control measures to prevent users under 18 from accessing pornographic material, but decided not to, believing that further restrictions would impede adults from accessing the platform and reduce the profitability of the business.
Decision-making structure: The report said that decision-making processes within these small platforms appear to be relatively informal. It added the caveat that Ofcom does not have enough detail to understand how the platforms assess practicability and proportionality when deciding which protection measures to implement.
What will be the priorities: Ofcom said that it will seek to ensure platforms have sufficient processes in place for setting and revising comprehensive user policies that cover all relevant harms. It added that it will also:
- Review the tools platforms provide to users to control their experience, and promote greater engagement with these measures
- Promote implementation of robust age assurance to protect children from the most harmful online content (including pornography).
This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
Also read:
- Meta agrees to sell Giphy to comply with UK antitrust order
- UK’s Ofcom to probe cloud services, personal communication apps, and smart devices
- Why the UK competition watchdog wants to investigate Microsoft’s Activision deal
- UK investigation: Should government officials use private communication apps like WhatsApp, Gmail?
