Why is the UK’s communications regulator, Ofcom, not happy with age verification on adult sites?

Ofcom says the measures taken by adult sites hosting pornography are inadequate at preventing children's access

Ofcom is concerned that smaller UK-based adult sites do not have robust measures in place to prevent children from accessing pornography, according to a report released by the regulator on October 20, 2022.

The report revealed that these platforms have age verification measures at sign-up, but users can circumvent them by simply self-declaring that they are over 18.

“It’s deeply concerning to see yet more examples of platforms putting profits before child safety. We have put UK adult sites on notice to set out what they will do to prevent children accessing them,” Ofcom’s Chief Executive Melanie Dawes said in a statement.

What is Ofcom: The Office of Communications is the UK’s communications regulator, overseeing broadband, home phone and mobile services, as well as TV and radio. It is an independent organisation funded by fees collected from the companies it regulates.

What do people feel: Ofcom also conducted a survey which found that 81 per cent of respondents do not mind proving their age online in general, with 78 per cent saying that they expect it for certain online activities.

  • It also added that nearly 80 per cent of internet users said that it should be mandatory to verify their age when accessing pornography online, especially on dedicated adult sites.
  • Ofcom said that the sample comprised UK internet users aged 18 or above, and that participants were selected based on whether or not they had accessed pornographic content online at any point in the past.

Ofcom’s warning: The regulator stated that adult sites will need a “clear roadmap to implementing robust age verification measures” in the next year, and will face enforcement action if they fail to meet certain thresholds.

Why it matters: Ofcom claimed that the report is the first of its kind in the world, offering insight into what video-sharing platforms in the UK are doing to protect users. Moreover, it highlights the lacunae that allow kids to access adult sites and what can be done to address them.

You can read the entire report here.

Key highlights from the report

The scope of the report runs from October 2021 to October 2022. Here are some of its findings:

Efficacy of safety measures: The report said that all platforms had safety measures in place, including rules on what kinds of video material are allowed. However, it added that video-sharing platforms (VSPs) provided limited evidence of the efficacy of their measures, making it difficult for Ofcom to ascertain whether these measures work consistently and effectively.

Prioritise risk assessment: Ofcom said that platforms were not prioritising risk assessment processes, which it believes are fundamental to proactively identifying and mitigating risks to user safety. It found that most platforms did not understand what risk assessments are or how to perform them, and added that the task will be a requirement for all regulated services once the UK enforces the Online Safety Bill.

Reducing the risk of child sexual abuse material (CSAM): The report observed that self-generated content is an increasingly significant driver of child abuse images and videos. The adult sites reported that they have some CSAM prevention measures in place, including user rules that prohibit uploading illegal material such as CSAM, with clear sanctions for users who breach them.

Tackling hate and terror: Ofcom said that its survey found that 24 per cent of users said they had come across videos they perceived to be violent, abusive, or inappropriate between August and October.

  • It said that terms and conditions were adequate, but there was some room for improvement in what platforms prohibit in their terms and conditions and how they communicate these rules to users.
  • It added that some terms and conditions were dense and hard to follow, and, in some cases, users never need to open the terms and conditions in order to watch content.

Protections for kids below 18: The report said that users in this age group were significantly more likely than adults to say they had been exposed to content they perceived to be harmful online, including negative body image and eating disorder content, content glamourising unhealthy or abusive lifestyles, and the promotion of self-harm. It said that more work needs to be done on providing age-appropriate experiences to users under 18.

Age verification on adult sites: The regulator said that OnlyFans has adopted an age verification solution, but smaller adult sites had a “long way to go”. It said that the information received from the smaller platforms did not instil confidence in their access control measures to prevent children from accessing pornography.

Reporting and flagging: Ofcom observed that reporting and flagging are in place on all platforms, but their use and application vary, as does how integral they are to platforms’ detection and enforcement processes. Additionally, most large VSPs have some form of external ‘trusted flagger’ programme, in which platforms partner with organisations (such as civil society, government agencies, or other relevant groups) that have specific expertise in online harm. These ‘trusted flaggers’ can flag potentially violating content directly to the platforms, which then prioritise reviewing those reports.

International collaboration: The regulator wrote that international collaboration was “central” to its future ambitions. It said there was value in working with other regulators to highlight best practice, identify common risks, and support platforms, and that it was in touch with regulators in France, Germany and Cyprus over their shared regulatory approaches to age verification.

What were some of the specific findings on smaller adult sites?

Ofcom wrote that a platform informed it that it had considered implementing robust access control measures to prevent users under 18 from accessing pornographic material, but had decided not to, believing that further restrictions would impede adults from accessing the platform and reduce the profitability of the business.

Decision-making structure: The report said that the decision-making processes within these small platforms appear to be relatively informal. It added a caveat that Ofcom does not have enough detail to understand how the platforms assess practicability and proportionality when deciding which protection measures need to be implemented.

What will be the priorities: Ofcom said that it sought to ensure platforms have sufficient processes in place for setting and revising comprehensive user policies that cover all relevant harms. It also said it would:

  • Review the tools platforms provide to users to control their experience and promote greater engagement with these measures,
  • Promote implementation of robust age assurance, to protect children from the most harmful online content (including pornography).

This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.

Written By

I cover several beats such as Crypto, Telecom, and OTT at MediaNama. I can be found loitering at my local theatre when I am off work consuming movies by the dozen.


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
