The UK government wants to make Ofcom, the communications regulator that is akin to India’s TRAI, the regulator of online harms on the internet. UK Digital Secretary Nicky Morgan and Home Secretary Priti Patel announced this on February 12. Given its “experience of overseeing the broadcasting and telecoms sectors”, the British government said that “it has the expertise and independence needed to take on the challenge of regulating online harms”.
To that end, Ofcom will get new powers through legislation, but “decisions on processes and procedures will be taken by Ofcom”, the Department for Digital, Culture, Media and Sport (DCMS) said. It will, of course, need to protect users’ rights online, including “safeguarding free speech, defending the role of the press, promoting tech innovation and ensuring businesses do not face disproportionate burdens”.
The government also published its initial response to the public consultation on the Online Harms White Paper, which ran from April 8 to July 1, 2019 and received over 2,400 responses. The full response, to be released in Spring 2020, will set out more details about Ofcom’s enforcement powers.
Who will be affected?
Only companies that allow sharing of user-generated content, including comments, forums or video sharing. DCMS says that fewer than 5% of UK businesses will be affected. The legislation will be introduced proportionately to minimise the regulatory burden on small businesses, with proportionality determined by evidence of risk of harm and technical feasibility.
“Most small businesses where there is a lower risk of harm occurring will not have to make disproportionately burdensome changes to their service to be compliant with the proposed regulation.” — UK Government’s initial response to public consultation
Who will not be affected?
- B2B services which “pose low risk to the general public” and provide “virtual infrastructure to businesses for storing and sharing content”
- Companies that just have a social media presence but do not operate a website with UGC functionality
- Companies that share a referral/discount code on social media with potential customers
Responsibilities of Ofcom
- Duty of care: Enforce a statutory duty of care to protect users from terrorist content and child sexual abuse material (CSAM), and ensure that online companies have the systems and processes to fulfil their duty of care
- Hold companies accountable: Hold companies to account if they do not tackle CSAM and terrorism online. The White Paper had said that the regulator would have an escalating range of powers to act against companies that do not fulfil their duty of care — notices and warnings, fines, business disruption measures, senior manager liability, and ISP blocking in the most egregious cases. While most respondents to the consultation agreed with this tiered approach, “industry and rights groups expressed some concerns about the impact of some of the measures on the UK’s attractiveness to the tech sector and on freedom of expression”
- Monitor new and emerging online dangers and take appropriate enforcement action
- Apply different regulation to illegal content/activity and to legal but harmful content
- Not investigate or adjudicate individual complaints, in order to safeguard freedom of expression; companies will be responsible for adjudicating legal content
- Run a transparent consultation process with stakeholders as it draws up regulations
Responsibilities of companies
- Swift removal of illegal content: Ensure quick removal of illegal content, especially CSAM and terrorist content, and minimise the risk of its reappearance. It is not clear if the scale of the company will lead to different expectations.
- Safeguard freedom of expression: Explicitly state what content and behaviour is acceptable on their sites in clear and accessible terms and conditions, and enforce them transparently. The regulations will NOT stop adults from posting legal content that may be considered offensive by some.
- User redress mechanisms: Have effective and proportionate user redress mechanisms so that users can report harmful content and challenge content takedowns
- Publish more detailed transparency reports that include reasons for content removal. Such reporting would be proportionate to the type of service, the size of and resources available to the company, and the risk factors involved.
- Protect children on their services. The government’s response does not specify what kind of “age assurance and age verification technologies” should be used. The government is working on interim codes of practice to tackle CSAM and terrorist content until Ofcom becomes operational as the regulator of online harms.
The government will also publish a transparency report, informed by the recently established multi-stakeholder Transparency Working Group, which is chaired by the Minister for Digital and Broadband.
What about electoral integrity?
On electoral integrity, the government is working with the Cabinet Office on the Defending Democracy programme.