The UN’s Special Rapporteur for Human Rights has put out a report on online content regulation. In it, David Kaye, the Rapporteur, recommended that human rights standards should be the foundation of all content regulation activities by governments and private companies that operate social media and other public platforms.
The Special Rapporteur is an independent expert appointed by the UN Human Rights Council.
The report says that companies need to be highly transparent about ‘content actions’: suspending users, deleting posts, or disclosing user data in response to government requests. “Given their impact on the public sphere, companies must open themselves up to public accountability,” the report says.
On government laws
The report warns against “broadly worded restrictive laws” targeting defamation, false news, propaganda and extremism. Such broad legislation can be used by governments to suppress legitimate discourse, the report said.
The Indian government’s track record of censoring films and online content illustrates Kaye’s point. From the now-struck-down Section 66A of the IT Act to the provisions that remain in force, broad legislation allows the government to regulate and censor, at its discretion, a wide variety of content. Such an environment is not conducive to freedom of expression, the report said.
Additionally, rules that fine companies for failing to act on their own to remove content are also a hindrance to free speech, the report says. “Such rules [that require companies to act to remove content] involve risks to freedom of expression, putting significant pressure on companies such that they may remove lawful content in a broad effort to avoid liability.” The report cites China’s Cybersecurity Law as an example.
On governments acting outside the law
“State authorities increasingly seek content removals outside of legal process or even through terms of service requests,” the report pointed out. The report cited the EU’s Internet Referral Unit, which flags content on social media that is deemed to be extremist or terrorist-related.
Another example was Pakistan’s three-year ban on YouTube, which the report described as a ‘non-binding effort to accelerate content removals’. To get the ban lifted, YouTube created a localised version of the site that was friendlier to the Pakistani government’s content removal requests.
The bulk of removals on social media sites results from violations of those sites’ own terms of service. As the report pointed out, however, “[…] companies do not consistently disclose sufficient information about how they respond to government requests, nor do they regularly report government requests made under terms of service.”
Since most companies do not base their community standards on any speech laws or national legislation, the report said, there is a lot more uncertainty around those removals.
This has led to documented discrimination against minorities, the report said. “Users and civil society report violence and abuse against women, including physical threats, misogynist comments, the posting of non-consensual or fake intimate images and doxing; threats of harm against the politically disenfranchised, minority races and castes and ethnic groups suffering from violent persecution; and abuse directed at refugees, migrants and asylum seekers.”
Lack of transparency
The report noted that, aside from government requests, few ‘content actions’ are transparently disclosed, either to the wider community or to the users affected. If a platform restricts content, the bases for those restrictions should be transparently disclosed, the report urged.
The report cited a Ranking Digital Rights report, pointing out that “Companies disclose the least amount of information about how private rules and mechanisms for self- and co-regulation are formulated and carried out.”
Dealing with fake news and disinformation through ‘blunt’ means such as website blocking and removals poses a risk of ‘serious interference with freedom of expression’, the report warned.
“Some measures […] may threaten independent and alternative news sources or satirical content,” the report said. It concluded, “Government authorities have taken positions that may reflect outsized expectations about technology’s power to solve such problems alone.”
For countries (“states”)
— Repeal all laws that unduly threaten free speech, online or offline.
— Regulate intelligently rather than heavy-handedly.
— Don’t write laws that encourage proactive monitoring or filtering of content, which can lead to pre-censorship.
— Publish transparency reports on all content-related requests.
— Don’t delegate the responsibility of adjudicating content to companies.
For companies
— Human rights, not national laws or private interests, should be the main basis for ensuring freedom of expression.
— Be radically more transparent than at present.
— Open yourselves up to public accountability.