What’s wrong with current online content regulations and how can they be improved: UN

A UN OHCHR official shared her views on freedom of speech, social media companies, and online content regulations around the world, including India's IT Rules.

“State regulation, if done precipitously, sloppily, or with ill intent, can easily consolidate undemocratic and discriminatory approaches that limit free speech, suppress dissent, and undermine a variety of other rights,” Peggy Hicks of the United Nations (UN) Human Rights Office said in a press conference on July 14.

Here are the UN's views on what is wrong with the current regulations on online content and how better regulations can be made.

What are the current challenges to freedom of speech online?

Unwelcoming and unsafe: "We have the same rights online as offline. But when we look at the online landscape, we see a digital world that is unwelcoming and frequently unsafe for people trying to exercise their rights," Hicks said.

Reach and speed: While the offline space has also faced similar challenges, “what is new is the reach and speed of this digital public square,” Hicks said.

Situation made worse by governments and companies: Notwithstanding the challenges that we already face online, responses from governments and social media companies "risk making the situation worse," Hicks said, citing examples of countries like India, Nigeria, the UK, the US, and Vietnam. "Discussions on how to address lawful but awful speech online tend to devolve into finger-pointing between states and companies, with political and economic interests often eclipsing public interests," she added.

Why are current regulations making the situation worse?

“There are many examples of problematic legislation on online content. To give you an idea, about 40 new laws relating to social media have been adopted worldwide in just the past two years, and another 30 are under consideration. Virtually every country that has adopted laws related to online content has jeopardized human rights in doing so. This happens both because governments respond to public pressure by rushing in with simple solutions for complex problems and also because some governments see this legislation as a way to limit speech they dislike and even to silence civil society or other critics.” – Hicks (emphasis ours)

Overbroad and ill-defined laws: "In June, Vietnam adopted a new social media code that prohibits posts, for example, that 'affect the interests of the state'. Laws in Australia, Bangladesh, Singapore, and many other locations include overbroad and ill-defined language of this sort. And the list keeps growing," Hicks added.

Rushed legislation: "The United Kingdom in May tabled its draft online safety bill, which has a worryingly overbroad standard that could lead to the removal of significant amounts of protected speech," Hicks said. "In the wake of the abhorrent abuse of black English football players earlier this week, there are demands to get that legislation into place more quickly, as if the bill could have somehow protected the players from the racism they faced," Hicks added.

Views on India’s IT Rules

India’s new IT Rules, which were notified on 25 February 2021, contain sweeping regulations that apply to social media platforms, online streaming platforms, and digital news media.

“This new law introduced some useful obligations for companies relating to transparency and redress, but a number of provisions raise significant concerns, including those empowering non-judicial authorities to request quick takedowns, obliging platforms to identify originators of messages, and stipulating that companies must appoint local representatives whose potential liability could threaten the ability to protect speech and even to operate.” – Marcelo Daher

“The threat of limiting protected speech and privacy has already surfaced there [India], including legal disputes with both Twitter and WhatsApp in the past month, which are now before the courts,” Daher added.

In June, in a letter to the Indian government, the UN Special Rapporteurs on freedom of expression, privacy, and the right to peaceful assembly said that the recently notified IT Rules 2021 "do not appear to meet the requirements of international law and standards related to the rights to privacy and to freedom of opinion and expression."

What problems do most regulations suffer from?

According to the UN Human Rights Office, most regulations of online content suffer from many of the same problems, namely:

  1. Poor definitions of what constitutes unlawful or harmful content
  2. Outsourcing of regulatory functions to companies
  3. Overemphasis on content takedowns and the imposition of unrealistic time frames
  4. Excessive powers granted to state officials to remove content without judicial oversight
  5. Overreliance on algorithms and artificial intelligence

How can better regulations be made?

“We have one overarching message we’d like to bring to this debate, and that is the critical importance of adopting human rights-based approaches to confronting these challenges. It is, of course, the only internationally agreed framework that allows us to do that effectively.” – Peggy Hicks

UN human rights officer Marcelo Daher outlined five ways governments can make better regulations:

  1. Focus on process, not content: “Look at how content is being amplified or restricted. Ensure actual people, not algorithms review content decisions.”
  2. Laws should be narrowly tailored: "Ensure content-based restrictions are based on laws that are clear and narrowly tailored, and are necessary, proportionate, and non-discriminatory. To be proportionate, restrictions should be the least intrusive methods available."
  3. Be transparent: “Companies should be transparent about how they operate and moderate content and how they share information with others. States also should be transparent about their requests to take down content or to access users’ data.”
  4. Give users opportunities to appeal: “Ensure users have effective opportunities to appeal against decisions they consider to be unfair and make good remedies available for when actions by companies or states undermine their rights. Independent courts should have the final say over the lawfulness of content.”
  5. Involve various stakeholders: "Make sure civil society and experts are involved in the design and evaluation of all regulations. Participation is essential."

Is there an ideal regulation that can be replicated?

One of the questions posed to the UN Human Rights Office was whether there is a model of legislation that the UN would like to see replicated. "That's one of the things that we're really looking for," Hicks said. "To try to get some of the states that are legislating right now to really engage in a serious, thoughtful process, to bring in and consult with those that have expertise on these issues, to learn from the examples that we're citing here and many others to really make a better law," Hicks added.

Hicks added that there are signs of such legislation coming from the European Union with the Digital Services Act, but admitted that there is still some work to be done on it. One of the issues with the Digital Services Act appears to be that law enforcement agencies will be able to remove content by bypassing some of the procedural standards and judicial review that the UN considers necessary, she said.

What should social media companies do?

Be more transparent: "Social media companies have become something of a punching bag for everything that goes wrong. They are harshly criticized for failing to take down harmful content and often face equally severe abuse when they actually do so," Daher said. But "much of this criticism is justified" because "companies open themselves up for such complaints by their ill-defined and opaque policies and processes," he added. "Companies need to do much more to be transparent and actively share information about their actions and company policies and processes," Hicks added.

Actions should be proportionate to risks: “The actions companies take should be proportionate to the severity of the risk. Their options include a range of measures, not just takedowns, but flagging content, limiting amplification and attaching warning labels,” Hicks said.

Think about how to address issues globally: “Companies also need to grapple with how they address content moderation issues globally. Context is essential to understanding the potential of speech to incite violence,” Hicks said.
