In a blog post, Google has offered ideas for approaching oversight of content-sharing platforms, which it calls a “shared responsibility” of governments, tech platforms and civil society. The company presented the following ideas to keep in mind while formulating laws that oversee content-sharing platforms:

Governments must draw clear lines between legal and illegal speech

  • While content-sharing platforms are working to develop and enforce responsible content policies, the onus is on governments to “draw clear lines” between legal and illegal speech.
  • These lines should be drawn based on evidence and should be consistent with democratic accountability and international human rights.
  • Without clear definitions, there is a risk of arbitrary or opaque enforcement that limits access to legitimate information.

Different approaches for different platforms and types of content

  • Different online services serve different purposes, and laws overseeing online content should keep this in mind.
  • Rules that make sense for content-sharing platforms may not be appropriate for search engines, enterprise services, file storage, communication tools, or other online services, where users have fundamentally different expectations and applications.
  • Different types of content may likewise call for different approaches.

Accountability through transparency

  • Meaningful transparency promotes accountability; it can also encourage best practices, facilitate research, and spur innovation, without enabling abuse of processes.

Flexible policies that recognise varying needs and capabilities

  • While Google and others have “pushed the boundaries of computer science in identifying and removing problematic content at scale”, these advances require flexible legal frameworks, not static or one-size-fits-all mandates.
  • Likewise, legal approaches should recognise the varying needs and capabilities of startups and smaller companies.

Focus on overall results, not anecdotes

  • The scope and complexity of modern platforms call for a data-driven approach that focuses on overall results rather than anecdotes.
  • While it is impossible to eliminate all problematic content, progress in making that content less prominent should be recognised. Reviews under the European Union’s codes on hate speech and disinformation offer a useful example of assessing overall progress against a complex set of goals.

Countries shouldn’t be able to impose their content restrictions on others

  • While there is broad global consensus on issues such as child sexual abuse imagery, in other areas individual countries will make their own choices about the limits of permissible speech, and one country should not be able to impose its content restrictions on another.

Read: The Future of Internet policy in India – 2019

Read: A serious and imminent threat to the open Internet in India