
How Should Safe Harbour Laws Change to Regulate Platforms and Protect Users on Today’s Internet? #NAMA

Panelists at MarketsNama discussed refining safe harbour laws for platform accountability and consumer safety

With inputs from Vallari Sanzgiri

“What we’ve been trying to do in India is hang the sword of taking away safe harbour [over platforms’ heads],” argued The Quantum Hub’s Rohit Kumar at the “Safe Harbour v2: What Should It Look Like?” panel at MediaNama’s recent MarketsNama conference. “That is too strong a penalty and it opens up the platform to all sorts of harm and losses that might put investments at risk and cause massive damage to the ecosystem.”

Kumar’s comments follow IT Minister Rajeev Chandrasekhar’s repeated calls to nix safe harbour protections from upcoming IT laws. Remember: safe harbour prevents platforms from being held liable for the third-party content they host, a protection that Chandrasekhar believes has led to “toxicity” on the Indian Internet.

However, without safe harbour, innovation on the Internet could slowly grind to a halt—which is why Kumar suggested that safe harbour regimes be refined to actually support consumer safety online.

“Safe harbour is very important in terms of protecting people from platforms, [and] from actions that others take using a platform. But, it needs to come with obligations that force platforms to set up systems to make sure accountability improves,” argued Kumar. “For instance, there’s transparency, grievance processes, time-bound responses [to complaints]. Safe harbour with certain obligations that force companies to change the way they interact with consumer communities is essential. This, in itself, doesn’t necessarily stop innovation or shapeshifting [by platforms].”

MediaNama hosted this discussion with support from Salesforce, Google, Mozilla, and the Internet Freedom Foundation, and our community partners, the Centre for Internet and Society and the Alliance of Digital India Foundation.

Is platform liability for content the most effective way to ensure platform accountability?: To answer that, we first need to remember that a platform’s liability for third-party content is a post facto determination made in court, explained Vasudev Devadasan of the National Law University Delhi’s Centre for Communication Governance. This is what safe harbour regulates. “[That is] What are the conditions that prevent you from being held liable in court?” continued Devadasan. “[However] I don’t think safe harbour and liability, that is suing platforms for content and setting conditions on which they are entitled to safe harbour, is a great way of ensuring [accountable] systems design processes and enforcement. One, enforcement only works when people sue. It’s not systematic and only works against the platform that’s being sued. It’s ad hoc and fundamentally based on the enforcement mechanism, which is lawsuits.”

Safe harbour laws should focus on protecting consumers: “It’s my firm belief that safe harbour cannot be a free pass to platforms,” said Snap’s Uthara Ganesh. “Second, I think that [a safe harbour] regulation, instead of being prescriptive and focusing on how something should be achieved, should really ask platforms to have robust systems and processes [in place] that ensure that they are protecting their responsibilities … Regulations should ask for that without prescribing how to do it.”

Why only link platform accountability to safe harbour?: Platform obligations under India’s platform regulation rules—which have to be followed to retain safe harbour—can instead be distinct positive duties on platforms and separated from the safe harbour regime, Devadasan suggested.

Take the example of publishing a transparency report as a statutory obligation under this model, Devadasan suggested. “If you don’t publish one, you could pay a one-off fine, and that could be determined as a civil penalty,” he said. “[For the platform] You know what the fine is upfront. This is opposed to the risk of [losing] safe harbour [for illegal content]. Take [the example of] neo-Nazi content—you could be sued for millions of dollars, there’s a lot of regulatory uncertainty, and you have millions of pieces of content on your platform [to scan]. This is very different from a statutory obligation of publishing a transparency report. This [distinguishing between positive duties and safe harbour obligations] allows for regulatory certainty, and it doesn’t take away your safe harbour or diminish your safety. [This way] If you are sued for neo-Nazi content, you still have your safe harbour and you can strengthen that.”

Currently, safe harbour puts the costs of non-compliance on the aggrieved: One of the reasons why most of the panelists want these positive obligations is because they recognise that the safe harbour construct currently puts the cost on the aggrieved party if something goes wrong, observed Kumar.

“By creating this environment where you [as a platform] are taking actions that prevent that [user] harm from happening, you are actually making it easier for people who may not have the resources or may not have the wherewithal to hold platforms accountable [to function online],” he concluded. This makes the ecosystem safer for them.

Positive obligations can lead to double penalisation of platforms: An audience member noted that having positive obligations, alongside a safe harbour regime, can lead to platforms being doubly penalised. “Trying to create risk itself [when running a platform] has itself become a harm,” they argued. 

Devadasan rebutted, arguing that in today’s world, platforms causing harm can’t necessarily be regulated strictly through liability and safe harbour clauses. “Take something like a recommendation system that privileges inflammatory content,” he suggested. “You’d have to conduct an empirical survey to determine if this system is actually pushing inflammatory content or not [to prove liability], which is a very hard thing to do. You’ll probably never be able to prove it, because it’s a systems-based harm. Whereas [this is opposed to] an individual piece of content that you can [actually] sue. So, the liability and harm route isn’t as simple anymore—to solve system problems, you need to impose obligations on [platform] designs, processes, and systems, and that’s why positive obligations work.”

“Look at any other complex, regulated environment,” Devadasan continued. “Would you want a pharmaceutical company to conduct clinical trials to make sure the drug is safe [before release]? Or, would you leave conducting clinical trials only to when the drug is released, somebody is harmed, and they can [potentially] sue you? When you have these complex systems … why wait until something happens and they’re sued, when you can [instead] incentivise them … [through] these positive obligations.”

Robust enforcement required: However, changing safe harbour regulations can be meaningless if they lack robust enforcement and transparency requirements, argued Ganesh. “In spirit and effect, what the law should strive to do is ensure that there is a strong culture of enforcement against this [that is, positive duties],” she continued. “One way of thinking about it is having transparency provision and strong disclosures. These two things would have a powerful effect on ensuring that the trust question is at least [broached].”

What positive obligations should you impose on platforms?: The model also raises the question of which design and process obligations to impose on platforms as positive duties. “At some level, if we are imposing these positive obligations, then we [also] need to start defining who these obligations pertain to,” Devadasan concluded.


This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ