
IT Rules 2021: How live streaming content on social media will be impacted

Under India’s new social media rules, social media companies with more than 5 million users will potentially have to proactively monitor live or real-time content, such as YouTube Live or Facebook Live streams. Legal and technical experts MediaNama spoke to concurred that this might lead to platforms over-censoring content, would be difficult to get right given the limitations of current automated detection tools, and would impose major technical costs on smaller companies.

“The rule is very badly drafted,” said Divij Joshi, an independent lawyer, researcher, and tech policy fellow at Mozilla. “The rules don’t make a distinction between ‘live’ and non-live media. On a textual reading, they will apply to real-time uploads, but in practice it can be more difficult to monitor and take down infringing/illegal content as it is happening.”

Why this matters: Content that the rules treat as problematic could be streamed live on platforms like YouTube and Facebook. For instance, in 2019, shootings at mosques in New Zealand’s Christchurch were streamed live on Facebook, and the disturbing visuals remained on the platform for around an hour before being taken down.

A bit of context: The rules, among other things, require that social media companies with more than 5 million registered users “shall endeavour” to deploy technology-based measures, such as automated tools, to proactively identify information that depicts rape, child sexual abuse material (CSAM), or any information that is “exactly identical” to information that was previously removed or access to which was disabled. Content taken down through such tools would have to be flagged for users trying to access it later. Also:

  • Proportionate to free speech: The measures taken by intermediaries would have to be proportionate with regard to the right to free speech and expression and user privacy.
  • Measures will need human oversight: The measures implemented by intermediaries will need to have human oversight, including a periodic review of any automated tools deployed as part of them.
  • Automated tools should tackle issue of bias: The review of the automated tools would evaluate their accuracy and fairness, propensity for bias and discrimination, and their impact on privacy and security.
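To illustrate the narrowest reading of the “exactly identical” requirement, here is a minimal, hypothetical sketch of a “stay down” check built on cryptographic hashes. This is not any platform’s actual system; it only shows why exact matching catches byte-for-byte re-uploads and nothing else.

```python
import hashlib

# Hypothetical "stay down" check: record hashes of previously removed
# content and block re-uploads that are exactly identical. A plain
# SHA-256 match only catches byte-for-byte copies -- any re-encoding,
# cropping, or trivial edit produces a different hash and slips through.

removed_hashes: set[str] = set()

def flag_as_removed(content: bytes) -> None:
    """Record the hash of content that was previously taken down."""
    removed_hashes.add(hashlib.sha256(content).hexdigest())

def is_previously_removed(upload: bytes) -> bool:
    """True only if this upload is exactly identical to removed content."""
    return hashlib.sha256(upload).hexdigest() in removed_hashes

flag_as_removed(b"offending-video-bytes")
assert is_previously_removed(b"offending-video-bytes")       # exact copy: caught
assert not is_previously_removed(b"offending-video-bytes!")  # one byte differs: missed
```

The gap between this exact-match reading and the fuzzier similarity matching discussed later in the article is part of why experts call the drafting ambiguous.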

Google India said that its teams were still reviewing the details outlined in the new rules. Facebook didn’t respond to our queries until publication.

Potential of over-censoring

The rules not only mandate automated takedowns for content like child sexual abuse material—which is relatively uncontroversial—but, more importantly, impose a ‘stay down’ requirement for speech that has previously been deemed illegal, said Joshi. This could be dangerous since it can “potentially require the censorship of many kinds of media and content which was deemed illegal in one context but may not be illegal in another,” he added.


Automated tools have difficulty understanding the context of certain speech, and as a result could end up impacting legitimate speech, Shashank Mohan, project manager at the Centre for Communication Governance, NLU Delhi, told MediaNama. “Since automated filters are known to have an error rate in the identification of specific content, especially in circumstances where there may be varying context, this requirement could impact legitimate speech such as sharing of violent or terrorist content for news reporting purposes,” he added.

“Algorithmic technologies which censor media always pose the risk of taking down legal and legitimate content. This is because algorithms, no matter how sophisticated, cannot comprehend speech in the way in which humans do, which is heavily context-specific. The word ‘Fire’ can be illegal content in one scenario (where it causes a stampede in a crowded theatre) or legal content in another – an algorithm cannot tell the difference.” — Divij Joshi, independent lawyer, researcher, and tech policy fellow at Mozilla
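Joshi’s ‘fire’ example can be made concrete with a deliberately naive, hypothetical keyword filter — a toy illustration, not any platform’s real moderation system. A context-blind rule flags the word wherever it appears, sweeping up clearly legitimate speech along with the harmful case:

```python
# Toy context-blind filter: flag any post containing a blocklisted word.
# Because the rule sees only the word, not the situation, it cannot
# distinguish a dangerous shout from an innocuous news report.

BLOCKLIST = {"fire"}

def is_flagged(text: str) -> bool:
    """True if any blocklisted word appears in the text."""
    words = text.lower().replace("!", " ").replace(".", " ").split()
    return any(word in BLOCKLIST for word in words)

assert is_flagged("Fire! Everyone get out!")             # potentially harmful: flagged
assert is_flagged("The fire brigade rescued residents")  # legitimate news: also flagged
assert not is_flagged("A quiet evening at the theatre")  # unrelated: passes
```

Production classifiers are far more sophisticated than a word list, but the underlying limitation Joshi describes — that meaning depends on context the model cannot fully see — is the same.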

The burden of justifying why a certain content takedown was incorrect will fall on the users whose content has been censored, Mohan said. “Although the sub-rule does provide for respect of free speech and privacy of users, the burden will fall upon everyday users to justify that a particular takedown violates their rights to free speech. This may be even more burdensome in cases where social media intermediaries have taken down content based on government orders,” he added.

‘Could cause delays in live streams’

“As of now the tools to do real-time content moderation haven’t reached a very high degree of accuracy and still will leave some edge cases,” Vikas Malpani, co-founder and CEO of Leher app, told MediaNama. Leher is an app similar to Clubhouse, but also allows users to have video-based conversations.

He said that deploying automated tools for monitoring live or real-time content will raise many issues. The process, Malpani said, will be cost intensive, will probably delay the live stream and create a bad user experience, may not be accurate in handling all types of imagery, and will handle poorly any languages the tool hasn’t been fully trained on. The tool could also throw up a lot of false positives, which would force the company to also maintain an army of human moderators, Malpani added.

“The proactive monitoring requirement appears to essentially require the implementation of watermarking and fingerprinting technologies – which will create a ‘fingerprint’ of illegal content in a database and take down all content or media within a range of similarity to that fingerprint. For live content, the major difficulty is in the time taken for this technology to either flag, or automatically remove, any content which matches with the fingerprint. Additionally, minor changes in the content can help it escape these fingerprinting technologies.” — Divij Joshi
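Both halves of Joshi’s point — matching “within a range of similarity” and evasion through minor changes — can be sketched with a toy perceptual fingerprint. This is a hypothetical miniature, not PhotoDNA or any deployed system: it hashes content by coarse structure and accepts anything within a small Hamming distance of a known-bad fingerprint.

```python
# Toy perceptual fingerprint: 1 for each pixel brighter than the mean.
# Matching tolerates small noise (re-encoding artifacts) but deliberate
# alterations can push content past the similarity threshold.

def fingerprint(pixels: list[int]) -> list[int]:
    """Coarse structural hash: 1 per pixel above mean brightness."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a: list[int], b: list[int]) -> int:
    """Number of positions where two fingerprints differ."""
    return sum(x != y for x, y in zip(a, b))

def matches(pixels: list[int], bad_fp: list[int], threshold: int = 1) -> bool:
    """True if the content is within the similarity range of known-bad media."""
    return hamming(fingerprint(pixels), bad_fp) <= threshold

bad_fp = fingerprint([10, 200, 30, 220, 15, 210, 25, 230])

# A lightly re-encoded copy (small pixel noise) still matches...
assert matches([12, 198, 33, 221, 14, 213, 22, 228], bad_fp)
# ...but deliberately restructuring the content escapes the fingerprint.
assert not matches([200, 10, 220, 30, 210, 15, 230, 25], bad_fp)
```

The live-streaming difficulty Joshi raises sits on top of this: every frame (or chunk of frames) must be fingerprinted and compared against the database fast enough to act before the content reaches viewers.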

‘Let communities self-moderate’: Instead of relying on automated tools, Malpani said that users on a platform should be allowed to self-moderate. To deal with the issue of inappropriate content, platforms could enable real-time reporting by users, or a reputation-based system that limits the reach of certain people on the platform to those with whom the stream is explicitly shared. Users who wish to stream live content could also be asked to prove their identity before going live, he added.
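One way to picture the reputation-based system Malpani suggests is a score that scales a streamer’s maximum live audience and drops when reports against them are upheld. The thresholds and penalty below are entirely hypothetical, chosen only to make the mechanics concrete:

```python
# Hypothetical reputation-based reach limiting: a streamer's audience
# cap scales with a reputation score in [0, 1], and each upheld user
# report lowers that score, shrinking their future live reach.

def allowed_reach(reputation: float, base_audience: int = 10_000) -> int:
    """Maximum concurrent live viewers permitted at this reputation."""
    reputation = max(0.0, min(1.0, reputation))  # clamp to [0, 1]
    return int(base_audience * reputation)

def apply_upheld_report(reputation: float, penalty: float = 0.2) -> float:
    """Reduce reputation after a report against the stream is upheld."""
    return max(0.0, reputation - penalty)

rep = 1.0
assert allowed_reach(rep) == 10_000   # trusted streamer: full reach
rep = apply_upheld_report(rep)
assert allowed_reach(rep) == 8_000    # reach shrinks after one upheld report
```

The design trade-off is that moderation load shifts from automated pre-screening to the community: bad streams are limited after reports rather than blocked before broadcast.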





© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
