ShareChat has banned 50,000 profiles to keep pornographic, violent and fake content off its platform, reports the Economic Times. A ShareChat spokesperson told MediaNama that the platform ran a campaign six months ago asking users to flag/report problematic content. However, the company declined to disclose which languages the 50,000 banned users were spread across, or how many accounts were actually flagged by users.

  • The spokesperson said that a majority of the fake content is filtered out by a machine learning algorithm.
  • Additionally, if an account repeatedly violates ShareChat’s community guidelines, the account and even the device may be blocked by ShareChat’s content moderation team.
  • Once a device is banned, the user would be unable to create a new account on ShareChat from the same device.
  • ShareChat co-founder Farid Ahsan told ET that the company cooperates with law enforcement agencies on legitimate requests for user information.

ShareChat is an Indian-language messaging and social networking app, and currently supports 14 languages, including Hindi, Tamil, Rajasthani, Bengali, Assamese and Marathi. ShareChat claims to have over 35 million monthly active users.

In September last year, it raised Rs 720 crore from Shunwei Capital and existing investors Morningside Ventures of Hong Kong, Jesmond Global, Xiaomi, SAIF Partners and Lightspeed Venture Partners.

Changes to online safe harbor rules

The crackdown on fake content comes just as the government is looking to amend the rules under Section 79 of the Information Technology Act, which governs intermediary liability. The section provides safe harbor to platforms, including payments providers, e-commerce marketplaces, ISPs, and content or messaging services like ShareChat. The proposed amendments to the IT Rules would mean that intermediaries can be held more liable for the actions and behavior of their users.

Under the new rules, ShareChat would have to:

  1. Pull down unlawful content within 24 hours instead of the earlier timeframe of 36 hours, and keep records of the “unlawful activity” for 180 days – double the 90-day period in the 2011 rules.
  2. Provide traceability and information within 72 hours: the rules require platforms to provide requested information or assistance to government agencies within 72 hours.
  3. Register under the Companies Act: Platforms with over 50 lakh users are required to be registered under the Companies Act, have a physical address in the country, have a nodal officer who will cooperate with law enforcement agencies, etc.
  4. Inform users of the platform’s right to terminate usage rights and to remove non-compliant content at its own discretion.

Sneha adds: Fake news online is a large platform-and-information problem not just in India but globally. Platforms have been grappling with growing audiences, the increasing diversity of languages as those audiences grow, and an uptick in mobile adoption, especially in developing countries where users are mobile-first. As the internet has shifted to the phone, these first-time internet users have not spent enough time online to differentiate between original, fact-based information and fake news. And this will only become more problematic (it already is) as more and more people come online for the first time.

Read more here:

Our fake news and content regulation (1, 2) coverage.