WhatsApp banned 1,426,000 accounts in February 2022, down from 1,858,000 in January 2022, according to the platform’s compliance report. This is only the third time WhatsApp has banned fewer than two million accounts in a month since July 2021, when it began publishing periodic compliance reports in accordance with the IT Rules, 2021.
These accounts were banned based on WhatsApp’s detection of ‘abusive’ accounts with the help of digital tools. WhatsApp said that it checks for abuse at three stages: during account registration, during messaging, and in response to ‘negative feedback’.

WhatsApp is frequently used to spread misinformation in India. These reports offer insight into the scale of user complaints and the action the platform takes to address them, thereby bringing its content moderation practices into focus.
Decrease in WhatsApp user grievances
User grievances received by WhatsApp declined from 495 in January to 335 in February. These grievances pertain to ban appeals, product support, and other issues. Ban appeals fell from 285 in January to 194 in February. Here’s how WhatsApp dealt with the other user grievances it received.

Grievances related to product support and safety are marked ‘NA’ because they are redirected to the app’s in-built reporting feature, and actions taken there are not counted under “Accounts Actioned”, WhatsApp explained in the report.
The platform marks ‘NA’ in cases where:
- The user requires assistance from WhatsApp to access their account.
- The user requires assistance to use one of WhatsApp’s features.
- The user is writing to WhatsApp to provide feedback regarding its service.
- The user requests restoration of a banned account and the request is denied.
- The reported account does not violate Indian law or WhatsApp’s Terms of Service.
What the IT Rules 2021 require
WhatsApp publishes these reports in accordance with Rule 4(1)(d) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
The IT Rules require social media intermediaries to:
- Publish periodic compliance reports: These reports should be published every month and have details of complaints received, action taken, and “other relevant information”.
- Appoint key managerial roles: Significant social media intermediaries (with more than 50 lakh registered users) must appoint a chief compliance officer, nodal contact person, and resident grievance officer, all of whom must be Indian residents and employees of the platform.
- Proactively identify and take down content: This includes content moderation (through automated mechanisms) of posts that are defamatory, obscene, pornographic, paedophilic, invasive of privacy, or insulting or harassing on the basis of gender, among other categories.
- Disable content within 36 hours of a government order: The Rules also require intermediaries to provide information for verification of identity, or to assist any government agency with crime prevention and investigation, within 72 hours of receiving a lawful order. They must also preserve records of disabled content for 180 days.
