Meta has partnered with Hyderabad-based fact-checker NewsMeter to combat misinformation in regional languages. Enhanced third-party fact-checking services will now be available in regional languages including Tamil, Malayalam, Kannada, and Telugu, as per a press release from earlier this week. Meta noted that this collaboration marks its 11th fact-checking partnership in India. India, its largest market, now has the ‘most third-party fact-checking partners globally across Meta.’
Why it matters: This announcement comes just weeks after Meta released its first Human Rights Report, in which moderating content in regional languages was cited as a particular challenge for its India operations. The report further noted ‘the potential for Meta’s platforms [in India] to be connected to salient human rights risks caused by third parties, including: restrictions of freedom of expression and information; third party advocacy of hatred that incites hostility, discrimination, or violence; rights to non-discrimination; as well as violations of rights to privacy and security of person.’ These efforts, which come off the back of widespread criticism of Meta’s content moderation in India, may mark an attempt to tackle such violations fueled by harmful speech.
Meta has also expanded its fact-checking services in other languages. Aside from the 11 languages already monitored, existing third-party fact-checkers will now assess content in Kashmiri, Nepali, Bhojpuri, and Oriya. The tech giant has also partnered with the Internet and Mobile Association of India to fund a fact-checking fellowship exclusively for Indian media houses.
Currently, whenever a fact-checker rates content as ‘false, altered or partly false’, Meta reduces its distribution and visibility. The people who shared the content are notified of this decision, and warning labels are added to the disputed information.
Reports from last year suggest that Meta (previously Facebook) has struggled to fact-check all contested information on its Indian platforms due to resource crunches and insufficient familiarity with regional languages and cultural context. That said, fact-checkers have stated that they don’t actually know how the company acts once they flag content as ‘fake’.
Others have suggested that the platform takes a relaxed approach to organised disinformation, especially when it is propagated by powerful political parties. Facebook’s inaction on harmful content posted by politicians affiliated with the ruling Bharatiya Janata Party has been repeatedly raised by Facebook whistleblowers Frances Haugen and Sophie Zhang.
Despite these allegations, as per Meta’s Human Rights Report, the independent assessors tracking its impacts on human rights in India ‘did not assess or reach conclusions about’ allegations of bias in content moderation. The report was widely criticised for glossing over Meta’s impacts on human rights in India. Some commentators allege that Meta did not ‘allow’ the independent investigators compiling the assessment to speak to Haugen about its operations in India. Others argue that, as Meta has done in other jurisdictions, it should have published the full text of the human rights impact assessment rather than a generalised summary.
Read More
- Meta’s Independent India Human Rights Impact Assessment Summarised In New Report
- What Facebook Didn’t Do Against A BJP MP’s Activities
- Summary: New Report On How Platforms Should Handle Disinformation
I'm interested in stories that explore how countries use the law to govern technology—and what this tells us about how they perceive tech and its impacts on society. To chat, for feedback, or to leave a tip: aarathi@medianama.com
