Facebook will, “in the coming weeks”, start telling users if they liked, reacted to, or commented on “harmful misinformation” about COVID-19 that was removed by its moderators, the company said. Users who engaged with such content will soon see anti-misinformation messages in their news feed, connecting them to COVID-19-related myths debunked by the World Health Organisation. In the US, Facebook has also started showing fact-checked articles in its COVID-19 Information Centre.

40 million misinformation posts flagged in March: Facebook also revealed that in March, it displayed warnings on about 40 million posts, based on around 4,000 articles by its independent fact-checking partners. It claimed that upon seeing those warnings, people did not go on to view the original content 95% of the time. However, as The Verge pointed out, there is little clarity about normal clickthrough rates on such content, so the figure is hard to evaluate. Facebook claimed it has expanded its fact-checking coverage to “more than a dozen new countries” and now works with “over 60 fact-checking organisations that review content in more than 50 languages”. On Facebook and Instagram, the company claimed to have directed more than 2 billion people to authoritative health resources via its COVID-19 Information Centre and educational pop-ups, with more than 350 million people clicking through to learn more.

Report calls misinformation on Facebook dangerous: Facebook’s updates to its misinformation policy come after a report by Avaaz, “an online activist network”, claimed that “misinformation about the coronavirus on Facebook could potentially cost lives and…
