We missed this earlier.

Facebook said that it is looking to reduce the amount of sensational or misleading health-related content, and of posts promoting products or services based on such claims, in users' News Feeds as part of its latest content-ranking algorithm update, rolled out last month. Facebook's announcement came after a Wall Street Journal report (paywall) highlighted how the social media site and YouTube were awash with content containing potentially harmful information about alternative cancer treatments.

WSJ found that supplement salesman Robert O. Young, whose Facebook page has more than 60,000 likes and followers, suggested in a post that cancer could be cured with baking soda injections and certain liquid concoctions. It also noted that Young was convicted in a San Diego County court in 2016 for practicing medicine without a license. The report noted that in July 2012, Naima Houder Mohamed, a cancer patient, met Young after seeing his videos on YouTube; she died of metastatic breast carcinoma in November 2012 after undergoing Young's treatment, which cost $70,000. WSJ also found that such widespread misinformation sometimes appeared alongside advertisements, videos or pages for proven treatments.

How the update works and how it will affect page reach

Facebook’s updated algorithm will reduce the reach of such posts by identifying commonly used phrases and predicting which posts are likely to include sensational health claims or promote products with health-related claims. These posts will then be shown lower in a user’s News Feed. Note that Facebook has other options here, apart from showing posts lower in someone’s feed: it can also not show the posts to users at all, or reduce the reach of such posts by showing them to fewer users.
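Facebook has not published how its prediction model works; as a minimal sketch under assumed details, the phrase-based demotion described above might look something like the following, where the flagged phrases, penalty factor and scoring are all hypothetical:

```python
# Hypothetical sketch only: Facebook has not disclosed its model.
# We assume a simple phrase-match score and a fixed demotion factor.

SENSATIONAL_PHRASES = ["miracle cure", "cures cancer", "doctors hate"]

def health_claim_score(text: str) -> int:
    """Count occurrences of (assumed) flagged phrases in a post."""
    t = text.lower()
    return sum(t.count(phrase) for phrase in SENSATIONAL_PHRASES)

def rank_feed(posts):
    """Demote posts that look like sensational health claims.

    `posts` is a list of (post_id, base_score, text) tuples; a flagged
    post keeps its identity but its ranking score is cut sharply.
    """
    ranked = []
    for post_id, score, text in posts:
        if health_claim_score(text) > 0:
            score *= 0.1  # assumed penalty, not Facebook's actual value
        ranked.append((post_id, score))
    # Higher score = shown higher in the feed
    return sorted(ranked, key=lambda item: item[1], reverse=True)
```

A flagged post is not removed here, only pushed down, which mirrors the "shown lower in a user's News Feed" behaviour rather than outright deletion.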

The firm said that most pages won’t experience any significant changes to their reach. However, pages that have posts with sensational health claims “will have reduced distribution”. Once these pages stop posting such content, their posts will no longer be affected by this update, though Facebook hasn’t specified how long it will take for such pages to return to normal distribution.

Facebook’s constant algorithm updates

Earlier this month, Facebook said that it’ll ban ads which discourage people from voting during the 2020 US presidential elections. Announced as part of its second civil rights audit, the ban on “Don’t Vote” ads will come into effect before the 2019 gubernatorial elections, scheduled for November.

Then in June, Facebook announced changes that limit the spread of messages in Sri Lanka and Myanmar, where it has come under fire in recent years. It said it was “adding friction” to message forwarding for Messenger users in Sri Lanka so that people could only share a particular message a certain number of times. An image included in the blog post suggests the limit is currently set to five people. The change is similar to the one Facebook made to WhatsApp earlier this year to reduce forwarded messages around the world. Facebook said the change “also delivers on user feedback that most people don’t want to receive chain messages”.
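Facebook hasn't detailed how the cap is enforced; a minimal sketch of a per-message forward limit, assuming the five-recipient cap suggested by the blog post's image and an invented counter mechanism, could look like:

```python
# Hypothetical sketch of the "adding friction" forward limit.
# The cap of 5 comes from the blog post's image; the counter
# mechanism below is an assumption, not Facebook's implementation.

FORWARD_LIMIT = 5

class Message:
    def __init__(self, message_id: str):
        self.message_id = message_id
        self.forward_count = 0

def try_forward(message: Message, recipient: str) -> bool:
    """Allow a forward only while the message is under the cap."""
    if message.forward_count >= FORWARD_LIMIT:
        return False  # friction: the sixth forward is blocked
    message.forward_count += 1
    return True
```

The point of such a design is that virality decays per message: each copy can only fan out a bounded number of times, which slows chain-message spread without banning sharing outright.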

The same month, Facebook updated its policy for ranking comments on public posts; it will now show a comment more prominently when it has interactions from the Page or person who originally posted it, or has comments and reactions from friends of the original poster. Facebook ranks public comments for Pages and people with large followings, so that only “relevant and quality” comments show up.
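Facebook hasn't published the weights behind these signals; as a hedged sketch, the two boosts described above (original-poster interaction, reactions from the poster's friends) could be combined into a ranking score like this, where every weight is an assumption:

```python
# Hypothetical sketch of the comment-ranking signals described above.
# The boost values (10 and 2) are invented for illustration; Facebook
# has not disclosed how these signals are actually weighted.

def comment_score(likes: int, poster_interacted: bool,
                  friend_reactions: int) -> int:
    score = likes
    if poster_interacted:
        score += 10  # assumed boost for original-poster interaction
    score += 2 * friend_reactions  # assumed weight for friends' reactions
    return score

def rank_comments(comments):
    """`comments`: list of dicts with keys 'id', 'likes',
    'poster_interacted', 'friend_reactions'."""
    return sorted(
        comments,
        key=lambda c: comment_score(
            c["likes"], c["poster_interacted"], c["friend_reactions"]),
        reverse=True,
    )
```

Under such a scheme, a lightly-liked comment that the original poster replied to can outrank a more-liked comment with no such interaction, matching the behaviour the update describes.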