Users can now moderate potentially sensitive content across all platform services that make recommendations. On June 6th, Instagram announced updates to its ‘Sensitive Content Control’ feature. The tool can now be used to moderate content on all platform services that make recommendations to users, such as ‘Search, Reels, Accounts You Might Follow, Hashtag Pages and In-Feed Recommendations.’ The update will be available to users within a few weeks.

What is Sensitive Content Control?

The Explore page largely consists of content recommended to the user by Instagram, posted by accounts that the user does not follow. The content recommended to users is mediated by Instagram’s Recommendation Guidelines. There are clear content types that Instagram does not allow, such as hate speech or bullying, which would (largely) not be recommended on the Explore page. However, outside of these defined categories, certain types of recommended content may affect users based on their individual concerns. This kind of content is not intuitively harmful to a general audience, so it would not be banned. It may, however, still be ‘sensitive’ for specific users.

To counter this, Instagram launched Sensitive Content Control on July 20, 2021. The feature allows users to adjust how much content that is ‘sensitive’ to their individual experiences they’d like to see on their Explore page. Instagram argues the tool gives people ‘more choice over what they see.’

The feature launched with three options for users to moderate sensitive content. Selecting ‘More’ indicates seeing more sensitive content. ‘Standard’ indicates seeing some sensitive content. ‘Less’ indicates seeing…
