Users can now moderate potentially sensitive content across all platform services that make recommendations.
On June 6th, Instagram announced updates to its ‘Sensitive Content Control’ feature. The tool can now be used to moderate content on all platform services that make recommendations to users, such as ‘Search, Reels, Accounts You Might Follow, Hashtag Pages and In-Feed Recommendations.’ The update will roll out to users within a few weeks.
What is Sensitive Content Control?
The Explore page largely consists of content recommended to the user by Instagram, posted by accounts that the user does not follow.
- The content recommended to users is mediated by Instagram’s Recommendation Guidelines. Clearly defined content types that Instagram does not allow, such as hate speech or bullying, are (largely) not recommended on the Explore page.
- However, outside these defined categories, certain types of recommended content may affect users based on their individual concerns. Such content is not obviously harmful to a general audience, so it is not banned; it may nonetheless be ‘sensitive’ for specific users.
- To address this, Instagram launched Sensitive Content Control on July 20, 2021. The feature allows users to adjust how much content they consider ‘sensitive’ appears on their Explore page.
- Instagram argues the tool gives people ‘more choice over what they see.’ The feature launched with three options for moderating sensitive content: ‘More’ shows more sensitive content, ‘Standard’ shows some, and ‘Less’ shows less.
What Are the New Updates?
With Monday’s announcement, Sensitive Content Control extends beyond the Explore page to other surfaces of Instagram: Search, Reels, Accounts You Might Follow, Hashtag Pages, and In-Feed Recommendations.
- Users can now moderate sensitive content using three modified options that attempt to protect the user by design. ‘Standard’, the default, prevents people from seeing some sensitive accounts and content. ‘More’ allows people to see more sensitive accounts and content, while ‘Less’ shows them fewer than ‘Standard’ does.
- Additionally, the technology used to enforce Instagram’s Recommendation Guidelines will now also be applied to the platform’s Search and Hashtag pages.
Why Is Instagram Developing More Content Moderation Tools?
Over the past year, Instagram came under criticism in the United States for its largely negative influence on adolescent users, a phenomenon it knew about but did little to resolve. After public outcry, the platform developed similar preventive measures to protect vulnerable users from harm. Amidst growing criticism of the platform’s overall content moderation policies in India and other countries, these updates may be merely individual band-aids for a larger, systemic issue.
- Facebook Knew Details About Instagram’s Impact On Mental Health Of Teenagers: Report
- Teenagers On Instagram May Soon Be Nudged To Look The Other Way On Harmful Content
- Instagram Announces Three New Safety Measures For Young Users, Including Limiting Advertisers’ Reach
- Instagram Beats Own Record On Number Of User Complaints In A Month