Instagram users will now be able to delete up to 25 comments at once on their posts, and block or restrict multiple accounts at the same time, Facebook announced on March 12. The company said it had been testing these features, and that early feedback showed they helped people, especially those with larger followings, maintain a “positive environment.” Instagram will also allow users to manage mentions and tags, and will soon allow them to pin comments (more on that below). Last year, Instagram started notifying people “when their comment may be considered offensive before it’s posted”.

On iOS, users can tap the three-dot icon in the comments section of their post, select Manage Comments and then choose up to 25 comments to delete at once. Options to block accounts in bulk can be found under the More Options tab. On Android, users can press and hold on a comment, tap the dotted icon and select Block or Restrict. MediaNama was able to use the feature on both iOS and Android.

Source: Instagram

Manage mentions and tags in comments, captions or Stories: Users can now also manage who is allowed to mention or tag them on Instagram. They can choose whether they want everyone, only people they follow, or nobody to be able to tag or mention them in a comment, caption or Story. Users can access these options under privacy settings in the latest version of the app. Instagram says that tags and mentions can be used to “target or bully others”, which was the rationale behind rolling out this particular feature.

Source: Instagram

Soon, pinned comments on Instagram: The platform will “soon” also allow users to pin a “select number of comments” to the top of their comments thread. According to screenshots shared by Instagram, users will be able to pin 3 comments, and the platform will notify the users whose comments are pinned. It isn’t clear when this feature will roll out, and we couldn’t access it at the time of publishing.

Source: Instagram

Instagram took action on over a million pieces of content for bullying and harassment: Instagram parent Facebook said it took action against 1.5 million pieces of content on Instagram in both the fourth quarter of 2019 and the first quarter of 2020 because of “bullying and harassment violations”. However, most of this content was reported by users, and not proactively detected by Facebook, the company said. Instagram also claimed to have improved its text and image matching technology to help find more “suicide and self-injury content”. This, it claimed, increased the amount of such content it took action on by 40%, and raised its proactive detection rate by more than 12 points.