Fifteen human and digital rights organisations have released an updated version of the Santa Clara Principles. The Santa Clara Principles, first released in 2018, laid down minimum requirements of transparency and fairness in social media companies' content moderation processes. The new principles expand on those requirements and ask companies to consider human rights, cultural contexts, clarity, state involvement, and algorithmic fairness in enforcing their policies, while making disclosures about the same. The principles also include a few recommendations for governments on how they deal with social media companies.

According to Wired, the original principles were endorsed by various social media companies, including Facebook, YouTube, Twitter, Reddit, and GitHub, after their release. The new principles could lead to new transparency and content moderation policies at social media companies.

What are the new principles?

Human rights and due process

Companies should consider human rights and due process in their content moderation processes, use automated processes only when there is high confidence in their accuracy, and provide users with clear avenues to appeal moderation decisions and actions, the principles say. They should do so by:

- Clearly outlining to users how they integrate human rights considerations into their content moderation policies.
- Informing users of the extent to which they use automated processes in content moderation and how human rights considerations factor into those processes.
- Explaining how the company has considered the importance of due process in enforcing its rules and policies, and how it has maintained their fairness and integrity.

Clear rules and policies

Content moderation rules…
