The global country that is Facebook will hire 3,000 people by the end of this year to "review the millions of reports we get every week, and improve the process for doing it quickly", its founder (and head of state) Mark Zuckerberg said in a post today. Facebook reviews content according to its community guidelines, which disallow hate speech and child exploitation. "And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it - either because they're about to harm themselves, or because they're in danger from someone else."

Policing content via community is going to be tough

Facebook doesn't actively police content; it relies on its community to flag content, which is then reviewed. This is, in a manner of speaking, reactive policing. It essentially absolves Facebook of prior knowledge of content before it is posted, allowing it to play the role of an intermediary and a tech platform that merely lets others publish. If it were to screen content before publication, it would, like media publications, bear liability for that content.

Facebook's challenge with live streaming is far greater than policing hate speech: it includes live streams of suicide and murder, as well as reaching people considering committing suicide. Zuckerberg, in his post, says: "Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate." Earlier last…
