Taking a cue from Facebook and Twitter, YouTube is now updating its policies to prevent issues like online harassment, self-harm, and abusive behavior. In a blog post, CEO Susan Wojcicki charted out some goals for 2018, in which she also expanded on how YouTube is working to tighten its policies for creators. She also said that some policies are simply reaffirmations of rules already in place, like preventing people from impersonating other channels or using misleading thumbnails.
Wojcicki wrote that in the year ahead, the company will improve the enforcement of existing policies through a combination of technologies like machine learning and human review, for which it will bring together 10,000 employees to monitor content and manually flag any disputable content on the platform.
To work on the issue of hate speech, YouTube has been working with the Anti-Defamation League in the US, and on issues of self-harm, it is working with the National Suicide Prevention Lifeline.
Among her five goals, Wojcicki noted that the company is testing new ways for creators to earn revenue or raise money directly on their channels through donations, merchandise, and ticket sales, and that the company will expand the launch of its premium subscription service YouTube Red into new markets. For transparent communication with creators on the platform, YouTube has set up two official Twitter accounts – @YTCreators and @TeamYouTube – to keep creators informed, answer questions, and resolve creators’ issues. YouTube will also send developments and video tips via email to creators who sign up.
Against fake news and misinformation
In a separate blog post, YouTube announced that it will now label videos uploaded by news broadcasters that receive some level of government or public funding. The label will also link to the news publisher’s Wikipedia page, so that viewers can learn more about the broadcaster. The feature is currently available in the US only.
The video-streaming service had a rough 2017: YouTube was boycotted by advertisers and roundly criticized for its ads appearing alongside hateful videos, including those uploaded by terrorist sympathizers and white supremacists. In response, the company announced tighter rules for its YouTube Partner Program last month. The company also announced that it will begin manually curating Google Preferred, which lists its most popular content. This series of steps comes after Logan Paul, a popular YouTube star, shot a video of himself laughing next to a corpse in a Japanese forest known for being a suicide hotspot. Paul later took the video down and apologized amid harsh criticism; however, he has since returned and now paints himself as a victim.