YouTube has released its first quarterly transparency report, which details the volume of content flagged or removed from the video-sharing platform between October and December 2017. According to the report, users flagged a total of 9.3 million unique videos. Users in India flagged more videos than users in any other country, followed by the United States and Brazil.
Interestingly, the report also highlights how ineffective human flagging can be. During this period, a total of 8.3 million videos were removed from the site. Out of 9.3 million flagged, 8.3 million removed doesn't seem too bad, right? Except that 6.6 million of the removed videos were caught by YouTube's algorithm; only the remaining 1.7 million were caught by humans. The report states that 75% of the videos taken down were never actually seen by the public. But since 25% of them were, there is a counter-argument that the public is doing much of the job YouTube should be doing, acting as moderators by flagging content where necessary.
Here’s a full breakdown of the types of videos taken down by YouTube:
- Sexual content: 30.01%
- Spam or misleading: 26.4%
- Hateful or abusive: 15.6%
- Violent or repulsive: 13.5%
- Harmful or dangerous acts: 7.6%
- Child abuse: 5.2%
- Promotes terrorism: 1.6%
YouTube had a bad 2017
The video-streaming service had a rough year in 2017: YouTube was boycotted by advertisers and roundly criticized after its ads appeared alongside hateful videos, including some uploaded by terrorist sympathizers and white supremacists. In response, the company announced tighter rules for its YouTube Partner Program last month. It also announced that it will begin manually curating Google Preferred, which lists its most popular content. This series of steps followed an incident in which Logan Paul, a popular YouTube star, filmed himself laughing next to a corpse in a Japanese forest known as a suicide hotspot. Paul later took the video down and apologized amid harsh criticism; however, he has since returned to the platform, now painting himself as a victim.
Change in policies
In February, YouTube said it was updating its policies to address issues like online harassment, self-harm, and abusive behaviour. In a blog post, CEO Susan Wojcicki laid out some goals for 2018 and expanded on how YouTube is working to tighten its policies for creators. She also said that some of the policies were simply a reaffirmation of safeguards already in place, such as preventing people from impersonating other channels or using misleading thumbnails.
The company also announced that it will now label videos uploaded by news broadcasters that receive some level of government or public funding. The label will link to the news publisher’s Wikipedia page, so that viewers can learn more about the broadcaster. The feature is currently live in the US only.