The Federal Trade Commission (FTC), the US government agency responsible for consumer protection, is in the late stages of an investigation into YouTube for allegedly violating children’s privacy, the Washington Post reported on June 19. The probe has already prompted the company to re-evaluate some of its business practices, and may result in a fine.

Consumer complaints led to the investigation

After receiving numerous complaints from consumer groups and privacy advocates, the FTC launched its investigation in 2018. The crux of the complaints was that YouTube violates the Children’s Online Privacy Protection Act (COPPA) and the FTC’s COPPA Rule by improperly collecting children’s data without parental consent.

Complainants’ allegations:

  • YouTube is the most popular online platform amongst children, to the extent that 80% of children in the US between 6 and 12 years old use it daily. Despite knowing that children under 13 use YouTube, Google still collects and uses personal information from all YouTube users ‘without giving notice, or obtaining advanced, verifiable parental consent as required by COPPA’. In fact, many YouTube content creators ‘communicate the child-directed nature of their content’ in the “About” section of their channels.
  • YouTube does not have a separate privacy policy for children, nor does its general privacy policy even mention children. Google therefore uses information collected from all YouTube users, including children, to serve advertisements.
  • YouTube representatives have publicly recognised the child-directed content on YouTube, especially during the launch of the YouTube Kids app in 2015.
  • The fact that children’s content is available on both YouTube and YouTube Kids shows that Google knew a large portion of YouTube would remain child-directed. Citing a survey by Common Sense, the complainants contended that more children use YouTube than use the YouTube Kids app.
  • Google profits from YouTube’s kid-targeted programming. One of the most popular YouTube channels for children, Ryan ToysReview, brought in $11 million in ad revenue in 2017. YouTube’s share was 45% of that amount, or almost $5 million.

FTC may penalise YouTube heavily

A hefty penalty against YouTube, including a settlement that would force the company to change its practices to better protect kids, is reportedly being considered.

In February 2019, following a settlement between the FTC and TikTok, the FTC fined Musical.ly, which had merged with TikTok in 2018, $5.7 million for illegally collecting the names, email addresses, pictures and locations of kids under age 13; the violations predate the merger between the two companies. The penalty is the highest under COPPA since the law was put in place in 1998. In response, TikTok said that it would require new users to verify their age and prompt existing users for age verification too.

Recent steps taken by YouTube to protect children

The Washington Post reported that as the FTC investigation of YouTube has progressed, ‘company executives in recent months have accelerated internal discussion about broad changes to how the platform handles children’s videos’. These changes may include alterations to its algorithm for recommending and queuing up videos for users.

The Wall Street Journal reported on June 19 that YouTube is considering moving all children’s content to the YouTube Kids app to protect children. Some YouTube employees are also pushing to disable autoplay of the next video in children’s programming, the WSJ reported.

The investigation puts YouTube’s renewed focus on protecting children online into fresh perspective. Although the investigation centres on children’s privacy, YouTube has struggled more broadly to protect children online. Recent steps taken by the company to protect children, both as subjects and viewers of content, detailed in a June 3 blog post, include:

  • Video removal and channel termination: YouTube enforces a three-strike system for content that violates its Community Guidelines: a strike results in temporary suspension of the channel, and repeated strikes within a 90-day period result in permanent termination (see the first sketch after this list). Videos that compromise child safety are subject to this system. In the first quarter of 2019, 78,507 channels (2.8% of all channels removed) and 807,676 videos (9.7% of all videos removed) were taken down for compromising child safety.
  • Restrict live features: Younger minors will be barred from live streaming unless clearly accompanied by an adult. It’s not clear how YouTube intends to monitor live streams to determine whether a child is accompanied by an adult.
  • Disable comments on videos featuring minors: Comments are disabled on videos featuring minors.
  • Reduce recommendations: Videos that feature minors in risky situations and are borderline violations of the guidelines will be recommended only in a limited manner.
  • CSAI Match technology: YouTube’s proprietary technology that identifies known Child Sexual Abuse Imagery (CSAI) content online. Once a match of CSAI content is found, it is flagged to YouTube’s partners so that it can be reported according to local laws and regulations (see the second sketch after this list). CSAI Match is designed for video; Google’s Content Safety API offers machine-learning-based classification for images.
  • YouTube Kids: An app with content only for children under 13, with greater controls for parents. Since children under 13 are not allowed on YouTube itself, such accounts, when discovered, are terminated.
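The three-strike system described above is, at its core, a rolling 90-day counter. As a minimal sketch of that logic, assuming a simple per-channel strike log (the class, thresholds and method names below are illustrative assumptions, not YouTube’s actual code):

```python
from datetime import datetime, timedelta

STRIKE_WINDOW = timedelta(days=90)  # strikes age out after 90 days
MAX_STRIKES = 3                     # three strikes in the window: termination

class Channel:
    def __init__(self, name: str):
        self.name = name
        self.strikes: list[datetime] = []  # timestamps of guideline strikes
        self.terminated = False

    def add_strike(self, when: datetime) -> str:
        """Record a strike and apply the escalating penalty."""
        # Keep only strikes that fall within the rolling 90-day window.
        self.strikes = [t for t in self.strikes if when - t < STRIKE_WINDOW]
        self.strikes.append(when)
        if len(self.strikes) >= MAX_STRIKES:
            self.terminated = True
            return "permanent termination"
        return "temporary suspension"

# Example: three strikes within 90 days terminate the channel.
ch = Channel("example")
start = datetime(2019, 1, 1)
for offset in (0, 30, 60):
    print(ch.add_strike(start + timedelta(days=offset)))
# temporary suspension, temporary suspension, permanent termination
```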
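CSAI Match itself is proprietary and its internals are not public. Purely as an illustration of the general fingerprint-matching approach such systems take, here is a sketch that flags files whose fingerprints appear in a database of known abusive content; the plain SHA-256 hash, the database and the scan function are all simplifying assumptions (real systems use robust perceptual fingerprints that survive re-encoding and cropping):

```python
import hashlib

# Hypothetical database of fingerprints of known CSAI content,
# maintained by a trusted clearing house.
KNOWN_BAD_FINGERPRINTS: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: str) -> str:
    """Compute a content fingerprint (here simply SHA-256 of the bytes)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_upload(path: str) -> bool:
    """Flag an upload for human review and legal reporting if it matches."""
    if fingerprint(path) in KNOWN_BAD_FINGERPRINTS:
        print(f"{path}: match found; flagged for review and reporting")
        return True
    return False
```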

How effective have those steps been?

A number of problems highlighted by the YouTube investigation are common to most popular online services, including Instagram, Snapchat and Fortnite. While these apps are ostensibly meant for adults, and action is taken against users who are found to be underage, sidestepping the age restriction is literally child’s play.

MediaNama reviewed the YouTube Kids app. It is a signed-out service, that is, it does not generally allow the child to sign in. It uses ‘automated systems to choose content’ from all YouTube videos; the company does not manually review every video, which is why, in its own words, ‘your child may find something that you don’t want them to watch’.

Screenshot from YouTube Kids app

To unlock the app, a parent just needs to enter their birth year, a step that’s easy to get around. I entered an adult’s birth year and easily got in. I then deleted the app, tried the process again, and entered 2009; this time, instead of letting me in, the app showed me a prompt.
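In effect, the gate is a single self-reported number with no verification behind it. A minimal sketch of what such a check amounts to (the cut-off age and function name are assumptions for illustration, not YouTube’s actual logic):

```python
from datetime import date

ADULT_AGE = 18  # assumed threshold; the real cut-off is not documented

def unlock_for_parent(claimed_birth_year: int) -> bool:
    """Self-reported age gate: nothing verifies the year that is typed in."""
    return date.today().year - claimed_birth_year >= ADULT_AGE

print(unlock_for_parent(1980))  # True: typing an adult year unlocks the app
print(unlock_for_parent(2009))  # False: triggers the child-facing prompt
```

A child who types 1980 instead of 2009 clears this gate; nothing else stands between them and the parent-only flow.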

The app’s blocking and reporting policy states that ‘if you’re signed in and block or report a video, you will no longer see the video on the YouTube Kids app’. However, when I signed in and blocked a Smurfs video for a child profile named Bingo, the video still appeared within the child’s profile. Furthermore, when you use YouTube Kids as a signed-out service, blocked videos aren’t saved at all.

Screenshot from YouTube Kids app

Since it is a signed-out service, the app doesn’t collect personally identifiable data, but it collects all kinds of metadata and information about usage habits.

Screenshot from YouTube Kids app

To access app settings, the app asks you to answer a simple multiplication question, such as 5×8, 7×4 or 9×5, or to set up a passcode. If you set up a simple passcode, such as 1234 or 0000, the app doesn’t prompt you to change it for being too easy.

Screenshot from YouTube Kids app
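That settings gate is easy to sketch, and the sketch makes its weakness obvious. A minimal illustration, assuming a random one-digit multiplication question (the question pool and wording are assumptions, not YouTube’s code):

```python
import random

def parental_gate() -> bool:
    """Arithmetic 'parental gate': passable by anyone who knows times tables."""
    a, b = random.randint(2, 9), random.randint(2, 9)
    answer = input(f"To continue, solve: {a} x {b} = ")
    return answer.strip() == str(a * b)

if parental_gate():
    print("Settings unlocked")
else:
    print("Try again")
```

Any child who has memorised their times tables clears this gate as reliably as a parent does.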

If you forget the passcode, you can simply uninstall and reinstall the app, and we have already seen how easy it is to set up parental controls afresh. Recommended videos are, problematically, an opt-out feature. When the search feature is switched on, there are certain terms that don’t yield results.

Screenshots from YouTube Kids app

Considering that YouTube Kids is an app for children up to 13 years old, these are laughably easy hurdles to clear. A seven-year-old knows basic arithmetic, including multiplication and division. Even with an absolutely diligent parent, a child can easily answer the questions or change the passcode, and so change the parental settings. These steps by YouTube might conform to COPPA’s legal requirements, but they do not adhere to the spirit of the law. MediaNama has reached out to Google for comment.

YouTube’s continued struggles with privacy and content moderation

  • In June 2019, the company finally updated its policy and said that it would remove and prohibit hateful and supremacist content on its platform.
  • Earlier this month, Harvard University researchers found that the platform’s recommendation system and its default ‘Play next’ option, which respond to viewers’ viewing habits, were recommending videos of prepubescent children to paedophiles to keep them on the website longer, the WSJ reported.
  • At the Code Conference in Scottsdale, Arizona, last week, YouTube CEO Susan Wojcicki apologised to the LGBT community for failing to take more definitive action against conservative commentator Steven Crowder’s channel, which has nearly 4 million YouTube subscribers, the Verge reported. YouTube had received a lot of backlash when it tweeted that Crowder had not violated YouTube’s policies despite having used racist language and homophobic slurs to harass Carlos Maza, a Vox journalist, for nearly two years. Soon after the backlash, YouTube backtracked and suspended monetisation of Crowder’s channel. Even then, YouTube reiterated in its blog that, individually, Crowder’s flagged videos had not violated its community guidelines, but that the “widespread harm to the YouTube community resulting from the ongoing pattern of egregious behaviour” prompted it to take this action.
  • In February 2019, YouTube disabled comments on most videos that feature minors after reports emerged that paedophiles had used comments to find, track and exploit children.
  • In August 2018, YouTube terminated Alex Jones’s account on the platform for violating YouTube’s Community Guidelines. The company had, in July 2018, issued a strike against his channel Infowars for violating its child endangerment and hate speech policies.