In an insignificant blow to the company, Google agreed to pay a $170 million fine, or about 0.4% of its $39.3 billion Q1 revenue, to settle claims by the Federal Trade Commission (FTC) and the New York Attorney General that YouTube knowingly and illegally harvested personal information from children without their parents’ consent. Of this $170 million, $136 million will be paid to the FTC, and $34 million to the State of New York.

New York Attorney General Letitia James said that these companies “put children at risk and abused their power”. As per the FTC’s press release, this is “the largest amount the FTC has ever obtained in a COPPA [Children’s Online Privacy Protection Act] case since Congress enacted the law in 1998”. It certainly dwarfs the $5.7 million TikTok settlement, which was reached on similar grounds, but compare it to the $1.7 billion fine the European Union imposed on Google in an anti-trust probe that found the company had thwarted advertising rivals. Numerous privacy advocates and lawmakers have criticised the terms of the settlement as paltry.

This settlement is the result of the FTC’s investigation into YouTube, which the Commission launched in 2018 after receiving numerous complaints from consumer groups and privacy advocates, including the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy.

YouTube collected personal data of kids without parental consent

The FTC and NY AG’s complaint against Google and YouTube, which echoed a number of allegations that complainants to the FTC had made in 2018 and early 2019, alleged that:

  1. YouTube violated the COPPA Rule by collecting personal information, through cookies, from viewers of child-directed channels without first notifying their parents and obtaining their consent.
  2. Since some of YouTube’s channels are child-directed, they must comply with COPPA.
  3. YouTube knew that some of its channels were child-directed, marketed itself as a “top destination” for kids to makers of popular children’s products and brands such as Mattel and Hasbro, and served targeted advertisements on these channels. Yet the company told one advertising company that “it did not have users younger than 13 on its platform and therefore channels on its platform did not need to comply with COPPA”.
  4. How did YouTube know about child-directed content?
    • Channel owners themselves told YouTube and Google that their content was child-directed
    • YouTube’s own content rating system identified content as directed to children
    • YouTube manually reviewed content on its main platform to feature it in its YouTube Kids app

What does COPPA say?

  1. Notify and seek consent from parents about information practices: Child-directed websites and online services must notify parents about their information practices and obtain parental consent before collecting personal information from children under 13, including through cookies used for targeted advertising (a minimal illustrative sketch of such a consent gate follows this list).
  2. Third parties are also subject to COPPA when they know that they are collecting personal information from users of child-directed websites and online services.
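
To make that obligation concrete, here is a minimal, purely illustrative sketch in Python of what such a consent gate could look like in a site’s serving logic. Every name in it (serve_page, consented_users, and so on) is hypothetical; this is not how YouTube or any real ad stack is implemented.

```python
# Purely illustrative sketch of the COPPA obligation described above: no tracking
# cookies and no behavioural ad targeting on child-directed content until
# verifiable parental consent is on record. All names are hypothetical.

def serve_page(user_id: str, is_child_directed: bool, consented_users: set) -> dict:
    """Decide what a response to a page request may include."""
    if is_child_directed and user_id not in consented_users:
        # COPPA path: collect nothing beyond what is needed to serve the page,
        # and point the parent to the notice-and-consent flow.
        return {
            "set_tracking_cookie": False,
            "personalised_ads": False,
            "show_parental_notice": True,
        }
    # General-audience content, or verifiable parental consent already obtained.
    return {
        "set_tracking_cookie": True,
        "personalised_ads": True,
        "show_parental_notice": False,
    }


# Example: a child-directed page with no consent recorded yet.
print(serve_page("viewer-123", is_child_directed=True, consented_users=set()))
# {'set_tracking_cookie': False, 'personalised_ads': False, 'show_parental_notice': True}
```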

Terms of settlement

  1. $136 million to be paid to FTC, $34 million to the State of New York
  2. Google and YouTube are prohibited from violating the COPPA Rule
  3. Google and YouTube must:
    • Provide notice about their data collection practices and obtain verifiable parental consent before collecting personal information from children
    • Develop, implement and maintain a system through which channel owners can identify child-directed content on YouTube, so that YouTube can ensure compliance with COPPA
    • Notify channel owners that their child-directed content may be subject to the COPPA Rule’s obligations
    • Provide annual training about COPPA compliance to employees who deal with YouTube channel owners

YouTube sticks a band-aid on a gushing wound

As the news of this settlement broke, YouTube’s CEO Susan Wojcicki announced that the company would implement the following changes to its platform:

  1. New data collection practices for children’s content on YouTube: treat all viewers of children’s content as children, regardless of the actual age of the user, and make the following changes (a minimal illustrative sketch of this rule appears after this list):
    • Limit data collection and collect only what is necessary to support service operations
    • Stop serving personalised ads: the FTC had, in early July, proposed that YouTube could disable advertising on individual channels to ensure compliance with the law, instead of setting up a separate platform for children’s content.
    • Disable certain features such as comments and notifications
  2. Identify content for children via self-designation by creators, and use machine learning to find videos that target kids
  3. Improvements to YouTube Kids: Read about the problems with this app here.
  4. Mitigate effect on family and kids creators by working with them
  5. $100 million fund for original children’s content disbursed over three years
  6. Mandatory annual training for teams working in this area

These changes will be implemented in the next four months, as per YouTube.
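
To illustrate what “treat all viewers of children’s content as children” could mean in practice, here is a minimal Python sketch. The names (VideoPolicy, policy_for) are assumptions made for illustration; this is a sketch of the announced behaviour, not YouTube’s actual implementation.

```python
# Hypothetical sketch: if a video is designated (by its creator) or classified
# (by machine learning) as made for kids, the serving policy drops personalised
# ads, comments and notifications, and limits data collection, regardless of
# who is watching. All names are illustrative, not YouTube's real code.

from dataclasses import dataclass


@dataclass(frozen=True)
class VideoPolicy:
    collect_behavioural_data: bool
    personalised_ads: bool
    comments_enabled: bool
    notifications_enabled: bool


def policy_for(creator_says_made_for_kids: bool,
               classifier_says_made_for_kids: bool) -> VideoPolicy:
    """Either signal is enough to trigger the restricted policy."""
    made_for_kids = creator_says_made_for_kids or classifier_says_made_for_kids
    if made_for_kids:
        return VideoPolicy(
            collect_behavioural_data=False,   # only what service operations need
            personalised_ads=False,           # contextual ads at most
            comments_enabled=False,
            notifications_enabled=False,
        )
    return VideoPolicy(
        collect_behavioural_data=True,
        personalised_ads=True,
        comments_enabled=True,
        notifications_enabled=True,
    )


# Example: the creator did not self-designate, but the classifier flagged the video.
print(policy_for(creator_says_made_for_kids=False, classifier_says_made_for_kids=True))
```

The design point in the announcement is that the restrictions attach to the content rather than to the viewer’s account, which is why either the creator’s self-designation or the machine-learning signal is enough to trigger them.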

Things to consider: MediaNama’s take

There are a few things that need to be considered:

  • Violation not established: As this is a settlement, there is no admission of guilt by Google or YouTube over violation of the COPPA Rule. It is apparent that the companies were in violation, but it hasn’t been established unequivocally by a court. Perhaps that is why even the FTC’s press release continues to call it an “alleged violation”. And that is problematic.
  • Loose change isn’t a fine fine: The trivial amount, masquerading as a fine, extinguishes any hope of deterring Big Tech. The two FTC Commissioners who dissented from the settlement, Rebecca Kelly Slaughter and Rohit Chopra (“Commission brings down the hammer on small firms, while allowing large firms to get off easier”), also acknowledged the severe limitations of the settlement at large, and of the fine in particular. It was expected that the FTC would impose a penalty that was heavy by Google’s standards, not merely heavy by the FTC’s standards, as has happened here. Until and unless fines are big enough, like the EU’s $1.7 billion penalty, to hurt Big Tech’s revenues and shareholders, tangible change in favour of privacy protection will remain elusive.

  • Shifting onus to content creators dilutes YouTube’s responsibility: There is no provision in the settlement for penalising, either by YouTube or by the FTC, content creators who mis-designate child-directed content in order to monetise their channels through personalised ads. The burden of identifying child-directed content has shifted to YouTube creators, but in the absence of company oversight, workarounds and violations will be rampant.

Big Tech’s repeated failure to protect children online

YouTube

  • In June 2019, Harvard University researchers said they had found that the platform’s recommendation system and its default ‘Play next’ option, which respond to viewers’ viewing habits, were recommending videos of prepubescent children to paedophiles in order to keep them on the website longer, the WSJ reported.
  • In February 2019, YouTube disabled comments on most videos that feature minors after reports emerged that paedophiles had used comments to find, track and exploit children.

Facebook

  • In July 2019, a design flaw in Facebook’s Messenger Kids app allowed children to enter group chats with unapproved strangers, defeating the app’s entire premise that children should only be able to talk to users approved by their parents. The company acknowledged this flaw only at the end of August 2019.
  • In June 2019, it was revealed that, through its now-defunct Facebook Research app, Facebook collected data on 34,000 teenagers aged between 13 and 17; 4,300 of these teenagers were based in the US, while the remaining 29,700 were from India. In total, Facebook had clandestinely collected personal and sensitive data of 156,000 users, including all of a user’s phone and web activity across all apps: search history, browsing habits, and the content of unencrypted messages. Despite the backlash it has faced over Facebook Research from both the industry and lawmakers, Facebook refuses to learn: just two days ago, it launched its new market research app, Study, which is essentially the same app as Research, but only for Android devices.

Amazon

  • In June 2019, a lawsuit was filed in a federal court in Seattle, alleging that Amazon’s voice assistant, Alexa, records children who use it without their consent and saves their recordings permanently.

TikTok

  • In July 2019, the United Kingdom started investigating video-sharing app TikTok for how it handles children’s personal data, and whether it prioritises the safety of children on its platform.
  • In April 2019, TikTok was banned in India by the Madras High Court, which said it was spreading pornography, potentially exposing children to sexual predators, and adversely impacting the mental health of its users. The ban, however, was lifted on April 24 after the Supreme Court intervened.
  • In February 2019, TikTok was fined $5.7 million by the FTC for violating COPPA by illegally collecting personal information of children below 13 years of age without parental consent.