TikTok has found itself on the EU’s radar due to the app’s policies for underage users and its data transfers to China. Ireland’s Data Protection Commission, the EU’s lead data privacy regulator, launched two investigations into TikTok’s compliance with the European Union’s General Data Protection Regulation (GDPR) on September 14, the regulator announced on its website.
The first inquiry will examine TikTok’s processing of children’s data, and the second will look into the app’s transfer of personal data to China. If TikTok is found in violation of the GDPR in either inquiry, the regulator can impose a fine of up to 4% of the app’s global revenue.
An investigation from the EU into TikTok’s compliance with the GDPR could nudge regulators across the world to look into the app’s data privacy standards.
Children’s Data: What the EU wants to find out and what the GDPR says
In the first TikTok inquiry, Ireland’s Data Protection Commission will look into three areas, according to a press release from the commission’s official website:
- TikTok’s design and default settings for the processing of personal data of users under age 18
- Age verification measures implemented by the app for persons under 13
- Compliance with the GDPR’s transparency obligations when processing personal data of users under age 18
Article 8 of the GDPR states that companies can process children’s data only after prior, explicit consent has been obtained. Here are the specific rules companies processing such personal data need to adhere to:
- Age Limit: Processing of personal data is lawful only for users who are 16 years or older (member states may lower this threshold, but not below 13). To process the data of younger children, the service provider must seek consent from a parent or guardian.
- Verification of Consent: In the case of users younger than 16, service providers must make reasonable efforts to verify that consent is given by a parent or guardian.
- Nature of Consent: The consent sought, whether from young users or from the guardians of users below 16, must be specific, freely given, informed, explicit, and unambiguous.
Is TikTok’s data transfer to China in line with the GDPR?
Here are the key clauses of the GDPR that will be relevant in the inquiry into whether TikTok is transferring users’ personal data to China:
- Secure vs Non-Secure Countries: Under the GDPR, secure countries are those the EU has found, through an adequacy decision, to offer an adequate level of data protection. China is not among the countries covered by an adequacy decision.
- For Non-Secure Countries: Data controllers have to ensure that data is sufficiently protected by the recipient, typically through standard contractual clauses (SCCs).
- Consent Requirement: If a controller wants to transfer data to a non-secure country without SCCs, it would require the free and explicit consent of users, given after they have been informed of the possible risks of the transfer.
MediaNama has reached out to TikTok for comment regarding its data transfer protocols and will update the report when we receive a response.
Steps TikTok has taken for underage users so far
In the backdrop of increasing criticism about the app’s failure to institute child-safe policies, the company has made a slew of changes this year to default settings and features available to under-18 users:
- Direct Messages: 16- and 17-year-old users will have limited direct messaging on their accounts, TikTok announced in August. Access to direct messaging for users below 16 was entirely revoked in April 2020.
- Notifications: The app will send no notifications to under-16 users after 9 pm, and 16- and 17-year-old users will get notifications only until 10 pm, the company announced in August.
- Account Privacy: In January, TikTok decided to make all under-16 accounts private by default. The app’s ‘suggest your account to others’ feature was also turned off by default for under-16 users.
- Video Downloads: Videos created by users under 16 cannot be downloaded from the TikTok app, the company announced in January. For 16- and 17-year-old users, permission to download videos is set to ‘off’ by default, which they can choose to enable.
What India’s data protection bill says about children’s data
The draft Personal Data Protection (PDP) Bill, 2019 defines guardian data fiduciaries (GDFs) as entities that:
- Operate commercial websites or online services directed at children, or
- Process large volumes of personal data of children.
What are the responsibilities of such GDFs under the draft Personal Data Protection Bill?
- GDFs are prohibited from “profiling, tracking or behaviourally monitoring or direct targeted advertising at, children”. They cannot process children’s data in a way that causes “significant harm” to the child.
- GDFs are supposed to verify the age of their users, and obtain consent from a parent or guardian if the user is under 18.
- Failure to adhere to the provisions can attract a fine of Rs. 15 crore or 4% of the company’s global turnover, whichever is higher.
In a MediaNama discussion on children’s data and the Personal Data Protection Bill, we discussed how these fiduciaries will comply with this complex mandate. In another discussion, we also considered whether there should be a blanket age of consent for using online services.
- Personal Data Protection Bill, 2019: Protecting Children’s Data Online
- TikTok Announces More Controls For Underage Users Globally
- Italy Orders TikTok To Block Users Whose Age It Can’t Verify