Processing Children’s Data Correctly: Takeaways from the Recent TikTok Decision
This decision will be of interest to controllers who process children’s personal data. It illustrates the importance of carrying out comprehensive data protection impact assessments (“DPIAs”), implementing appropriate technical and organisational measures (“TOMs”), and using clear language in customer communications, including clear directions during registration and in the terms and conditions governing the service.
1.1 Inquiry and Response
The inquiry looked at three areas of data processing by TikTok: platform settings for child users; age verification; and transparency information for children. In its draft decision, the DPC concluded that TikTok had infringed Articles 5(1)(c), 5(1)(f), 24(1), 25(1), 25(2), 12(1) and 13(1)(e) of the GDPR. Following objections by Concerned Supervisory Authorities in Italy and Berlin and the subsequent intervention of the European Data Protection Board (“EDPB”) under the Article 65 dispute resolution mechanism, the final decision (reference IN-21-9-1) (the “Decision”) includes findings of breach in relation to the GDPR Articles identified in the draft decision and an additional finding of breach of the fairness principle under Article 5(1)(a) GDPR.
Responding to the Decision, TikTok has noted that the DPC’s criticisms focused on features and settings that “were in place three years ago” and has since launched judicial review proceedings seeking various orders and declarations against the DPC and the Attorney General, claiming that the DPC’s decision and findings are flawed and should be set aside. Notably, the inquiry examined the processing of children’s data by TikTok between 31 July 2020 and 31 December 2020 (the “Relevant Period”), so the outcome of the judicial review proceedings will be keenly awaited.
1.2 Platform Settings for Child Users: Public-by-Default
A key contributor to many of the infringements of the GDPR identified by the DPC was its finding that, throughout the Relevant Period, the settings TikTok had implemented for registered EU TikTok users between the ages of 13 and 18 (“Child Users”) were public-by-default, which gave rise to a number of risks for Child Users.
While the Child User on the TikTok platform was, during the Relevant Period, prompted to select between ‘Go Private’ and remaining public, the DPC noted that the user could opt to ‘skip’ this choice and commented that the use of this language “would seem to incentivise or even trivialise the decision” to opt for a private account. (These instructions were further examined by the EDPB following the Berlin objection.)
Key Takeaway: Exercise additional care in designing platform settings for child users. Processing of children’s personal data must be considered in the context of Recital 38 of the GDPR, which provides that children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and of their rights in relation to the processing of personal data. Public-by-default settings have previously been considered by the DPC in its 2022 Instagram decision, in which there was also a finding of infringement as regards public-by-default settings for the accounts of child users. The DPC states in the Decision that it is not sustainable to suggest that the risks posed by public accounts are “inherent to Child Users on the internet more generally”. Risks as regards platform settings for child users should be assessed by reference to the GDPR and not the norms of the internet generally.
As the DPC’s Fundamentals for a Child-Oriented Approach to Data Processing (the “Fundamentals”) were published in December 2021, after the Relevant Period, the DPC agreed that it would be deleterious to fair procedures to determine TikTok’s compliance with the GDPR by reference to the guidance in the Fundamentals. For processing after that date, of course, the guidance in the Fundamentals, while not legally binding per se, will likely be a critical reference point for future inquiries.
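By way of illustration only, the sketch below shows how privacy-protective defaults for child accounts might be expressed in code. It is a minimal, hypothetical model (the type names, fields and age bands are assumptions for illustration, not TikTok’s actual implementation): the most restrictive values apply automatically, and nothing becomes visible to others unless the user makes a deliberate change.

```typescript
// Hypothetical account-settings model illustrating data protection by
// default (Article 25(2) GDPR). All names and fields are illustrative.

type Visibility = "private" | "public";

interface AccountSettings {
  visibility: Visibility;
  allowComments: boolean;  // a setting that "cascades" from visibility
  allowDownloads: boolean; // a setting that "cascades" from visibility
}

// Child Users (13 to 17) start with the most restrictive values: nothing
// is published or shared unless the user actively changes these settings.
function defaultSettings(age: number): AccountSettings {
  const isChildUser = age >= 13 && age < 18;
  return {
    visibility: "private", // private-by-default for every new account
    allowComments: !isChildUser,
    allowDownloads: !isChildUser,
  };
}

console.log(defaultSettings(13));
// -> { visibility: 'private', allowComments: false, allowDownloads: false }
```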
Findings one and two:
With a public account, anyone (on or off TikTok) could access, view and process the Child User’s social media content, leading “in the first instance to Child Users losing autonomy and control over their data and, in turn, they could become targets for bad actors, given the public nature of their use of the TikTok platform.” Further, the public-by-default setting had a number of “cascading implications for other platform settings for the Child User” across other features of the platform. The DPC also determined that, by expecting Child Users as young as 13 to have the technical knowledge to change the public-by-default setting, TikTok created conditions in which unnecessary publication of Child Users’ social media content could occur.
Accordingly, the DPC’s first finding is that TikTok failed to implement adequate TOMs to ensure that, by default, only personal data which were necessary for TikTok’s purpose of processing were processed. Further, in enabling the social media content of Child Users to become accessible (save where the user intervened) to an indefinite number of people, the DPC found that the processing was contrary to the principle of data protection by design and by default and to the principle of data minimisation. In its second finding, the DPC identified a failure to implement appropriate TOMs to ensure that processing complied with the GDPR, contrary to Article 24(1) GDPR: TikTok had not properly taken into account the rights and freedoms of Child Users when implementing measures to ensure compliance.
Notably, TikTok furnished to the inquiry a number of DPIAs covering the processing activities undertaken on the data of users under the age of 18. The DPC found that the measures implemented to address the risks identified in those DPIAs were not effective. Further, the DPIA in relation to Children’s Data and Age Appropriate Design did not identify the risk of children under 13 years of age accessing the platform, or the further risks that follow from such access.
Key Takeaway: Ensure DPIAs are complete, comprehensive and consider all risks, taking into account the rights and freedoms of child users. (The DPC has published a Guide to Data Protection Impact Assessments.)
Finding three:
The ‘Family Pairing’ platform setting allowed a non-Child User to pair their account with that of a Child User. This posed severe risks to the rights and freedoms of Child Users, as the setting allowed an unverified non-Child User to access and control an (intended) Child User’s platform settings. Accordingly, the DPC’s third finding is that this processing did not ensure appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate TOMs, and that TikTok failed to implement appropriate TOMs designed to implement “the integrity and confidentiality principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and to protect the rights of data subjects”.
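A safeguard of the kind the DPC found lacking can be expressed very simply. The sketch below is hypothetical (the verification checks and field names are assumptions, not a description of how Family Pairing actually works); the point is that control over a Child User’s settings is withheld until the requesting account has been verified and the link confirmed on the child’s own device.

```typescript
// Hypothetical pairing guard: control over a Child User's settings is
// granted only if the requester is verified as a parent/guardian AND the
// link has been confirmed on the child's device. Both checks are
// illustrative assumptions, not TikTok's actual implementation.

interface PairingRequest {
  requesterId: string;
  childId: string;
  requesterVerifiedAsGuardian: boolean; // e.g. via an identity-verification flow
  confirmedOnChildDevice: boolean;      // explicit confirmation by the child
}

function mayGrantPairingControl(req: PairingRequest): boolean {
  // An unverified non-Child User must never gain control of a child's settings.
  return req.requesterVerifiedAsGuardian && req.confirmedOnChildDevice;
}

// Usage: both safeguards must hold before any control features are enabled.
console.log(mayGrantPairingControl({
  requesterId: "a1",
  childId: "c1",
  requesterVerifiedAsGuardian: true,
  confirmedOnChildDevice: false,
})); // false
```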
Finding four:
Taking account of the implications of children under 13 years of age gaining access to the platform in circumstances where the settings were public-by-default, and of TikTok’s failure in its DPIA to address that specific risk, the DPC again identified a failure to implement appropriate TOMs to ensure such processing was in compliance with the GDPR, contrary to Article 24(1) GDPR.
Following objections by the Italian supervisory authority, the EDPB examined TikTok’s age verification measures further. It found that due to a lack of information, it was not in a position to conclude that the TOMs in respect of age verification processes undertaken by TikTok infringed Article 25(1) GDPR.
Key Takeaway: Periodically review TOMs implemented to comply with Article 25(1), taking into account changes in relevant risks over time and the development of the state of the art.
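For illustration, one common building block of such measures is a neutral age gate at registration. The sketch below is a generic example under assumed names and a placeholder minimum age; it is not a description of TikTok’s age verification processes.

```typescript
// Hypothetical neutral age gate: ask for a full date of birth without
// displaying the minimum age (showing it invites false entries), and
// reject under-age sign-ups rather than silently registering them.

const MINIMUM_AGE = 13; // placeholder; set per the service's terms

function ageFromDateOfBirth(dob: Date, now: Date = new Date()): number {
  const age = now.getFullYear() - dob.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

function mayRegister(dob: Date, now?: Date): boolean {
  return ageFromDateOfBirth(dob, now) >= MINIMUM_AGE;
}

// A 10-year-old applicant is rejected rather than registered.
console.log(mayRegister(new Date(2010, 5, 1), new Date(2020, 5, 1))); // false
```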
Finding five:
The DPC’s fifth finding is that TikTok failed to comply with its obligations under Article 13(1)(e) GDPR. Further, as it did not provide Child Users with information on the scope and consequences of public-by-default processing, TikTok failed to comply with Article 12(1) GDPR. Critically, TikTok’s privacy policy did not clearly state that the content of public-by-default accounts would be visible to non-registered users.
1.3 Deceptive Design Practices
Following an objection raised by the Berlin supervisory authority, the EDPB undertook an analysis of design practices implemented by TikTok in the context of two pop-up notifications that were shown to children aged 13-17, namely the Registration Pop-Up and Video Posting Pop-Up. The EDPB agreed with the objecting authority in finding that these pop-up notifications were “nudging the user to a certain decision” and “leading them ‘subconsciously to decisions violating their privacy interest’”. In so doing, TikTok infringed the principle of fairness pursuant to Article 5(1)(a) GDPR. Examples of nudging identified include:
- In the Registration Pop-Up, requiring users to make a positive choice to opt for a private account, since the ‘Skip’ option led to the account being public by default.
- Referring, in the Video Posting Pop-Up, to the possibility of changing preferences in the Privacy settings, without including a direct link to these settings.
- In the Video Posting Pop-Up, nudging users to click on the option to post the video publicly by presenting that option in bold, darker text on the right side (leading a majority of users to choose it through muscle memory), in contrast to the lighter ‘cancel’ button.
Key Takeaway: Options for Child Users in relation to sharing data should be provided in an objective and neutral way.
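As a sketch of what that can mean in practice, a choice dialog can be modelled so that neutrality is enforced structurally: equal visual weight for every option, no pre-selected answer, no ‘Skip’ escape that silently resolves to the less protective option, and a direct link to the relevant settings. All names below are hypothetical.

```typescript
// Hypothetical model of a neutral choice dialog. The types make the
// nudge-free properties explicit: no option may be visually emphasised,
// no answer is pre-selected, the dialog cannot be skipped, and a direct
// link to the relevant privacy settings is always present.

interface ChoiceOption {
  id: "private" | "public";
  label: string;
  emphasised: false; // equal visual weight for every option
}

interface NeutralChoiceDialog {
  prompt: string;
  options: ChoiceOption[];
  preselected: null;    // the user must make an explicit choice
  skippable: false;     // no 'Skip' that silently defaults to public
  settingsLink: string; // direct link to the settings being decided
}

const visibilityChoice: NeutralChoiceDialog = {
  prompt: "Who should be able to see your videos?",
  options: [
    { id: "private", label: "Only people you approve", emphasised: false },
    { id: "public", label: "Anyone, on or off the platform", emphasised: false },
  ],
  preselected: null,
  skippable: false,
  settingsLink: "/settings/privacy",
};

console.log(visibilityChoice.options.map((o) => o.label));
```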
Article provided by INPLP members: Rob Corbet (Arthur Cox, Ireland)
Dr. Tobias Höllwarth (Managing Director INPLP)