In an age where screens dominate, TikTok has quickly become the app of choice for the next generation of socializers. TikTok reports that nearly 20 million of its daily users in the United States are 14 years old or younger, accounting for roughly one third of the app’s users across all demographics. With this trend come questions about what the company is doing, or rather not doing, to protect its youngest consumers. The increasingly popular short-form video platform has faced growing criticism and controversy in recent years over allegations of child data privacy violations. Concerns have emerged among both the public and government agencies regarding how the platform collects, processes, and handles the personal information of its youngest users.
Just over a month ago, TikTok received a $370 million fine from the Irish Data Protection Commission (DPC) for violating children’s data privacy protections under the General Data Protection Regulation (GDPR). This decision followed an investigation the DPC launched into how TikTok processed children’s data between July 31, 2020 and December 31, 2020. The DPC found a laundry list of GDPR violations by TikTok, including breaches of the lawfulness, fairness, and transparency of data processing; data minimization; data security; responsibility of the controller; data protection by design and default; and the rights of minors to receive clear communications about data processing.
The DPC’s decision focused primarily on TikTok’s failure to implement appropriate protection measures during the registration and profile-making process for children, thus infringing on the GDPR’s fairness principle.
The settings that TikTok had implemented at the time of this investigation allowed child registrants to progress through the sign-up process in a way that categorized their accounts as public by default. This meant that any video posted by a child on the app was automatically shared with the public, with unrestricted comment and reposting features. The DPC reached this conclusion after finding that when a child registered for an account and reached the privacy settings section of the registration process, they were nudged to opt for a public account by a “Skip” button displayed on the right side of their screen. This influenced action would then make the child’s account public and open to interaction from any other user on the app.
The consequences of misleading features in TikTok’s registration process have posed a wide range of risks to underage users, as uncensored exposure to comments and interactions from strangers online opens the door to cyberbullying, online harassment, and access to inappropriate content. Considering how easily children can now download an app without parental consent, the responsibility to protect those children from these dangers falls on the social media platforms more than ever before. TikTok’s lack of transparency in its registration process failed to meet this standard.
Additionally, TikTok offers a feature for child users where their account may be paired with a parent or guardian’s account for an added layer of supervision and protection. However, the DPC found that TikTok did not have adequate verification methods in place to ensure that the accounts children were paired with actually belonged to the child’s parent or guardian. This permitted children’s accounts to be paired with unverified accounts, enabling the non-child user to weaken the child’s privacy protections by, for example, enabling direct messages for the child user.
TikTok has responded to this decision by proclaiming its commitment to engaging with regulators to ensure that all efforts are being made to keep underage users off its platform. Age assurance is an industry-wide issue that all social media platforms are challenged to address. However, few apps appeal to as broad an age range of users as TikTok, making the threat of unrestricted child engagement even more dire.
The substantial size of the DPC’s $370 million fine against TikTok is expected to serve as a powerful deterrent to all social media platforms. The European Union sent a strong message to TikTok that child safety violations will not be tolerated, while also drawing the attention of regulatory bodies worldwide. It will be interesting to see how other global leaders, such as the United States, move forward in placing checks on the data privacy practices of social media giants like TikTok.
Simon Cawley is currently a 2L at the University of North Carolina School of Law. Simon is originally from Mount Airy, North Carolina and attended UNC Chapel Hill for undergrad, where he received degrees in Public Policy and Global Studies. In addition to his involvement with JOLT, Simon is most interested in criminal law.