TikTok hit with complaint from child privacy advocates who say it’s still flouting the law

A group of child privacy advocates has filed a complaint against TikTok with the Federal Trade Commission, claiming the video app violated an agreement to protect children on its platform, The New York Times reported. 

TikTok paid a $5.7 million fine to the FTC in February 2019 over allegations that an earlier version of its app, called Musical.ly, violated the Children’s Online Privacy Protection Act (COPPA) by allowing users younger than 13 to sign up without parental consent. Under the terms of the agreement, TikTok also agreed to remove all videos previously uploaded by anyone under the age of 13.

But the coalition of privacy advocates, led by the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood, found that videos from 2016 posted by children under 13 were still on the app and that the company does not do enough to obtain parental consent for new users.

“TikTok fails to make reasonable efforts to ensure that a parent of a child receives direct notice of its practices regarding the collection, use, or disclosure of personal information,” the complaint states. “Indeed, TikTok does not at any point contact the child’s parents to give them notice and does not even ask for contact information for the child’s parents. Thus, TikTok has no means of obtaining verifiable parental consent before any collection, use, or disclosure of children’s personal information as required by the consent decree and COPPA rule.”

Even its service specifically designed for children under 13 is problematic, the complaint states; TikTok for Younger Users, which limits what users can post, simply “incentivizes children to lie about their age.” According to the complaint, a child who registers for the “Younger” version could cancel that account and then re-register for a standard TikTok account on the same device just by changing their birthdate.

That problem is not necessarily unique to TikTok, nor is it the only platform to run into trouble over how it handles content from young children. In September, the FTC fined YouTube $170 million over allegations it collected children’s data in violation of COPPA. As a result, YouTube introduced a new labeling system for creators who focus on children’s content.

“Congress empowered the FTC to ensure that kids have online protections, yet here is another case of a digital giant deliberately violating the law,” Jeff Chester, executive director of the Center for Digital Democracy, said in a statement. “The failure of the FTC to ensure that TikTok protects the privacy of millions of children, including through its use of predictive AI applications, is another reason why there are questions whether the agency can be trusted to effectively oversee the kids’ data law.”

When it was rolled into TikTok in August 2018, Musical.ly had 100 million monthly active users. TikTok now has more than 500 million users worldwide, many of whom are children. In April, the analytics platform Sensor Tower reported that TikTok had been downloaded 2 billion times globally.

Also in April, TikTok introduced a new feature called Family Pairing, which lets parents link their kids’ accounts to their own, giving them the ability to disable direct messages, turn on restricted content mode, and set screen time limits.

“We take privacy seriously and are committed to helping ensure that TikTok continues to be a safe and entertaining community for our users,” a TikTok spokesperson said in an email to us.
