Child Safety Online
Jan 15, 2020

Everything You Need to Know About YouTube’s Child Privacy Updates

As part of their settlement with the FTC, YouTube promised changes to protect kids’ online privacy.

YouTube made some big news back in September 2019 when the FTC slapped it with a $170 million fine. This record-setting amount was handed down because the video streaming platform had allegedly violated the Children’s Online Privacy Protection Act (COPPA). Though YouTube wasn’t the first company to receive a multi-million dollar fine from the FTC for violating kids’ online privacy, this was certainly the largest settlement to date—and it also came with a promise from the platform to make broad changes to the way they handle kids’ content.

At first, YouTube strongly considered moderating all content across the platform, but decided against it at the last minute. If they had gone this route, it would have fundamentally changed the nature of the platform: they would no longer be a “neutral” space where everyone could upload content. Rather, YouTube would have taken on more of a programming role, and with that comes increased scrutiny, liability and risk.

Rather than moderating all content, YouTube has opted for the changes outlined below. While they should improve child privacy on the platform, many of the new requirements shift the responsibility (and liability) to those creating the content. And, the changes bring lots of uncertainty for content creators who make videos for kids.

Scaling back on data collection

Beginning on Monday, January 13th, YouTube says it will limit the data collected from users who watch children’s videos, no matter what age they are. Under COPPA, websites and apps that serve children aren’t allowed to collect personal information from users under 13 without parental consent. Until the FTC investigation last year, YouTube claimed it was not a platform for kids by pointing to the 13+ age restrictions in its Terms of Service. But, the fact that kids were using the service in great numbers was the worst-kept secret in tech—and there was plenty of evidence to suggest that advertisers and content creators alike were targeting children. So, this new change acknowledges and protects the under-13 segment that makes up the audience for these videos. In addition, there are a few other community features tied to personal data and those will be turned off on children’s videos as well. They include commenting, live chat and the ability to save videos to a playlist.

It’s also significant that YouTube is limiting data collection on anyone who watches a kids’ video, even if the account belongs to an adult. In the past, YouTube operated under the assumption that everyone using their platform was an adult—and the FTC essentially let them. But now, they’ve had to swing to the other extreme, and they are putting all of the onus on content creators to define “children’s content.” If YouTube had taken on this responsibility itself, that would make them a programmer, fundamentally changing their business and subjecting them to new regulations.

YouTube says it will no longer allow targeted ads on children’s content—because that necessarily requires the kind of data collection that’s not allowed under COPPA without verifiable parental consent. In theory, you’ll no longer be served advertisements based on your browsing history or online activity when watching a kids’ video. YouTube says it will instead show advertisements based on the video’s context, which means advertisers and content creators will be navigating a brand new landscape. This will make it very difficult for major advertisers to target young audiences, and it will have a significant impact on content creators’ revenue. YouTube also says that users who watch a kids’ video will now be more likely to see recommendations for other kids’ content.

New rules for content creators

Way back in September, YouTube signaled that content creators on the platform would need to start designating their videos as “made for children” or “not made for children,” and a few months later it introduced a new setting that helps creators flag kids’ content. But YouTube is also using artificial intelligence to identify videos targeted at kids, and the platform can override a content creator’s designation if it sees fit.

Additionally, channel branding watermarks, cards on end screens of videos and channel memberships will all be disabled on kids’ content. And, creators will no longer be able to accept donations on children’s videos.

And it’s not just YouTube that content creators need to satisfy. The FTC could target creators directly if they fail to indicate that their content is made for children—to the tune of $42,530 per offending video.

Important changes, long overdue

We’ve discussed before how YouTube is the number one brand among children. And that’s YouTube, not YouTube Kids, which ranked as the 50th most popular brand in 2019. YouTube is also populated with channels that appeal to kids, like ChuChu TV and Cocomelon, which have 30.7 million and 69.8 million subscribers, respectively. On top of that, YouTube’s top-grossing channel for 2018 and 2019 was Ryan’s World (formerly Ryan ToysReview), which stars a seven-year-old boy. The channel brought in an estimated $22 million in 2018 and $26 million in 2019. Given all this, it’s clear why YouTube couldn’t get away with claiming they’re not a platform for kids. And since they are the most beloved platform among children, it’s that much more important that they comply with COPPA.

Other platforms and companies have been fined for violating COPPA, including video sharing app TikTok and mobile advertiser InMobi, but YouTube is by far the biggest to date. Until last year, it was simply doing what many other tech companies did: avoiding COPPA compliance by claiming it wasn’t a platform for kids.

The FTC fine itself was $170 million, which is a relatively minuscule amount compared to the $38.9 billion quarterly revenue earned by YouTube’s parent company Alphabet. But along with that fine came a promise from the platform to update its provisions for child online privacy.

Children rely on the rest of us to ensure that they aren’t taken advantage of online, and it’s crucial that legislation like COPPA is enforced. And, now that the most popular platform among children has been held to account, we’re hoping that others will begin to follow suit.

