The Children’s Online Privacy Protection Act, or COPPA, is a U.S. law that aims to protect the privacy and personally identifying information of children under the age of 13 who use online services. The law places rules on the use of data from and about children under 13 that are stricter than those governing data about older people, and offers parents the ability to monitor and approve some of the information their children share.
COPPA adds another distinct layer of privacy regulation that companies trafficking in personally identifying information need to deal with. Some sites attempt to avoid complying with COPPA by simply banning young users altogether; others may not consider themselves appealing to the under-13 set and therefore not subject to COPPA’s rules, but the FTC may take a different view based on a site’s content. While the law originated in the early days of the internet, it’s even more important in the modern age of social media and programmatic ads. And an FTC COPPA settlement with Google in 2019 resulted in major changes to how YouTube ads work, throwing the world of video creators into an uproar.
COPPA can be traced back to a 1996 complaint from the Center for Media Education about KidsCom, one of the internet’s first child-focused sites. The FTC investigated, and in 1997 issued its findings in a document that became known as the KidsCom Letter. KidsCom had been collecting data via registration forms, contest entries, and pen-pal programs, and the FTC found that the company had violated FTC rules in a number of ways when it came to handling that data. KidsCom was sharing that data with third parties (although the data was aggregated and not in the form of personally identifying information about individual children) and was doing so without informing the children or their parents.
KidsCom cleaned up its act in response to the letter, and remained a going concern until 2019. But in a subsequent report to Congress, the FTC presented evidence of a continued lack of compliance with privacy rules, and in 1998 Congress passed COPPA to give the agency more power to protect children’s privacy. While the FTC has updated its regulations to keep pace with changes to the online world, the basic framework of the law has remained more or less the same ever since.
Before thinking about complying with COPPA, you need to answer a question that turns out to be a little more complicated than it might seem at first: does the law apply to you? According to the FTC, you need to comply with COPPA’s regulations if:
The question of whether a site is “directed” at children under 13 is, of course, ambiguous. The FTC assesses sites based on a variety of criteria, including, in their words, “the subject matter of the site or service, visual and audio content, the use of animated characters or other child-oriented activities and incentives, the age of models, the presence of child celebrities or celebrities who appeal to kids, ads on the site or service that are directed to children, and other reliable evidence about the age of the actual or intended audience.” If under-13-year-olds aren’t your primary audience but your site still meets some of those criteria, you need to determine the age of individual users before collecting personal information from them. Some websites age-screen their users so they don’t have to deal with COPPA regulations; for instance, many social networks, whose business model revolves around collecting and monetizing user data, set 13 as a minimum age for registered users.
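The mechanics of an age screen are simple; what matters is how the question is asked. FTC guidance favors neutral screens that ask for a birth date rather than a leading “Are you over 13?” prompt. The sketch below is purely illustrative (the function names and the registration flow are invented for this example, not taken from any particular framework): it computes an age from a birth date and routes under-13 users away from any flow that collects personal information until parental consent is in place.

```typescript
// Illustrative only: a neutral age screen that asks for a birth date rather
// than a leading "Are you over 13?" question, then routes under-13 users
// away from any flow that collects personal information.

function ageInYears(birthDate: Date, today: Date = new Date()): number {
  const age = today.getFullYear() - birthDate.getFullYear();
  const hadBirthdayThisYear =
    today.getMonth() > birthDate.getMonth() ||
    (today.getMonth() === birthDate.getMonth() &&
      today.getDate() >= birthDate.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

// Hypothetical registration handler: the under-13 branch is where a site
// would either block sign-up or switch to a COPPA-compliant flow that
// collects no personal data until verifiable parental consent exists.
function handleRegistration(birthDate: Date): "standard-signup" | "parental-consent-flow" {
  return ageInYears(birthDate) < 13 ? "parental-consent-flow" : "standard-signup";
}
```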
The question of what constitutes “collecting personal information” is also a complex one. Obviously the sorts of information that KidsCom collected, like children’s names and addresses, would count. But it also includes categories you might not expect. For instance, while contextual ads, which serve advertising based on the content of the website they appear on, don’t collect personal information under COPPA’s definitions, behavioral ads, which track user behavior across websites and apps, do. Even if those behavioral ads are served by a third-party provider, if they’re on your website, you’re still responsible for them. Since behavioral ads are such a huge part of the internet ecosystem, this has enormous implications for child-directed websites.
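To make the distinction concrete, here is a minimal sketch using a hypothetical ad SDK (the AdRequest interface and its fields are invented for illustration; real ad networks expose their own child-directed or non-personalized flags). The point is simply that contextual targeting keys off the page, while behavioral targeting keys off a user profile, and the latter has to be switched off on child-directed content even when a third party serves the ads.

```typescript
// Hypothetical ad SDK -- "AdRequest" and its fields are invented for this
// sketch. Contextual signals come from the page itself; behavioral targeting
// draws on a cross-site user profile, which COPPA treats as personal
// information when the audience is children under 13.

interface AdRequest {
  slotId: string;
  pageTopics: string[];              // contextual signals derived from the page content
  allowBehavioralTargeting: boolean; // must be false on child-directed content
}

function buildAdRequest(slotId: string, pageTopics: string[], childDirected: boolean): AdRequest {
  return {
    slotId,
    pageTopics,
    // On child-directed pages, disable behavioral targeting even though the
    // ads are served by a third party -- the site operator is still
    // responsible for what those ads collect.
    allowBehavioralTargeting: !childDirected,
  };
}
```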
Finally, keep in mind that even if your site isn’t directed at children, you have to follow COPPA’s rules if you have actual knowledge that you collect the personal information of under-13-year-olds. For instance, Yelp, a site nobody would think was very interesting to children, got into trouble with the FTC because it asked for users’ ages when they created accounts and allowed kids under 13 to sign up without following COPPA regulations when it came to handling their data.
One point of interest on this subject: some have argued that this “actual knowledge” standard encourages businesses to look the other way when it comes to young children using their site; after all, if Yelp had never asked for users’ ages when they signed up, their business probably wouldn’t have suffered any ill effects and they would have been free from the burden of dealing with users that they knew for certain were under 13. But an alternative “constructive knowledge” standard, in which sites are responsible for analyzing their own user base to determine if they have possible under-13-year-olds’ information, may be beyond the capacities of many small businesses.
Let’s say your website is subject to COPPA rules under the definition in the previous section. What do you need to do? The National Law Review has a good breakdown of the basics. You’ll need to:
You’re also forbidden from conditioning a child’s participation in an online activity on the child providing more information than is reasonably necessary to participate in that activity. In other words, you can’t make a kid hand over their mailing address or demographic information just to play a fun video game.
As is the case with many regulations, these rules are ambiguous and can be difficult to apply, and are frankly beyond the in-house ability of many businesses to implement. Recognizing this, COPPA created a Safe Harbor Program, under which industry groups or other organizations can submit self-regulatory guidelines to the FTC that implement COPPA’s rules. Once approved — and there are currently seven approved organizations — these groups can in turn certify websites as compliant with the law.
The website for TRUSTe, one of the approved Safe Harbor orgs, has a good breakdown of what that certification entails. The organization first reviews your website or service and delivers a report on your compliance (or lack thereof) with the law. They then help you remediate any problems and, when that process is completed, certify your site as COPPA compliant. The organization will subsequently offer monitoring and guidance going forward. All this comes at a price, of course. And the system can also be gamed in various ways: in a particularly brazen case, a Swiss game developer called Miniclip falsely claimed to be certified by a Safe Harbor organization for seven years.
One of the biggest impacts COPPA has had in recent years came after a multimillion-dollar 2019 settlement between Google and the FTC over COPPA violations on Google’s YouTube site. In the wake of this agreement, Google shifted significant responsibility for COPPA compliance onto YouTube creators, who are now responsible (and legally liable) for determining whether their individual videos, or their channels as a whole, are directed at children as defined by COPPA, although Google will also use algorithmic techniques to seek out mislabeled videos.
This shift on Google’s part sent the community of YouTube’s children-directed video creators into turmoil. Videos marked as being for kids under 13 now cannot carry behaviorally targeted advertising, which cuts down ad revenue significantly, and cannot make use of YouTube features that require user login or user data, like comments, live chat, and the ability to save a video to watch later. While it’s still a potentially lucrative market, it’s one that’s now full of unexpected minefields.
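Creators normally set the audience designation in YouTube Studio, but it can also be managed programmatically. As a rough sketch only: the YouTube Data API v3 exposes a status.selfDeclaredMadeForKids field on video resources, and the example below assumes the googleapis Node.js client and an OAuth2 credential already authorized for the channel. It reads the existing status first so the update doesn’t overwrite other status fields.

```typescript
// A sketch (not an official recipe) of self-declaring a video as "made for
// kids" through the YouTube Data API v3, assuming the googleapis Node.js
// client and an already-authorized OAuth2 credential for the channel.
import { google } from "googleapis";

async function markVideoMadeForKids(auth: any, videoId: string): Promise<void> {
  const youtube = google.youtube({ version: "v3", auth });

  // Fetch the current status so the update doesn't clobber other status
  // fields (videos.update overwrites the mutable fields in the parts sent).
  const existing = await youtube.videos.list({ part: ["status"], id: [videoId] });
  const status = existing.data.items?.[0]?.status;
  if (!status) throw new Error(`Video ${videoId} not found`);

  await youtube.videos.update({
    part: ["status"],
    requestBody: {
      id: videoId,
      status: { ...status, selfDeclaredMadeForKids: true },
    },
  });
}
```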
As panic spread through the YouTube creator community after Google’s move, one number came up over and over again, repeated in video after video: $42,530. That was the amount, so the rumor went, that the FTC would fine anyone caught running afoul of COPPA’s rules. Since many creators are individual hobbyists who don’t make anywhere near that much in a year from their channels, it seemed like a terrifyingly high number.
But as vidIQ explains, these worries are overblown. Yes, $42,530 is the current maximum fine per violation of COPPA. But maximum is the key word here, and the law explicitly states that the violator’s revenue guides the actual assessed fines. As vidIQ points out, a company called I-Dressup was found to have failed to adequately protect the personal information of 250,000 children and was fined a grand total of $35,000. While big fines are possible for big companies — in the settlement that started this whole business, Google paid $170 million — small creators almost certainly do not have to worry about being bankrupted.
Other than serving up personalized ads to kids on YouTube, what other sorts of things will get you in trouble with the FTC over COPPA rules? A trip through the FTC website demonstrates some moves to avoid: