Under the settlement, YouTube and Google will together pay $136 million to the Commission, by far the most significant COPPA penalty to date. Additionally, to settle allegations from the New York Attorney General, the companies are required to pay $34 million to the state of New York.
This landmark settlement comes barely six months after the FTC’s $5.7 million COPPA settlement with Musical.ly (now known as TikTok), the largest COPPA penalty at the time, and on the heels of the FTC’s announced plans to modernize COPPA to provide better protections for children online.
The settlement offers important lessons for operators of commercial websites, mobile apps or other online services that may collect or use children’s information.
Allegations Against YouTube and Google
In a complaint filed against YouTube and Google (the “complaint”), the FTC and the New York Attorney General alleged that the companies knowingly collected personal information from children under 13, used the information for behaviorally targeted advertising and failed to provide sufficient notice and obtain parental consent as required by COPPA.
COPPA applies to operators of commercial websites or other online services (such as mobile apps) that collect personal information from children under 13. Operators of online services must comply with COPPA if: (1) the service is directed to children under 13 or (2) the service is directed to a general audience and the operator has “actual knowledge” that it collects personal information from children under 13. Additionally, third parties (such as advertising networks) are subject to COPPA when they have actual knowledge that they are collecting personal information from users of an online service directed to children.
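For readers who find it helpful to see the applicability test laid out step by step, the sketch below summarizes it at a very high level. The function and field names are ours, chosen purely for illustration, and the logic is necessarily a simplification of the statute and rule; it is not legal advice.

```python
# Illustrative only: hypothetical names, simplified logic.
from dataclasses import dataclass


@dataclass
class OnlineService:
    directed_to_children_under_13: bool    # child-directed website or app
    general_audience: bool                 # directed to a general audience
    actual_knowledge_of_child_users: bool  # operator knows it collects data from users under 13


def coppa_applies_to_operator(service: OnlineService) -> bool:
    # COPPA applies if (1) the service is directed to children under 13, or
    # (2) it is a general-audience service and the operator has "actual knowledge"
    # that it collects personal information from children under 13.
    return service.directed_to_children_under_13 or (
        service.general_audience and service.actual_knowledge_of_child_users
    )


def coppa_applies_to_third_party(knows_service_is_child_directed: bool) -> bool:
    # Third parties (such as advertising networks) are covered when they have actual
    # knowledge that they collect personal information from users of a child-directed service.
    return knows_service_is_child_directed
```

As the complaint against YouTube and Google illustrates, it is the "actual knowledge" prong for third parties that swept the companies into COPPA's scope.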
The FTC and the New York Attorney General alleged that YouTube and Google are subject to COPPA because they have actual knowledge that they collect personal information from viewers of YouTube channels directed to children under 13. According to the complaint, many popular channels feature content clearly directed to children, such as animated nursery rhymes and videos related to children’s toys (e.g., Barbie and My Little Pony). Furthermore, the complaint explains that YouTube and Google have actual knowledge that these channels are directed to children in part because:
- YouTube employees review content to determine which videos should be featured on the YouTube Kids App, which is aimed at children ages 2–12.
- In a number of instances, YouTube channel owners have expressly told the company that their content is directed to children under 13.
- YouTube has highlighted the popularity of many of its channels with kids in presentations to toy companies and other kids’ brands.
The complaint explains that, because YouTube and Google have actual knowledge that they are collecting personal information from users of child-directed channels, they are subject to COPPA.
The FTC and New York Attorney General alleged that YouTube and Google violated COPPA by collecting persistent identifiers from children under 13 without sufficient notice and parental consent. COPPA defines “personal information” to include persistent identifiers that “can be used to recognize a user over time and across different” online services and prohibits the use of persistent identifiers for any purposes other than internal website operations without appropriate notice and parental consent. According to the complaint, YouTube and Google collected and used persistent identifiers for targeted advertising in violation of these provisions.
In addition to the $170 million in monetary penalties, the proposed settlement order requires YouTube and Google to stop using, disclosing, or benefiting from personal information previously collected from users of child-directed channels. Additionally, among other requirements, the companies must implement a system for channel owners to identify their child-directed content so that YouTube can ensure COPPA compliance. YouTube must also notify channel owners that their content may be subject to COPPA and provide annual COPPA training to employees who communicate with channel owners.
The FTC’s investigation into YouTube’s privacy practices was aided by a complaint filed with the Commission by a coalition of more than 20 child advocacy and privacy groups in 2018.
Lessons Learned From YouTube-Google Settlement
The Commission’s settlement with YouTube and Google highlights the importance of understanding what aspects of a website or other online service trigger COPPA’s requirements. To minimize regulatory scrutiny, operators of websites or other online services should analyze whether COPPA compliance is required and, if not, develop a strategy to ensure the service does not come within COPPA’s scope in the future.
To determine whether COPPA applies to your website or online service:
- Identify what data elements (if any) your online service collects from users, either actively or passively. Be sure to consider data elements collected by any third parties that interact with your service, such as plug-ins or advertising networks, as well as information your service may be collecting from users of other websites or online services. COPPA defines “personal information” broadly—it includes identifiers such as name, address and phone number, as well as IP address, device serial number, certain geolocation data and even just a photograph. If you collect such personal information, COPPA requirements may apply.
- Actively assess whether any portion of your online service is directed to children under 13. Even if you post a statement that the service is not intended for children, the Commission could still deem it to be directed to children based on certain content, such as animated characters or images of child celebrities. If your online service is directed to children, you must treat all users as children under COPPA. In other words, you must implement COPPA protections as to every user—you may not screen users for their age and adjust your practices based on their response.
COPPA provides a narrow exception to the requirement that child-directed services treat all users as children for services whose content is directed to children but that do not target children as their “primary audience” (for example, websites directed to a mixed audience of children, parents and teens). Such services may age-screen users and tailor their data collection practices accordingly, but they may not block children under 13. FTC guidance on this exception is limited, so be sure to consult legal counsel if you believe your online service may qualify.
- Consider whether you have actual knowledge that your online service is collecting data from children under 13. If you know you are collecting data from children under 13, you must comply with COPPA, regardless of whether the online service is directed to children. If you operate a general-audience online service that screens users for their age before allowing entry, you must either: (1) block users under 13 from entering or (2) comply with COPPA in collecting personal information from those users. The Commission recommends that such age screens be designed in a manner that does not encourage children to lie about their ages (a minimal sketch of such a neutral age screen follows this list).
- If you are collecting personal information from users of another online service, consider whether you have actual knowledge that the service is directed to children. If so, your online service will be deemed to be directed to children and COPPA requirements will apply. In determining whether you have actual knowledge that another online service is directed to children, consider direct communication you have had with the operator of the service and your review of content provided by the service. If the operator of the service has told you that the service targets children under 13, or if the service includes content clearly intended for children, you may be deemed to have actual knowledge that the service is child-directed.
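As referenced above, a neutral age screen asks for a date of birth without revealing the cutoff or prompting a “correct” answer. The sketch below is purely illustrative and assumes a hypothetical signup flow with names of our own choosing; it is not a substitute for the notice, consent and other obligations described in this alert.

```python
# Illustrative only: a "neutral" date-of-birth screen that does not hint at the cutoff.
from datetime import date
from typing import Optional

COPPA_AGE_THRESHOLD = 13


def age_on(birthdate: date, today: Optional[date] = None) -> int:
    today = today or date.today()
    # Subtract one year if this year's birthday has not yet occurred.
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )


def route_signup(birthdate: date) -> str:
    # The form simply asks for a full date of birth; it does not display the age
    # requirement or suggest an answer, so children are not encouraged to lie.
    if age_on(birthdate) < COPPA_AGE_THRESHOLD:
        # Per the options described above: either block the user entirely or route
        # them into a COPPA-compliant flow (notice and verifiable parental consent
        # before collecting personal information).
        return "block_or_parental_consent_flow"
    return "standard_flow"
```

In practice, the choice between blocking under-13 users and routing them into a parental-consent flow is a business and compliance decision that should be made with counsel.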
If COPPA applies to a website or other online service, the service must take specific steps to protect children’s privacy, including by posting a child-specific privacy policy, directly notifying parents about the service’s data collection practices, obtaining verifiable parental consent to collect, use and disclose children’s personal information, and implementing safeguards to protect the security of children’s personal information.