Children’s Data Update: From FTC and State Agency Enforcement to KOSA Advancing in the Senate
Authors
Myriah V. Jaworski, Ilya Smith, Chirag H. Patel
On July 7, the FTC and the Los Angeles District Attorney’s Office filed a complaint in the United States District Court for the Central District of California against the developer of an anonymous messaging app (NGL Labs, LLC) and its two co-founders, alleging that the app deceptively and unfairly marketed to and collected the personal information of children and teens (children’s data).
The writing has been on the wall for enforcement actions against apps and websites that target children in their marketing or data collection practices. The FTC identified children’s data as an enforcement priority for 2024, and in December 2023 issued a notice of proposed rulemaking to strengthen the Children’s Privacy Rule under the federal Children’s Online Privacy Protection Act (COPPA).
State legislatures and agencies have also made children’s data an enforcement priority. The FTC’s action against NGL Labs follows shortly after the May 2024 proposal of the California Children’s Data Privacy Act (CDPA), and the California Attorney General’s June 2024 $500,000 settlement of a claim that another app developer (Tilted Media, LLC) improperly collected and shared children’s data.
Below we discuss the FTC’s recent complaint against NGL Labs and the future of enforcement actions related to children’s data in California.
The NGL Labs Complaint Provides the FTC’s Enforcement Playbook for Apps Targeting Children and their Data
The FTC claims that NGL Labs: (1) actively marketed its service to children and teens, (2) “falsely claimed that its AI content moderation program filtered out cyberbullying,” and (3) sent “fake messages that appeared to come from real people and tricked users into signing up for their paid subscription by falsely promising that doing so would reveal the identity of the senders of messages.” FTC Chair Lina M. Khan said, “NGL marketed its app to kids and teens despite knowing that it was exposing them to cyberbullying and harassment.”
Based on these core claims, the FTC alleged deceptive practices in violation of the FTC Act, unfair marketing practices in violation of the FTC Act, violation of the Restore Online Shoppers’ Confidence Act (ROSCA), violation of the COPPA children’s privacy rule, and violations of the California Business and Professions Code § 17200.
The FTC’s proposed order, if confirmed by the district court, will require NGL to:
- Pay a $5 million monetary penalty
- “Implement a neutral age gate that prevents new and current users from accessing the app if they indicate that they are under 18”
- “Delete all personal information that is associated with the user of any messaging app unless the user indicates they are over 13 or NGL’s operators obtain parental consent to retain such data”
- Refrain from making similar misrepresentations about the senders of messages, its AI content moderation capabilities, or its subscription practices; and
- “Obtain express informed consent from consumers prior to billing them for a negative option subscription, provide a simple mechanism for canceling any negative option subscriptions, and send reminders to consumers about negative option charges.”
The NGL complaint provides the likely framework for future enforcement actions against apps and websites that market to children or collect children’s data. Moreover, especially in light of the FTC’s express prioritization of enforcement around the alleged misuse of children’s data, the complaint demonstrates the importance of exercising best practices when an app or website’s audience includes children.
The Future of California Enforcement Actions Involving Children’s Data
On June 18, the California Attorney General (AG) announced a $500,000 settlement with Tilted Media, the developer of the popular mobile app game “SpongeBob: Krusty Cook-Off,” based on a claim that the app developer violated the California Consumer Privacy Act (CCPA) and COPPA by collecting and sharing children’s data without parental consent.
Specifically, the California AG pointed to the age verification screen that “did not ask age in a neutral manner, meaning children were not encouraged to enter their age correctly,” and the incorrect configuration of third-party software development kits (SDKs) that allowed the collection and sale of kids’ data without parental consent.
The settlement also included injunctive relief requiring the app developer to “ensure legal data collection and disclosure, including obtaining parental consent and diligence in configuring third-party software in their mobile games.” Beyond the standard injunctive relief that provides for a blanket prohibition against violating statutes, the specific injunctive relief in the settlement additionally requires:
- In instances where the developer “sells or shares the personal information of children, provide a just-in-time notice explaining what information is collected, the purpose, if the information will be sold or shared, and link to the privacy policy explaining the parental or opt-in consent required.”
- Implementation of neutral age verification screens.
- Implementation of an “SDK governance framework to review the use and configuration of SDKs within its apps” to ensure compliance.
Notably, the AG recognized that the SDK misconfiguration that led to the collection of children’s data in violation of COPPA was inadvertent, but still held Tilted Media responsible. However well-intended, deploying misconfigured SDKs and other governance tools invites claims of noncompliance. While it may work for home appliances, a “set it and forget it” approach to children’s data compliance will not shield app developers from claims, nor will it offer a safe harbor.
California consumers, along with other states and enforcement agencies, are watching as California continues to lead the way stateside in expanding privacy protections. If passed, the California Children’s Data Privacy Act (CDPA) will provide even more enforcement opportunities for California regulators. The proposed legislation would amend the CCPA in a couple of key ways impacting the collection and use of children’s data. The current version of the CCPA requires “actual knowledge” that the consumer is under 13 or 16 years old. If a business knows the consumer is under 16, it must obtain affirmative authorization to sell or share that consumer’s personal information; if the consumer is under 13, the business must obtain affirmative authorization from the consumer’s parent or guardian. The proposed CDPA makes two key changes to these provisions.
First, the proposed CDPA removes the “actual knowledge” standard. Implicitly, this would require every app or website within the purview of the CCPA to affirmatively verify the age of every user to determine whether they are under 18. Second, the proposed CDPA expands the restrictions on (i) the selling or sharing of children’s information and (ii) the collection, use, and disclosure of that data. Accordingly, any form of collection, use, or disclosure of children’s data for which a company did not previously obtain affirmative consent (or affirmative parental consent) would require obtaining the appropriate level of consent. If the CDPA passes, companies subject to it will need to audit their customer rosters and current marketing plans to confirm that they have secured the appropriate consents to process personal information, and will need to implement other technical changes, such as consent management tools, to comply with the new requirements.
What’s Next for Children’s Privacy Initiatives on the Hill?
The flurry of increased state regulation aimed at protecting consumer privacy, and more recently children’s data and safety, created the conditions for the U.S. Senate to pass a pair of bills aimed at protecting children on the internet. On July 30, by a landslide vote of 91-3, the Senate passed the Kids Online Safety Act (KOSA) and what is being called COPPA 2.0. Together, the bills would allow both federal and state governments to investigate and take action against digital platform practices that exploit or harm children, and would increase the age of minors protected under the existing federal Children’s Online Privacy Protection Act of 1998 (COPPA). COPPA, a product of the government’s response to the risks of the early internet era, before the advent of today’s social media and gaming platforms, imposes requirements on operators of websites and online services directed to children under 13 years of age. The U.S. House of Representatives must now determine whether it will coalesce to take up parallel legislation in this area. In an election year, children’s and family advocates pushing for a federal solution on children’s internet safety and privacy are making their voices heard.
If passed, KOSA would be the first major reform of minors’ data regulation since the original COPPA was enacted in 1998. KOSA’s passage would bring significant changes to existing COPPA compliance, including a ban on targeted advertising to minors, restrictions on nonconsensual collection of minors’ data, and revised actual-knowledge standards. Other KOSA requirements, such as privacy-by-default settings and risk audits, will present businesses with additional compliance challenges.
As businesses evaluate their practices for collecting minors’ data and obtaining consent, they should factor in regulatory enforcement trends and KOSA’s prospective requirements, with the support of sophisticated outside counsel where appropriate.
This publication is intended for general informational purposes only and does not constitute legal advice or a solicitation to provide legal services. The information in this publication is not intended to create, and receipt of it does not constitute, a lawyer-client relationship. Readers should not act upon this information without seeking professional legal counsel. The views and opinions expressed herein represent those of the individual author only and are not necessarily the views of Clark Hill PLC. Although we attempt to ensure that postings on our website are complete, accurate, and up to date, we assume no responsibility for their completeness, accuracy, or timeliness.