Google has agreed to pay $8.25 million to settle a class-action lawsuit accusing the company of unlawfully collecting personal data from children under the age of 13, marking another high-profile moment in the intensifying global debate over digital privacy and corporate accountability.
The proposed settlement, disclosed on January 15, 2026, brings to a close a legal battle that has stretched over roughly two and a half years. The case was filed by the parents of six children who had downloaded popular kids’ racing games, including Fun Kid Racing and GummyBear and Friends Speed Racing, from the Android Play Store.
Those games were part of Google’s Designed for Families program, which promotes apps aimed at children and requires developers to follow the federal Children’s Online Privacy Protection Act, known as COPPA. Under that law, companies are prohibited from knowingly collecting personal data from children under 13 without verifiable parental consent.
Allegations tied to AdMob and family apps
According to the lawsuit, Google’s AdMob software development kit continued to gather data from children’s devices even after the company had removed the offending apps from its store. The parents claimed Google portrayed the family-friendly apps as COPPA-compliant while still allowing sensitive information to be siphoned from young users in the background.
The complaint argued that Google knowingly violated the law and misled parents by presenting the Designed for Families label as a safeguard. In vivid language, the filing accused the company of quietly extracting personal information from children who were simply playing games marketed as safe. Google did not immediately comment publicly when news of the settlement emerged.
The $8.25 million agreement surfaced on the same day a federal judge approved a separate $30 million settlement involving Google’s YouTube division. That earlier case, first brought in 2019, alleged that YouTube illegally collected children’s data, including IP addresses, geolocation details, and device identifiers, and used that information for targeted advertising.
Together, the two settlements underscore the growing legal risks facing technology companies whose business models rely heavily on data-driven advertising, particularly when minors are involved.
Privacy scrutiny extends beyond consumers
The timing of the Google settlement coincided with broader warnings about data practices far beyond children’s apps. Also on January 15, 2026, law firm Lewis Silkin released its Workplace Data Privacy Update, pointing to a sharp rise in regulatory enforcement and oversight of employee monitoring worldwide.
The report highlighted stricter scrutiny across the European Union and noted sweeping legislative reforms underway in jurisdictions including New Zealand, Chile, and India. These changes are reshaping expectations around cybersecurity, data governance, and lawful data use, with particular attention on developments linked to the EU AI Act and the Digital Omnibus.
Lewis Silkin emphasized that employers are increasingly expected to follow principles of transparency, proportionality, and data minimization. From background checks to biometric systems, organizations are being urged to justify why data is collected, limit how much is gathered, and clearly explain practices to workers.
For families, the Google cases reinforce long-standing fears that children’s personal information can be harvested without meaningful consent. For employees, the parallel rise in workplace monitoring raises questions about how closely daily activities can be tracked. And for companies, the message from regulators is becoming harder to ignore.
As governments around the world tighten privacy laws and enforcement, January 15, 2026, stands out as a signal moment. Whether the data subject is a child tapping a screen to race a cartoon car or an employee logging into a corporate system, regulators are making clear that data protection is no longer a matter of policy statements alone, but of demonstrable compliance.
