Children's Data Protection in 2026: GDPR, DSA, and Evolving Global Requirements
Children's data protection has moved from a niche compliance concern to a primary enforcement priority. In 2023 and 2024, several of the largest GDPR fines involved children's data. The EU Digital Services Act introduced new age assurance requirements. Multiple US states passed children's privacy laws. The trajectory is clear: stricter obligations, higher penalties, and technical requirements that many organizations are not yet meeting.
GDPR and Children's Data
GDPR provides specific protections for children's personal data in two places: Article 8 (consent for information society services) and Recital 38 (requiring "specific protection").
Article 8: Age of Digital Consent
For information society services offered directly to children, GDPR requires parental consent for children under 16 (with member states able to lower this to 13). In practice:
- Germany: 16
- UK (retained in UK GDPR): 13
- Spain: 14
- France: 15
- Most other EU states: 16
This creates a patchwork that multinational services must navigate. The practical implication: any service that might be used by children must implement age verification sufficient to identify users who require parental consent, and must obtain that consent before processing their data.
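In code, the consent gate typically reduces to a jurisdiction lookup. The sketch below illustrates the pattern in Python; the country-to-threshold mapping is a partial, illustrative subset that must come from legal review and be kept current:

```python
# Digital-consent ages under GDPR Article 8 national implementations.
# ILLUSTRATIVE subset only: the authoritative mapping must come from
# legal review and be re-verified as member states amend their laws.
DIGITAL_CONSENT_AGE = {
    "DE": 16,  # Germany
    "GB": 13,  # UK (UK GDPR)
    "ES": 14,  # Spain
    "FR": 15,  # France
}
GDPR_DEFAULT_AGE = 16  # applies where no national derogation exists


def requires_parental_consent(user_age: int, country_code: str) -> bool:
    """True if processing this user's data requires verified parental consent."""
    threshold = DIGITAL_CONSENT_AGE.get(country_code, GDPR_DEFAULT_AGE)
    return user_age < threshold
```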
The Consent Quality Problem
"Parental consent" is not satisfied by a checkbox. GDPR Article 8 requires "reasonable efforts" to verify parental consent "having regard to available technology." What constitutes reasonable effort has been contested.
The UK ICO's Age Appropriate Design Code (Children's Code) goes further: it requires that services default to high privacy settings for users who may be children, and prohibits "nudge techniques" that lead children to provide unnecessary data or consent.
Special Category Data
Health data, location data used to track movements, and data revealing communications are particularly sensitive when the subject is a child. Enforcement actions involving children have disproportionately focused on health data and location tracking.
Digital Services Act (DSA) Requirements
The DSA, which became applicable to designated very large platforms in August 2023 and fully applicable to all other services in February 2024, introduces several children-specific obligations for online platforms:
Prohibition on Advertising to Children
Article 28 DSA prohibits platforms from presenting advertising based on profiling to minors (users they are aware, with reasonable certainty, are under 18). This is a significant restriction: it covers any advertising that uses behavioral data, inferred interests, or demographic targeting.
Compliance requires: reliable age detection to identify minor users, separation of minor user profiles from targeted advertising systems, and monitoring for ad targeting that may reach minors.
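In practice, the separation is easiest to audit when it is a hard gate in the ad-serving path. A minimal sketch, assuming hypothetical `verified_adult` and `minor_signals` flags produced by the platform's age-assurance system:

```python
from dataclasses import dataclass


@dataclass
class AdAudienceStatus:
    user_id: str
    verified_adult: bool  # age assurance confirmed 18+
    minor_signals: bool   # any signal suggesting the user may be under 18


def eligible_for_profiled_ads(status: AdAudienceStatus) -> bool:
    # DSA Article 28: no profiling-based ads to known minors.
    # Fail closed: age-ambiguous users are treated as potential minors.
    return status.verified_adult and not status.minor_signals


def select_ads(status: AdAudienceStatus, profiled_ads: list, contextual_ads: list) -> list:
    """Route minors (and age-uncertain users) to contextual-only inventory."""
    return profiled_ads if eligible_for_profiled_ads(status) else contextual_ads
```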
Recommender System Obligations
Very large platforms must offer all users at least one recommender system option that is not based on profiling. Minor users specifically must be able to use the service with recommender systems that prioritize safety and age-appropriate content.
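A recency-only feed is the simplest way to satisfy the non-profiling option, since it consumes no user data at all. A sketch under that assumption (function and field names are hypothetical):

```python
from typing import Callable

Ranker = Callable[[list[dict]], list[dict]]


def rank_non_profiled(items: list[dict]) -> list[dict]:
    """Recency-only ranking: uses no behavioral data, inferred interests,
    or demographics, so it is not based on profiling."""
    return sorted(items, key=lambda item: item["published_at"], reverse=True)


def pick_ranker(user_is_minor: bool, profiled_ranker: Ranker | None) -> Ranker:
    """Minors always get the non-profiled ranker; adults get the profiled
    one only where the platform supplies it and the user has not opted out."""
    if user_is_minor or profiled_ranker is None:
        return rank_non_profiled
    return profiled_ranker
```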
Risk Assessment for Minors
Very large online platforms (VLOPs) must conduct annual systemic risk assessments including specific assessment of "any actual or foreseeable negative effects on the exercise of the rights of the child." Risk mitigation measures must be documented and audit-ready.
GDPR Enforcement Actions Involving Children
Several significant enforcement actions in 2023–2025 involved children's data:
TikTok: The Irish DPC issued a €345 million fine in 2023 for multiple violations including exposing children's accounts to public visibility by default, using dark patterns to push children toward public settings, and failing to verify ages adequately. The fine represented approximately 1.9% of TikTok's 2022 global turnover — a signal that DPAs are willing to use the full Article 83 range for children's violations.
Meta/Instagram: A €405 million fine from the Irish DPC in 2022 for Instagram's handling of children's accounts, including public-by-default settings for accounts of 13- to 17-year-olds and display of phone numbers and email addresses in public profiles.
YouTube: The US FTC (outside GDPR) imposed a $170 million settlement with Google/YouTube in 2019 for COPPA violations — collecting personal information from children without parental consent. Under similar facts in the EU, GDPR fines could be substantially higher.
The pattern: default settings that expose children, inadequate age verification, and using children's data for behavioral advertising are the primary triggers.
US State Children's Privacy Laws
The US federal government has not passed comprehensive children's privacy legislation beyond COPPA (1998, amended 2013). But several states have filled the gap:
California (AADC/COPPA 2.0 elements): California Age-Appropriate Design Code Act (2022) — modeled on the UK Children's Code — requires covered businesses to complete a Data Protection Impact Assessment before offering products to children, default to high privacy settings, and avoid profiling children unless strictly necessary. The Act was slated to take effect in July 2024, but enforcement has been stalled by the constitutional challenge in NetChoice v. Bonta.
Texas, Florida, and others: A wave of social media restrictions targeting minors (age verification for platform access, parental consent requirements) passed in 2023–2024. The constitutionality of some provisions is being challenged, but the legislative direction is clear.
Federal (KOSA): The Kids Online Safety Act has passed the Senate and faces ongoing House consideration. If enacted, it would impose safety duties on platforms and a duty to mitigate harms to minors including mental health, addiction, and privacy harms.
What Organizations Need to Do
Age Detection and Verification
Neither "we don't target children" nor "our ToS says 18+" satisfies the DSA and GDPR requirements. Organizations must implement reasonable technical age detection where children might use their services.
Approaches range from light (self-declaration with privacy-by-default protections for unverified users) to robust (ID document verification, credit card verification, age estimation from biometric data). The appropriate level depends on the risk profile of the service.
For most consumer internet services, the minimum acceptable approach is: detect signals that suggest a user may be a minor, apply child-appropriate defaults when uncertainty exists, and implement more robust verification for high-risk features (social graph exposure, advertising, recommendation systems).
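Put concretely, this tiered approach can be expressed as a small state machine. The signals, statuses, and risk tiers below are illustrative assumptions, not a regulatory standard:

```python
from enum import Enum


class AgeStatus(Enum):
    VERIFIED_ADULT = "verified_adult"  # e.g. ID or payment-card check passed
    LIKELY_MINOR = "likely_minor"      # declared under 18, or minor signals present
    UNKNOWN = "unknown"                # self-declared adult, unverified


def classify(declared_age: int | None, minor_signals: list[str]) -> AgeStatus:
    """Illustrative: real systems combine declared age, behavioral signals,
    age estimation, and hard verification. Only verification (not modeled
    here) ever yields VERIFIED_ADULT."""
    if minor_signals or (declared_age is not None and declared_age < 18):
        return AgeStatus.LIKELY_MINOR
    return AgeStatus.UNKNOWN


def feature_allowed(high_risk: bool, status: AgeStatus) -> bool:
    """High-risk features (ads, social-graph exposure, recommenders) demand
    verified adulthood; everything else runs with child-appropriate defaults."""
    return status is AgeStatus.VERIFIED_ADULT if high_risk else True
```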
Data Minimization for Child Users
Where children's data is processed, data minimization principles apply with extra force. Collect only what is strictly necessary for the service. Do not retain for analytics beyond immediate need. Do not use for advertising or profiling.
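One way to operationalize this is an allow-list of purposes plus a shortened retention window for child accounts. A minimal sketch; the purposes and windows shown are assumed values, to be set by your own retention policy:

```python
from datetime import datetime, timedelta, timezone

# Allow-list (not block-list): purposes absent from this set fail closed.
CHILD_ALLOWED_PURPOSES = {"service_delivery", "safety", "legal_obligation"}

# Illustrative windows; actual values belong in your retention schedule.
RETENTION = {"adult": timedelta(days=365), "child": timedelta(days=30)}


def may_process(purpose: str, is_child: bool) -> bool:
    """Advertising, profiling, and open-ended analytics are simply not on
    the child allow-list, so they are refused by default."""
    return purpose in CHILD_ALLOWED_PURPOSES if is_child else True


def is_expired(collected_at: datetime, is_child: bool) -> bool:
    window = RETENTION["child" if is_child else "adult"]
    return datetime.now(timezone.utc) - collected_at > window
```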
Privacy-by-Default Settings
DSA Article 28 and the UK Children's Code both require child users (or potential child users) to be protected by default. This means: most restrictive privacy settings as the default, no dark patterns that push toward less private configurations, and communications tools set to friends-only by default.
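As a sketch, the defaults can live in a single fail-closed settings template (field names are illustrative):

```python
# Most-restrictive defaults, applied whenever a user is or may be a child.
CHILD_SAFE_DEFAULTS = {
    "profile_visibility": "private",
    "direct_messages": "friends_only",
    "location_sharing": False,
    "show_contact_details": False,
    "personalized_ads": False,
}


def initial_settings(maybe_child: bool, adult_defaults: dict) -> dict:
    """Fail closed: any uncertainty about age yields the child-safe set.
    Loosening a setting must be an explicit, un-nudged user action."""
    return dict(CHILD_SAFE_DEFAULTS if maybe_child else adult_defaults)
```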
Data Subject Rights for Children
Children can exercise GDPR rights directly (subject to age/capacity) or through their legal guardians. Access requests, erasure requests, and rectification requests must be honored. For platforms used by children, DSAR processes must accommodate requests made by guardians.
Special consideration for erasure: data collected from users who were minors at the time of collection faces heightened erasure expectations. Multiple DPAs have indicated that data collected from children during their minority should generally be erased upon request regardless of the legal basis used for original collection.
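In DSAR tooling, that heightened expectation can be implemented as a triage step that defaults records collected during minority to erasure rather than to a balancing test. A simplified sketch (it ignores the 29 February edge case and assumes a `collected_on` field):

```python
from datetime import date


def collected_during_minority(birth_date: date, collected_on: date) -> bool:
    """Was the requester under 18 when this record was collected?
    (Simplified: ignores the 29 February edge case.)"""
    eighteenth = birth_date.replace(year=birth_date.year + 18)
    return collected_on < eighteenth


def triage_erasure(records: list[dict], birth_date: date):
    """Records collected during minority are queued for erasure by default;
    the rest get the standard Article 17 assessment."""
    erase_by_default, standard_review = [], []
    for record in records:
        if collected_during_minority(birth_date, record["collected_on"]):
            erase_by_default.append(record)
        else:
            standard_review.append(record)
    return erase_by_default, standard_review
```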
Processing Agreements and Sub-processors
If you use third-party services that process data of your child users — analytics platforms, advertising networks, CDNs — those sub-processors must be bound by equivalent protections. Passing children's data through unrestricted advertising networks is not compatible with GDPR or DSA requirements, regardless of what your own privacy policy says.
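A simple guard is a sub-processor register with an explicit child-data flag, so that data flows to a vendor only when equivalent contractual protections are on file. Vendor names and fields below are hypothetical:

```python
# Hypothetical register: child_data_terms means the vendor is contractually
# bound to protections equivalent to your own (DPA clauses, no ad profiling).
SUBPROCESSORS = {
    "analytics-vendor": {"child_data_terms": True},
    "ad-network": {"child_data_terms": False},
    "cdn-provider": {"child_data_terms": True},
}


def may_receive_child_data(vendor: str) -> bool:
    """Fail closed: unknown vendors, or vendors without equivalent
    child-data terms, never receive child users' data."""
    entry = SUBPROCESSORS.get(vendor)
    return bool(entry and entry["child_data_terms"])
```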
The Anonymization Angle
For analytics, product research, and AI training, anonymizing children's data before use is not just good privacy practice — it is the compliance-minimum approach that avoids the consent and legal basis complications.
Children's data is inherently special category in practice (if not always in legal classification): re-identification risks are higher because children's behavioral patterns are more distinctive and their public profiles are typically smaller, reducing the population of potential matches. Anonymization thresholds should be higher for datasets containing children than for adult-only datasets.
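For example, a release gate built on k-anonymity might simply require a larger k when a dataset contains children. The sketch below uses assumed k values; k-anonymity alone does not guarantee GDPR-grade anonymization, but it illustrates the raised threshold:

```python
from collections import Counter


def is_k_anonymous(records: list[dict], quasi_identifiers: list[str], k: int) -> bool:
    """True if every combination of quasi-identifier values occurs in at
    least k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())


# Assumed values, not regulatory thresholds: children's data gets a higher bar.
K_ADULT_ONLY, K_WITH_CHILDREN = 10, 50


def passes_release_gate(records, quasi_identifiers, contains_children: bool) -> bool:
    k = K_WITH_CHILDREN if contains_children else K_ADULT_ONLY
    return is_k_anonymous(records, quasi_identifiers, k)
```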
Looking Ahead
The regulatory trajectory for children's data protection is toward stricter requirements and higher penalties. The DSA systemic risk assessments will begin producing audit findings in 2025–2026. US state laws are proliferating. GDPR enforcement on children-related violations has demonstrated willingness to issue fines at the upper range of Article 83.
Organizations that have not separately assessed their children's data processing — and treated it as distinct from their general personal data processing — face increasing exposure. The "we comply with GDPR generally" answer is not sufficient when the question is specifically about children.