ICO - Information Commissioner's Office

03/11/2026 | Press release | Distributed by Public on 03/11/2026 18:03

Open letter issued to tech firms to strengthen age checks and protect children’s data

We have today published an open letter to social media and video-sharing platforms operating in the UK, calling on them to strengthen age assurance measures so young children can't access services that are not designed for them.

The open letter sets out our expectation that platforms with a minimum age must move beyond relying on children to self-declare their ages, a check that children can easily bypass.

Instead, platforms should make use of the viable technology that is now readily available to enforce their own minimum ages and prevent underage children from accessing their services.

We have also written directly to platforms, starting with TikTok, Snapchat, Facebook, Instagram, YouTube and X to ask them to demonstrate how their age assurance measures meet these expectations. 

"Our message to platforms is simple: act today to keep children safe online. There's now modern technology at your fingertips, so there is no excuse not to have effective age assurance measures in place.

"Platforms need to be ready to demonstrate what they're doing to keep underage children out and safeguard those children that are old enough to access their services."

- Paul Arnold, ICO Chief Executive Officer

This call to action forms part of the next phase of our Children's code strategy, which has already made significant progress in improving children's privacy standards across social media and video-sharing platforms, but we want companies to go further on age assurance. Platforms must be able to tell which users are children so they can benefit from the protections they're entitled to.

We recently fined Reddit £14.47 million and MediaLab (owner of Imgur) £247,590 for failing to implement age-assurance measures and for processing children's personal information unlawfully in a way that potentially exposed children to inappropriate, harmful content.

We also remain concerned about how social media and video-sharing platforms process children's data to generate recommendations, especially when this leads to harmful content or increases the risk of addiction to platforms. In March 2025, we opened an investigation into TikTok's processing of children's data in its recommender systems. In December 2025, we requested information from Meta about the processing of children's data on Instagram's recommender systems.

Protecting children online requires coordinated action across the regulatory system. We continue to work closely with Ofcom, which enforces the Online Safety Act.

Both regulators will publish an updated joint statement in March 2026, which outlines the main areas of interaction between online safety and data protection as they relate to age assurance.

We also support Ofcom's call today for platforms to enforce minimum ages and make sure their algorithms are configured to prevent children from encountering harmful content.
