02/04/2026 | Press release
We have fined MediaLab.AI, Inc. (MediaLab), owner of image sharing and hosting platform Imgur, £247,590 for failing to use children's personal information lawfully.
The penalty follows an investigation that found MediaLab allowed children to use Imgur without putting in place the basic safeguards required under UK data protection law.
We concluded MediaLab breached the law by:
- failing to implement any age assurance measures to establish the age of Imgur users; and
- failing to obtain parental consent where children under the age of 13 used the platform.
Personal information often drives the content children see online. MediaLab had no way of knowing the age of Imgur users, meaning that children were at risk of being exposed to harmful content on the platform, including content related to eating disorders, homophobia, antisemitism and images of a sexual or violent nature.
John Edwards, UK Information Commissioner, said:
"MediaLab failed in its legal duties to protect children, putting them at unnecessary risk. For years, it allowed children to use Imgur without any effective age checks, while collecting and processing their data, which in turn exposed them to harmful and inappropriate content.
"Age checks help organisations keep children's personal information safe and not used in ways that may harm them, such as by recommending age-inappropriate content.
"This fine is part of our wider work to drive improvements in how digital platforms use children's personal data. Ignoring the fact that children use these services, while processing their data unlawfully, is not acceptable. Companies that choose to ignore this can expect to face similar enforcement action."
Investigation findings and enforcement action
Our investigation found that between September 2021 and September 2025, MediaLab processed the personal information of children using Imgur in ways that breached the UK GDPR.
UK law says that online services using the personal information of children under 13 can only rely on the lawful basis of consent if consent is given by the child's parent or carer.
Imgur's terms stated that children under 13 could only use the platform with parental supervision. However, MediaLab did not implement any form of age assurance measures to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform.
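For illustration, the sketch below (in Python) shows the shape of the consent gate the law describes: a service relying on consent must treat an under-13 user's consent as invalid unless a parent or carer has given it. The function name, inputs and flow are hypothetical, and a real deployment would feed this check from a robust age assurance signal rather than a self-declared age.

    UK_DIGITAL_CONSENT_AGE = 13  # UK GDPR: under-13s need parent/carer consent

    def can_rely_on_consent(established_age: int, parental_consent_verified: bool) -> bool:
        """Hypothetical check: may the service lawfully rely on consent for this user?"""
        if established_age >= UK_DIGITAL_CONSENT_AGE:
            # 13 and over: the child can consent for themselves.
            return True
        # Under 13: consent is only valid if given or authorised by a parent or carer.
        return parental_consent_verified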
Under the law, we can issue fines of up to £17.5 million or 4% of an organisation's annual worldwide turnover, whichever is higher.
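As a purely illustrative piece of arithmetic (the turnover figure below is hypothetical), the statutory cap works out like this:

    def statutory_maximum_fine_gbp(annual_worldwide_turnover_gbp: int) -> int:
        """Higher of £17.5 million or 4% of annual worldwide turnover."""
        return max(17_500_000, annual_worldwide_turnover_gbp * 4 // 100)

    # Hypothetical example: £1 billion turnover gives a 4% cap of £40 million,
    # which exceeds the £17.5 million floor.
    assert statutory_maximum_fine_gbp(1_000_000_000) == 40_000_000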
In setting the £247,590 penalty amount, we took into consideration the number of children affected by this breach, the degree of potential harm caused, the duration of the contraventions, and the company's global turnover. We also considered MediaLab's acceptance of our provisional findings set out in the Notice of Intent issued in September 2025 and its commitment to address the infringements if access to the Imgur platform in the UK is restored in the future. If MediaLab resumes processing the personal data of children in the UK without implementing the measures it has committed to, we may take further regulatory action.
We are considering the redaction of personal and commercially confidential or sensitive information ahead of publishing the monetary penalty notice.
ICO's role and remit in protecting children online
We are the UK's independent regulator for data protection, and safeguarding children's privacy online is a priority.
UK data protection law says children should be given special treatment when it comes to their personal information. Our Children's code (also known as the Age Appropriate Design Code) translates the legal requirements into design standards for online services likely to be accessed by under-18s, helping organisations understand what is expected of them. That includes putting children's best interests at the forefront and giving them a high level of privacy by default.
In December 2025, we reported strong progress on our Children's code strategy, including a proactive supervision programme to drive improvements in how social media and video sharing platforms handle children's data.
Age assurance advice for online services
Age assurance tools act as a guardrail, preventing children from accessing online services they shouldn't be using or helping platforms tailor children's online experience accordingly.
These tools can form part of a proportionate approach to reducing the data risks children face online and supporting conformance with the Children's code.
To tailor an age-appropriate experience, organisations should match the age assurance method they use to the level of risk on their platform. They can either apply the full protections of the Children's code to all users or use proportionate age assurance tools to tailor safeguards by age.
Where children under a certain age are not allowed to use a service, organisations must focus on preventing access and enforce their minimum age requirements using robust age assurance methods.
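As a loose sketch of that proportionate approach, the Python mapping below pairs a risk tier with a progressively stronger age assurance method. The tier names and method labels are placeholders for illustration, not an ICO-prescribed list.

    from enum import Enum

    class PlatformRisk(Enum):
        LOW = "low"
        MEDIUM = "medium"
        HIGH = "high"

    # Placeholder methods: real choices depend on the service's data risks.
    AGE_ASSURANCE_BY_RISK = {
        PlatformRisk.LOW: "self-declaration backed by monitoring signals",
        PlatformRisk.MEDIUM: "age estimation, e.g. analysis of usage patterns",
        PlatformRisk.HIGH: "age verification against a trusted source, e.g. a verified ID",
    }

    def choose_age_assurance(risk: PlatformRisk) -> str:
        """Match the age assurance method to the level of risk on the platform."""
        return AGE_ASSURANCE_BY_RISK[risk]

    # A service that bars under-13s would enforce that minimum age with the
    # strongest tier rather than a self-declared date of birth.
    print(choose_age_assurance(PlatformRisk.HIGH))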
Further guidance is available in the ICO's age assurance opinion.