In today’s hyperconnected world, it’s not uncommon for children to be more tech-savvy than the adults around them. But behind every cute dance video or viral meme lies a sophisticated system of data collection and recommendation algorithms, many of which are now under scrutiny.
Recently, the UK’s Information Commissioner’s Office (ICO) launched formal investigations into three popular platforms: TikTok, Reddit, and Imgur. The concern is how they handle the personal data of UK children aged 13 to 17. For educators and parents, this is a timely reminder: understanding the digital environments children are immersed in is no longer optional; it’s essential.
TikTok: Personalised, But At What Cost?
TikTok has become a fixture in many young people’s daily lives. Its algorithm seems almost magical in the way it recommends content. But the ICO is now investigating how that “magic” works when it comes to children’s personal information.
Is TikTok collecting too much data? Are its recommendation systems steering children toward inappropriate or harmful content? These are some of the key questions the ICO aims to answer.
This isn’t the first time TikTok has come under fire. In 2023, it was fined £12.7 million for unlawfully processing children’s data. The platform says it has since improved its safeguards, but the ICO is making sure those promises hold up under inspection.
Reddit and Imgur: How Do They Know a User’s Age?
Unlike TikTok, Reddit and Imgur aren’t under investigation for what they show children, but for how they determine whether a user is a child in the first place.
Currently, many platforms rely on self-declared age checks, systems that are easy for children to bypass. The ICO is now investigating whether Reddit and Imgur have adequate age verification in place to prevent underage users from accessing potentially inappropriate content.
Reddit has acknowledged the issue and says it plans to implement stronger age verification. Imgur has not yet commented publicly.
The Children’s Code: Why It Matters
These investigations are part of a broader regulatory framework, the Children’s Code (also known as the Age Appropriate Design Code), which came into full force in 2021. The Code sets out 15 standards that online services must meet to ensure children’s personal data is protected.
At its heart, the Code is built on a simple principle: what’s best for the child must come first.
That means:
- Minimising data collection
- Avoiding manipulative design
- Clearly explaining how data is used
- Ensuring privacy settings are age-appropriate by default
As adults, we play a critical role in helping children navigate digital spaces safely. The ICO’s investigations are an important step toward greater accountability, but regulation alone isn’t enough.
Here are a few ways you can help:
Start the Conversation
Ask children what apps they use and how they feel about them. Make tech a shared topic, not a private one.
Teach Critical Thinking
Encourage young people to question why they’re being shown certain content. What’s the platform hoping they’ll do next: watch more, click something, buy something?
Stay Informed
Keep up with digital safety guidance from trusted sources like the ICO, NSPCC, and Childnet.
Use Tools and Settings
Explore built-in safety and privacy controls on apps your child uses. These can often be customised to offer better protection.
The internet can be a wonderful place for learning, creativity, and connection. But it must also be a safe and respectful space for children. The ICO’s investigations send a clear message to tech companies: if you want to benefit from children’s attention, you must also earn their trust.
Let’s continue working together, as parents, teachers, and guardians, to ensure that the digital world treats our children with the care, respect, and dignity they deserve.