On 19 June 2025, the Data (Use and Access) Act 2025 (known during its passage as the DUA Bill, or DUAB) officially received Royal Assent, marking a significant update to the UK’s data protection framework. While it doesn’t entirely rewrite the UK GDPR, it does bring a number of key changes that educational institutions need to be aware of.

Let’s break it down and explore what this means for colleges, schools, and educational staff in a practical, easy-to-digest way.

Subject Access Requests: The Clock Pauses for Clarification

We know that managing Subject Access Requests (SARs) can be time-consuming, especially when students or staff request a mountain of data.

Here’s what’s changed:

  • If someone submits a SAR and you need clarification, the response deadline is paused until they respond.
  • You can reasonably ask for clarification if you hold a large amount of data about the person.
  • Once clarification is received, you must still perform a reasonable and proportionate search for the requested data.

Takeaway for educators: If a student or staff member submits a vague or broad SAR, you now have the legal room to ask for specifics before the response clock starts running. This helps you stay compliant and focused on what’s actually needed.
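For data leads who track SAR deadlines in a spreadsheet or script, the clock-pause works roughly like this. A minimal sketch, not legal advice: the 30-day default is illustrative only, and the function and field names are ours, not from the Act.

```python
from datetime import date, timedelta
from typing import Optional

def sar_deadline(received: date,
                 clarification_sent: Optional[date] = None,
                 clarification_answered: Optional[date] = None,
                 period_days: int = 30) -> date:
    """Sketch of the clock-pause: the response period runs from receipt
    of the SAR, but days spent waiting for the requester to clarify
    do not count against it."""
    deadline = received + timedelta(days=period_days)
    if clarification_sent and clarification_answered:
        # The deadline slips back by however long clarification took.
        paused = (clarification_answered - clarification_sent).days
        deadline += timedelta(days=paused)
    return deadline
```

So a request received on 1 September that would otherwise fall due on 1 October becomes due on 8 October if the requester took a week to answer your clarification.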

Complaints Must Be Handled Internally First

The new law gives organisations a chance to fix things before complaints go to the ICO (Information Commissioner’s Office).

Now, if someone raises a data protection issue:

  • You must acknowledge their complaint within 30 days
  • Take appropriate steps to investigate and address it
  • And inform them of the outcome

What this means: Colleges should ensure there’s a clear internal complaint process. This is a great time to check that your staff know how to escalate issues internally, before they become a bigger (and more public) problem.

Automated Decision-Making & AI in Education

With AI becoming a growing tool in education (think automated exam marking or application screening), the DUAB clarifies how these systems should be handled:

Automated decisions that have legal or significant effects (e.g. admissions, grading) are permitted, as long as:

  • The individual is told their data will be used this way,
  • They are given the option to contest the decision, and
  • Human intervention is available.

Heads up: If your college uses AI to grade tests (like automated multiple-choice scoring), or any automated student decision-making systems, let your data protection lead know. You’ll need to be clear with students about their rights and the role AI plays.
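As a thought experiment, the three conditions can be modelled as a simple record attached to each automated decision. Everything here (the class name, the fields) is hypothetical, just to show the shape of the safeguards:

```python
from dataclasses import dataclass

@dataclass
class AutomatedDecision:
    """Hypothetical record pairing an automated outcome with the
    safeguards described above: inform, allow contest, allow human review."""
    student_id: str
    outcome: str
    student_informed: bool = False   # was the student told AI was used?
    contested: bool = False          # has the student challenged the outcome?
    human_reviewed: bool = False     # has a member of staff checked it?

    def contest(self) -> None:
        # Contesting routes the decision to a human for review.
        self.contested = True

    def safe_to_rely_on(self) -> bool:
        # The outcome should only stand if the student was informed and
        # any contested decision has had human intervention.
        return self.student_informed and (not self.contested or self.human_reviewed)
```

The useful point is the last method: a contested decision is not relied on until a human has looked at it.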

Cookies: A Bit More Flexibility

Let’s talk cookies. No, not the chocolate chip kind.

DUAB introduces exemptions to the consent requirement for certain non-essential cookies. That includes cookies used for:

  • Statistical analysis
  • Website performance
  • User preferences
  • Service improvements

Good news for IT teams: If your college’s website uses non-essential cookies only for these limited purposes, you may no longer need to collect prior consent, though users should still be told what the cookies do and given a simple way to object.
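In practice, a cookie audit might sort each cookie’s purpose into an “exempt” or a “still needs consent” bucket. A rough sketch; the purpose labels are made up for illustration, and the statutory wording should be checked before relying on any exemption:

```python
# Purposes the article lists as newly exempt from the consent requirement.
EXEMPT_PURPOSES = {
    "statistical-analysis",
    "website-performance",
    "user-preferences",
    "service-improvement",
}

def needs_consent(purpose: str) -> bool:
    """Non-essential cookies outside the exempted purposes
    (e.g. advertising or cross-site tracking) still require consent."""
    return purpose not in EXEMPT_PURPOSES
```

So `needs_consent("advertising")` stays `True`, while a pure performance cookie falls under the exemption.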

Legitimate Interests: A Clearer Path Forward

There’s now a statutory list of “recognised legitimate interests”, making life a bit easier for data protection leads. These include:

  • Safeguarding children and vulnerable individuals
  • Preventing and detecting crime, including fraud
  • Responding to emergencies
  • National security

If your college is processing data for one of these recognised purposes, you no longer need to complete a Legitimate Interests Assessment (LIA). The Act also confirms that purposes such as network and system security, intra-group data sharing (for multi-site colleges), and direct marketing may qualify as ordinary legitimate interests, though these still warrant the usual balancing test.

Bottom line: This streamlines processes and reduces paperwork when handling security or safeguarding-related data.

What’s Next?

While these updates are now law, the government hasn’t yet published education-specific guidance. In the meantime:

  • Review your current policies, especially around SARs, complaints, and automated decision-making
  • Update your internal processes and staff training to reflect the new rights and obligations
  • Monitor for upcoming sector-specific guidance

If you’d like to explore more detailed legal interpretations or dive deeper into how DUAB changes things across different industries, click here for further information.

Let’s continue to champion responsible data use in our schools and colleges. These changes aim to strike a better balance between protecting individuals’ rights and reducing unnecessary red tape – and that’s something we can all get behind.

Picture this: a student who left school years ago reaches out with a request. They’d like their personal data removed from your records. Not just from the admissions system, but everything: emails, reports, maybe even that long-forgotten incident log from Year 9. They’ve moved on and want their past to do the same.

It’s a request that might feel particularly familiar as we approach the end of the academic year. Leavers are saying their goodbyes, staff are wrapping up reports, and across schools and universities, inboxes are starting to fill with data requests of all kinds. One that stands out: the Right to Be Forgotten.

Under the UK GDPR, individuals have the “Right to Be Forgotten,” meaning they can request the deletion of their personal data when it’s no longer necessary. On the surface, it’s empowering. People should have a say in how long their data is kept, and it makes sense that someone might want to shed parts of their past as they grow older.

But in education, where records aren’t just about individuals but about safeguarding, progress, accountability, and shared experiences, it’s not always a straightforward decision.

When Forgetting Isn’t That Simple

Educators are natural record-keepers. Whether it’s for safeguarding, special educational needs, or even just tracking progress over time, information often needs to be preserved for years, sometimes decades. Deleting a student’s file isn’t like clearing out an old inbox: it could mean losing context, closing the door on long-term support, or even creating gaps that matter later on.

Take safeguarding, for instance. A school may need to retain certain records well into adulthood because they could one day form part of a disclosure or investigation. It’s not about holding on to the past unnecessarily; it’s about being prepared for the future.

So when a deletion request lands, it’s not simply a matter of yes or no. It’s about finding the balance between respecting someone’s right to move on, and the very real need to preserve records for legal, ethical, or pastoral reasons.

The Growing Digital Trail

Another challenge? The sheer amount of digital space education now occupies. Emails, learning platforms, CCTV, behaviour tracking software: it all adds up. And each of those systems might hold fragments of someone’s personal story.

In practice, this means schools and universities need to have a clear idea of where data lives, how long it needs to be kept, and when it can safely be deleted. That’s easier said than done, especially when older systems weren’t designed with the right to erasure in mind.

Still, we’re seeing encouraging progress. A number of schools have created systems for reviewing and safely deleting old data, ensuring they don’t keep more than they need to. Some universities have designed workflows that let them honour deletion requests without compromising core records like transcripts or misconduct reports. What’s key is having a clear, fair process and communicating it openly.

Conversations, Not Just Policies

The most effective responses to RTBF requests tend to start with a conversation, not a policy document. The best examples we’ve seen involve staff explaining clearly why some records must be kept, while also doing everything they can to remove unnecessary data. It’s about approaching each request with care and respect, rather than defensiveness or bureaucracy.

That human touch really matters. After all, people often ask to be “forgotten” not just out of privacy concerns, but because they want to close a chapter, sometimes after difficult experiences. Even when we can’t grant a full erasure, we can usually find ways to meet the spirit of the request with kindness and clarity.

So, Where Does That Leave Us?

The Right to Be Forgotten is both powerful and complex. It asks important questions about memory, responsibility, and care. And while it might feel tricky to navigate, it’s also a chance for schools and universities to reflect on the data they hold and why they hold it.

Are we keeping things because we need to? Or because we always have? Are we making space for people to move on? Or unintentionally anchoring them to a version of themselves they’ve long outgrown?

There are no perfect answers, and each case will be a little different. But what’s clear is that thoughtful, transparent processes and a bit of empathy go a long way. Forgetting, when it’s done well, can be an act of respect. And remembering, when necessary, can be an act of care.

If nothing else, it’s a good reminder that every bit of data we collect today might become part of someone’s story tomorrow. How we hold that story, and when we let it go, matters more than ever.

In a quiet primary school on the edge of town, a well-meaning teaching assistant printed a spreadsheet containing student health details. He left it in the staffroom just for a moment while making a cup of tea. When he returned, the document had vanished. It later turned up, crumpled and smudged with jam, in the Year 2 book corner.

No hackers, no malware. Just a simple, very human mistake.

When we hear “data breach,” it’s easy to picture a dramatic cyberattack. Dark rooms filled with glowing screens, stern-faced experts speaking in code, maybe even a ransom note demanding cryptocurrency. But in schools, breaches often take a much quieter, more familiar form. They’re the result of everyday errors and oversights, the kind we don’t always see as serious, even when they are.

In educational settings, personal data flows constantly. From pupil records and safeguarding notes to staff files, medical information, and assessment data – it’s all there, quietly underpinning the day-to-day rhythm of school life. And while schools have come a long way in formalising policies and investing in secure systems, there’s one area where vulnerability lingers: the forgotten, overlooked breaches.

Sometimes, it’s about destruction rather than disclosure. Take the school secretary who accidentally deletes a folder containing behavioural reports during a routine system tidy-up. Or the teacher who loses a handwritten safeguarding log in a spring clean, thinking it was scrap paper. It’s easy to assume that if no one else accessed the data, there’s no breach. But loss and destruction, intentional or not, can have a serious impact on the individuals that data relates to.

Alteration is another blind spot. It might seem harmless when a colleague “tweaks” a student’s medical note for clarity or edits a pastoral record without documenting the change. But even small, unauthorised edits can lead to confusion or worse, misinformed decisions. One school found itself in difficulty after a student’s allergy record was edited to say, “mild intolerance” rather than “anaphylaxis.” A well-meaning change, but a deeply risky one.

And then there’s unauthorised access, often accidental but nonetheless serious. A casual conversation in the corridor mentioning a child’s social services involvement. A screen left unlocked during a lunchtime rush. A well-intentioned parent volunteer glimpsing confidential student data while helping out in the office. No malicious intent, but the boundaries blur all the same.

Loss of data is just as often physical as it is digital. Laptops go missing, USB drives slip between sofa cushions, old filing cabinets are emptied into bins without checking the contents. One secondary school discovered that its entire archive of paper attendance logs had been mistakenly shredded during a storage room clear-out. Years of data, gone in an afternoon.

These are not stories of negligence or malice. They’re stories of busy people juggling multiple responsibilities, making small decisions in a fast-paced environment. But these small moments, multiplied across a school, can create quiet cracks in data protection, cracks where trust can quietly seep away.

So, how do we raise awareness without creating fear?

First, we shift the way we talk about data breaches. Instead of painting them as rare and technical, we frame them as something all of us have a role in preventing. This isn’t about panicking over paperwork, it’s about embedding a culture of care, where data is treated with the same respect as student safety or wellbeing.

Telling real, relatable stories helps. A GDPR workshop might prompt yawns, but a discussion about how a misdirected email affected a vulnerable student lands very differently. It’s not about ticking boxes; it’s about protecting relationships.

Second, we make it safe for staff to report near misses. Too often, people worry about being blamed or embarrassed. But a culture of openness turns mistakes into opportunities to learn and improve. A teacher who admits to accidentally sharing the wrong document helps the whole school get better.

And finally, we remember that leadership sets the tone. When senior staff prioritise data protection not just as compliance but as a matter of integrity, it filters down. When they model good habits, such as locking screens, questioning poor practices, and inviting feedback, it becomes part of the school’s DNA.

In schools, trust is everything. Families trust staff to keep their children safe, not just physically, but emotionally and digitally too. That trust isn’t only built through grand gestures; it’s upheld in the smallest details: how we store, share, and respect the information we hold.

Because in the end, a breach doesn’t have to be loud to be damaging. It can be a file lost, a conversation overheard, a folder carelessly edited. And protecting against those quiet breaches is one of the responsibilities we all share.

“Miss, what’s GDPR? Is it a new exam board?”

That question might have raised a chuckle back in 2018, when GDPR first arrived with its bundle of acronyms, policy updates, and general sense of urgency. But here we are in 2025, seven years on, and the UK GDPR is no longer the new kid on the block. It’s settled in, taken its place alongside safeguarding, SEND, and all the other core responsibilities that shape everyday life in education.

The big question is: have we settled in with it?

Looking Back (and Forward)

When the GDPR first landed, there was a flurry of activity: privacy notices were redrafted, training sessions booked, and data audits launched with admirable enthusiasm. Since then, many schools, colleges and universities have found their rhythm with data protection. It’s become part of the background noise of school life: necessary, not always exciting, but undoubtedly important.

Still, while the panic may have subsided, that doesn’t mean the pressure has. In fact, the expectations have only grown. With the increasing use of digital tools in classrooms, the rise of online learning, and greater public awareness of privacy issues, it’s no longer enough to simply tick the GDPR box and carry on. The way we manage personal data has become part of how our communities judge our professionalism and trustworthiness.

And rightly so. In education, we hold some of the most sensitive information people will ever share: from learning needs and medical information to safeguarding records and home circumstances. Protecting that data is part of the duty of care we owe to every pupil, student, parent, and colleague.

In recent years, we’ve seen both missteps and good practice across the sector. One well-known case involved a phishing email that led to a serious breach at a large multi-academy trust. Despite having the right policies on paper, the real problem turned out to be a lack of practical staff training. It was a simple mistake, but one with far-reaching consequences.

On the other hand, many institutions have shown what good looks like. Several universities, for example, now include GDPR awareness as part of induction for all staff, and make regular updates part of their professional development cycle. One even ran a student-led privacy campaign, helping young people understand their own rights while building a culture of shared responsibility. The message was clear: data protection isn’t just admin, it’s part of how we show care and respect.

What GDPR Means in 2025

We’re now working in a digital-first education landscape. Learning platforms, behaviour tracking systems, AI-driven learning tools: they all collect and process data in increasingly complex ways. GDPR hasn’t stood still either; the principles remain the same, but the questions we need to ask have evolved.

Are we being transparent with families and students about how their data is used? Are we confident the apps and platforms we rely on are genuinely secure and compliant? Are we sure that only the right people in our organisations can access sensitive information?

These aren’t questions for data managers alone. They’re questions for senior leaders, teachers, support staff – everyone who touches information in any form. Because GDPR is no longer just a legal requirement. In 2025, it’s part of how we show we’re trustworthy professionals.

Seven Years In: What’s Changed?

What’s changed, more than anything, is awareness. Students are more privacy-savvy. Parents are asking sharper questions. Staff are more alert to the risks and responsibilities of handling personal data.

And that’s a good thing.

It means we can shift the conversation from compliance to confidence. When GDPR is built into our culture, not just our policies, it becomes part of a wider approach to doing things well. Much like safeguarding, it becomes part of how we think, plan and care.

So, whether you’re updating a digital platform, emailing student records, or printing off that spreadsheet for a meeting, it’s worth pausing to reflect. Not in fear, but in thoughtfulness. Is this the safest way to handle this information? Do I need to do anything differently? Would I feel comfortable explaining this decision to a parent?

There’s no denying that GDPR doesn’t always feel urgent (until something goes wrong). But it’s one of those quiet responsibilities that says a lot about who you are as educators. It speaks to the trust placed in you, and the way you uphold it day after day.

Seven years on, we’ve come a long way. And with thoughtful leadership, practical systems, and a bit of shared awareness, we’ll keep moving in the right direction.

After all, data protection is really about people, and education has always been good at putting people first.

It’s that time of year again.

The corridors are quieter. The library’s full. Revision guides multiply like gremlins. Students are hunched over past papers, and you’ve lost count of how many times you’ve said, “Just do your best.” Exam season: the annual rite of stress, snacks, and sharp-tipped pencils.

We know the drill. We also know that for some students, it’s more than just pressure. It’s panic. Fear. Sleepless nights. And in some cases, a quiet, growing despair that’s not always easy to spot.

So what happens when one of your students hits a breaking point?

Let’s say you’re a form tutor. One morning, during a routine check-in, a Year 11 student makes an offhand comment that stops you cold: “I’m not sure there’s any point in trying anymore. None of this matters anyway.” They brush it off with a shrug, but something in their tone makes your gut twist.

You’ve got safeguarding training. You know the signs. But now you’re also thinking about what you can and can’t say. What happens if you need to tell someone else? What if they’ve asked you not to? How does GDPR come into play?

Let’s be absolutely clear here: UK GDPR does not prevent you from protecting a student’s wellbeing. The law might sound like it’s all about red tape and locked filing cabinets, but when it comes to emergencies, particularly involving someone’s health or life, it’s surprisingly human.

The key phrase in the legislation is “vital interests.” If a person’s life, health, or safety is at serious risk, you can share their personal information without consent. In fact, in those moments, you’re not just allowed to, you’re expected to act.

So in our example, yes, you absolutely can and should inform your Designated Safeguarding Lead. You might also need to involve mental health services, the student’s parents or carers, or in rare cases, emergency services. You don’t need the student’s permission if you’re worried about their immediate safety. You just need to make a professional, proportionate judgment and record your decision clearly.

That might feel uncomfortable. Students often confide in us because they trust us. It’s natural to want to protect that trust. But safeguarding isn’t about keeping secrets, it’s about keeping people safe. And there’s a way to do both. You can tell the student, calmly and compassionately, that you’re concerned and that you’ll be speaking to someone who can help. It’s not a betrayal. It’s part of the responsibility they trust you to carry.

Let’s take another scenario. An exam invigilator notices a student sobbing quietly during a paper. After the exam, the student says they haven’t eaten in two days because of anxiety. They also beg you not to tell anyone, claiming they’ll be fine. Do you stay silent?

Again, the answer is no. Even if the student appears to be “functioning,” extreme stress, particularly if it’s affecting basic wellbeing like eating and sleeping, can quickly spiral into something more serious. Sharing this concern with your safeguarding lead or school counsellor is entirely appropriate under data protection law. The student’s welfare outweighs their request for secrecy when real harm is at stake.

Of course, not every case is black and white. There will be moments of hesitation. But that’s why having a clear understanding of your school’s safeguarding policy and how it works alongside GDPR is crucial. It helps you act with confidence and compassion.

And let’s not forget the pressure educators are under too. These conversations are emotionally exhausting. You’re juggling exam timetables, parents chasing grades, and students in various states of meltdown. But knowing that the law supports you in putting a student’s mental health first can take one worry off your shoulders.

It’s worth remembering that the Information Commissioner’s Office (ICO) has been vocal on this: data protection is not a barrier to sharing information where someone’s safety is at risk. The myth that “GDPR says no” in these scenarios is not just unhelpful, it’s dangerous.

At the heart of all this is a simple principle: if you’re genuinely worried about a student’s wellbeing, you must act, and the law supports you in doing so. Share what’s necessary, with those who need to know, and keep a clear, factual record of what you’ve done and why.

So as exam season stretches on and stress levels rise, keep your eyes open, your ears tuned, and your instincts sharp. You might be the person a student trusts most. And in a moment of crisis, that trust can be life-changing so long as you’re willing to act on it.

Because sometimes, “just exams” are anything but.

In today’s hyperconnected world, it’s not uncommon for children to be more tech-savvy than the adults around them. But behind every cute dance video or viral meme lies a sophisticated system of data collection and recommendation algorithms, many of which are now under scrutiny.

Recently, the UK’s Information Commissioner’s Office (ICO) launched formal investigations into three popular platforms (TikTok, Reddit, and Imgur) over concerns about how they handle the personal data of UK children aged 13 to 17. For educators and parents, this is a timely reminder: understanding the digital environments children are immersed in is no longer optional, it’s essential.

TikTok: Personalised, But At What Cost?

TikTok has become a fixture in many young people’s daily lives. Its algorithm seems almost magical in the way it recommends content. But the ICO is now investigating how that “magic” works when it comes to children’s personal information.

Is TikTok collecting too much data? Are its recommendation systems steering children toward inappropriate or harmful content? These are some of the key questions the ICO aims to answer.

This isn’t the first time TikTok has come under fire. In 2023, it was fined £12.7 million for unlawfully processing children’s data. The platform says it has since improved its safeguards, but the ICO is making sure those promises hold up under inspection.

Reddit and Imgur: How Do They Know a User’s Age?

Unlike TikTok, Reddit and Imgur aren’t in trouble for what they’re showing children, but rather for how they determine whether someone is a child in the first place.

Currently, many platforms rely on self-declared age checks, systems that are easy for children to bypass. The ICO is now investigating whether Reddit and Imgur have adequate age verification in place to prevent underage users from accessing potentially inappropriate content.

Reddit has acknowledged the issue and says it plans to implement stronger age verification. Imgur has not yet commented publicly.

The Children’s Code: Why It Matters

These investigations are part of a broader regulatory effort called the Children’s Code, introduced in 2021. The Code sets out 15 standards that online services must meet to ensure children’s personal data is protected.

At its heart, the Code is built on a simple principle: what’s best for the child must come first.

That means:

  • Minimising data collection
  • Avoiding manipulative design
  • Clearly explaining how data is used
  • Ensuring privacy settings are age-appropriate by default

As adults, we play a critical role in helping children navigate digital spaces safely. The ICO’s investigations are an important step toward greater accountability, but regulation alone isn’t enough.

Here are a few ways you can help:

Start the Conversation

Ask children what apps they use and how they feel about them. Make tech a shared topic, not a private one.

Teach Critical Thinking

Encourage young people to question why they’re being shown certain content. What’s the platform hoping they’ll do next: watch more, click something, buy something?

Stay Informed

Keep up with digital safety guidance from trusted sources like the ICO, NSPCC, and Childnet.

Use Tools and Settings

Explore built-in safety and privacy controls on apps your child uses. These can often be customised to offer better protection.

The internet can be a wonderful place for learning, creativity, and connection. But it must also be a safe and respectful space for children. The ICO’s investigations send a clear message to tech companies: if you want to benefit from children’s attention, you must also earn their trust.

Let’s continue working together, as parents, teachers, and guardians, to ensure that the digital world treats our children with the care, respect, and dignity they deserve.

GDPR Sentry can help you fill the knowledge gap

As cyber threats escalate across all sectors, UK schools have become increasingly frequent targets. From ransomware attacks that cripple entire networks to phishing campaigns aimed at staff and suppliers, the education sector is now considered a prime target for cybercriminals. According to the UK’s National Cyber Security Centre (NCSC), the volume and sophistication of attacks on schools have been rising steadily since 2020, with 2024 seeing one of the highest annual surges yet.

For Data Protection Leads (DPLs) and school administrators, the message is clear: safeguarding your community’s data is no longer just a GDPR requirement, it’s a critical frontline defence for education continuity, reputation, and trust.

Why Are Schools in the UK Being Targeted?

Schools manage vast amounts of sensitive data, from student records and safeguarding information to financial details and health data, all governed under the UK GDPR. At the same time, many schools rely on legacy systems, under-resourced IT infrastructure, or lack full-time cybersecurity expertise.

Threat actors know this. They exploit gaps in security awareness, outdated software, and insufficient incident response planning.

In recent months, several UK schools have reported:

  • Ransomware attacks that encrypted entire networks, halting teaching and learning for days
  • Phishing scams impersonating school leadership or DfE officials
  • Data breaches that triggered investigations by the Information Commissioner’s Office (ICO)
  • DDoS attacks during exam periods, disrupting access to remote systems

These aren’t hypothetical risks, they’re happening now.

What Can UK Schools Do to Minimise Risk?

As a DPL or school leader, you’re in a unique position to lead both compliance and culture. Below are seven actionable steps to significantly strengthen your school’s cybersecurity posture:

  1. Embed Cybersecurity into Data Protection Training

Cybersecurity and data protection go hand in hand. Ensure all staff, including teaching assistants and office staff, receive regular, mandatory training on:

  • Identifying phishing emails
  • Secure handling of pupil data
  • Using strong, unique passwords
  • What to do if they click on something suspicious
  2. Implement Multi-Factor Authentication (MFA) Across Systems

If you’re still relying on single-factor, password-only logins for MIS, payroll, or email systems, now is the time to act. MFA drastically reduces the risk of unauthorised access, especially in cloud-based systems like Google Workspace for Education or Microsoft 365.

  3. Keep Software and Devices Updated

Cybercriminals often exploit outdated software with known vulnerabilities. Set systems to automatically install updates where possible. This includes:

  • Operating systems (Windows, macOS, ChromeOS)
  • Web browsers
  • MIS and safeguarding software
  • Antivirus and firewall tools

Work with your IT provider to audit devices used by staff working remotely.

  4. Back Up Data Securely and Test Recovery

Regular, encrypted backups, stored separately from your main network, can be the difference between recovery and disaster. But backups only help if they’re tested.

Schedule termly backup recovery tests with your IT team or managed service provider.
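A recovery test can be as simple as restoring a file and proving it matches the original byte for byte. A minimal sketch of that check; the function names are ours, not from any particular backup tool:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Fingerprint a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def restore_verified(original: Path, restored: Path) -> bool:
    """A restore only counts as successful if the restored copy is
    byte-for-byte identical to the original."""
    return sha256_of(original) == sha256_of(restored)
```

Run a check like this against a handful of sample files each term, and a silently corrupted backup shows up long before you need it in anger.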

  5. Review Third-Party Data Sharing

Many schools use third-party edtech tools. Ensure that suppliers:

  • Comply with UK GDPR
  • Have robust cybersecurity practices
  • Are listed in your Record of Processing Activities (ROPA)

Review contracts and data sharing agreements annually.

  6. Create and Test an Incident Response Plan

If your school is attacked, how will you respond? Who will inform the ICO, parents, or the DfE? Your incident response plan should include:

  • Clear roles and responsibilities (including DPL, Headteacher, IT lead)
  • Communication templates
  • Steps for isolation, containment, and recovery
  • A reporting mechanism to the ICO within 72 hours (as required under UK GDPR)
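That 72-hour window is easy to miscalculate under pressure, so it’s worth remembering it runs in clock hours from the moment you become aware of the breach, weekends included. A trivial sketch:

```python
from datetime import datetime, timedelta

# UK GDPR: notifiable breaches must be reported to the ICO within 72 hours.
ICO_WINDOW = timedelta(hours=72)

def ico_report_deadline(became_aware: datetime) -> datetime:
    """The clock starts when the school becomes aware of the breach,
    not when the breach itself happened, and it doesn't pause for weekends."""
    return became_aware + ICO_WINDOW
```

So a breach discovered at 9am on a Friday must be reported by 9am the following Monday.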
  7. Promote a ‘Whole School’ Security Culture

Cybersecurity isn’t just an IT issue; it’s an organisational culture issue. Consider adding cybersecurity awareness to staff induction, governor briefings, and safeguarding policies.

Security Is a Shared Responsibility

The NCSC and DfE have made clear that schools must prioritise cyber resilience alongside safeguarding and curriculum planning. For DPLs and school administrators, this means moving beyond compliance and toward proactive, strategic risk management.

The stakes are high, but so is the opportunity to lead. By investing in prevention, awareness, and preparation, your school can protect both its people and its purpose.

Imagine you’re sat in the staffroom, half-eaten biscuit in one hand, half-finished student spreadsheet on the screen. You scroll down a list of names, notes, and numbers, some of which, to your horror, date back to the year David Cameron was still Prime Minister.

You mutter to yourself, Why do we still have this data? Is this even legal?

Well, you’re in luck. Help may be on the horizon, wrapped in bureaucracy, yes, but still help.

The UK’s Data (Use and Access) Bill is the government’s latest attempt to bring data protection into the 21st century without sending educators and administrators into panic-induced paper purges. So, what does it mean for you and your school? Let’s unpack it without the legalese, and with your sanity in mind.

So… What Is This Bill Actually About?

At its core, the Data (Use and Access) Bill (or DUAB, if you’re into acronyms) is about striking a balance between making better use of data and protecting people’s rights, especially children’s.

It’s part of the government’s broader post-Brexit effort to move away from some of the more rigid elements of EU GDPR, while still holding onto the values that matter: transparency, safety, and accountability. Think of it as GDPR’s sensible cousin, still serious, but a little more practical in the school setting.

Why Should Schools Care?

Here’s the thing: schools are absolute treasure troves of personal data. From safeguarding notes and behavioural logs to dinner money apps and biometric attendance systems, they gather data like Year 7s gather Pokémon cards, except there’s legal liability attached.

This Bill is nudging us gently but firmly towards smarter, clearer, and more responsible data use. For instance, it’s placing extra emphasis on how we use children’s data in digital tools and platforms. That means reviewing whether educational software is using personal information appropriately, and not quietly siphoning it off to train some mysterious AI model in the background.

Also under the spotlight? Automated decision-making. If you’ve ever wondered whether a student’s algorithmic “progress tracker” is making assumptions you wouldn’t make as a teacher… well, the DUAB has your back. It demands transparency and human oversight when important decisions are being made based on data. Because let’s face it, no algorithm knows your pupils like you do.

But Wait, There’s More…

One of the big ideas in the DUAB is around data retention. Remember that ancient spreadsheet I mentioned earlier? Under the Bill, keeping data “just in case” won’t cut it anymore. Schools will need clear justifications for how long data is kept and must be able to show they’re not hoarding it unnecessarily. It’s like a spring clean, but for your school server.

The Bill also introduces measures to simplify compliance. For schools, this could mean fewer hoops to jump through when working with third-party apps or local authorities, as long as the data use aligns with the public good and proper protections are in place.

So, What Should We Do Now?

First off, don’t panic. This Bill isn’t a ticking time bomb. It’s more of a nudge to think seriously about how we treat data in our schools and to embed that into our day-to-day decision-making.

It’s a good time to:

  • Talk to your school’s Data Protection Officer (you know, the one who pops up every year reminding everyone about GDPR).
  • Review your school’s data retention schedule – are you keeping stuff longer than necessary?
  • Ask questions about any new edtech platforms you’re trialling. Are they transparent? Safe for students? Do they actually need all the information they’re collecting?
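As a starting point for that retention review, the check can even be partly automated. The sketch below is purely illustrative (not an official tool or a compliance guarantee): it flags files on a shared drive whose last-modified date falls outside a retention period, giving staff a shortlist to review with their DPO. The folder name and the six-year period are hypothetical examples; your school's retention schedule will set the real figures.

```python
# Illustrative sketch: list files last modified longer ago than a chosen
# retention period, as candidates for a retention review. The folder path
# and the six-year period below are hypothetical examples only.
from datetime import datetime, timedelta
from pathlib import Path

RETENTION = timedelta(days=6 * 365)  # example period: roughly six years


def files_past_retention(folder: str, now: datetime = None) -> list:
    """Return files under `folder` last modified longer ago than RETENTION."""
    now = now or datetime.now()
    stale = []
    for path in Path(folder).rglob("*"):
        if path.is_file():
            modified = datetime.fromtimestamp(path.stat().st_mtime)
            if now - modified > RETENTION:
                stale.append(path)
    return stale


if __name__ == "__main__":
    # "shared_drive/pupil_records" is a made-up path for illustration.
    for path in files_past_retention("shared_drive/pupil_records"):
        print(f"Review: {path}")
```

A list like this is only a prompt for human judgement: some old files must lawfully be kept (safeguarding records often have long statutory retention periods), so the output is a review queue, not a deletion list.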

And finally, keep the conversation going. Data protection isn’t just a compliance issue, it’s about trust. Parents trust us with their children. Students trust us with their futures. Managing their data responsibly is part of honouring that trust.

Ultimately, this isn’t just a policy update, it’s a cultural shift. The DUAB reminds us that data is more than a digital asset. It’s personal. It’s powerful. And in education, it’s deeply human.

So next time you open a spreadsheet that hasn’t been touched since the last Ofsted inspection, take a moment. Ask yourself not just “Do we need this?” but also “Is keeping this still respectful to the person behind the data?”

Because in the classroom, in the office, or even in the server room, one truth remains: good education starts with good ethics.

There’s something unmistakable in the air when an inspection is on the horizon. You can feel the hum of preparation in every corridor; policies being printed, classroom displays getting a refresh, and colleagues exchanging knowing glances over the photocopier. It’s all hands on deck, and every detail matters.

But as walls are re-pinned and cupboards reorganised, another question often arises quietly in the background: Are we still within the bounds of data protection law?

The answer isn’t always as straightforward as we’d like. In fact, GDPR considerations often become most visible in the very things we proudly display on classroom walls, in shared corridors, or on digital screens. And during inspection season, those questions only feel more urgent.

Let’s consider a familiar example: a colourful “Star of the Week” board, complete with names, photos, and personal achievements. It’s a lovely way to celebrate success, but what if one of those pupils has a parent who didn’t give consent for photos? Or a safeguarding concern that makes public identification risky? Even the most well-intentioned display can inadvertently stray into problematic territory.

The same applies to medical or allergy information. Many schools use posters or visual aids to make staff aware of pupil needs, particularly in lunch halls or near staff kitchens. But if that information includes photos, names, and medical conditions in areas accessed by other pupils or visitors, it crosses into “special category data” under UK GDPR. That kind of information requires extra care.

Digital spaces are no less important. In the rush to prepare documents, it’s easy to leave a screen open or a shared drive exposed. But if personal pupil data is left visible or accessible to those without a legitimate reason to view it, the school could find itself facing not only a privacy concern, but potentially a reportable data breach.

None of this means schools must remove every trace of student celebration or wrap the walls in plain paper. GDPR doesn’t ask us to stop recognising achievement, it asks us to think critically about how we do it.

One school I worked with ran what they called a “privacy walk” in the days leading up to an inspection. Staff took ten minutes to walk through shared spaces with fresh eyes, asking themselves: Can visitors see anything they shouldn’t? Are personal details on show unnecessarily? Are we being mindful with how we display medical or safeguarding information? It was a quick, simple exercise that made a measurable difference to their overall compliance, and their confidence, on inspection day.

Similarly, one teacher I spoke to found an elegant solution to a parent’s concern over public displays of progress. Instead of using names on her reward chart, she assigned each child an animal symbol: “Team Owl,” “Team Fox,” and so on. It respected privacy, kept parents satisfied, and the pupils embraced it wholeheartedly.

What matters most is that schools are able to demonstrate thoughtful, proportionate decision-making. Inspectors don’t expect perfection, but they do expect to see that staff understand the principles of data protection and have taken reasonable steps to comply.

If you’re unsure about a display, a chart, or a staffroom noticeboard, ask yourself: Is it necessary? Is it proportionate? And have we obtained the right consent, where needed? If you’re still unsure, speak to your data protection lead or DPO. Getting the answer right before inspection day is far easier than addressing concerns after the fact.

Ultimately, compliance isn’t about red tape, it’s about respect. Respecting pupils’ rights, respecting families’ expectations, and respecting the trust placed in schools to safeguard not only children, but their information.

So as inspection season gathers pace, take a moment to review the little things. The name tags, the photo walls, the charts with more detail than needed. Because when the inspector walks in and the questions start, you’ll be glad you did.

It’s not every day you hear about global privacy treaties in the staffroom. Between lesson plans, playground duties, and wondering why the photocopier only jams when you’re in a hurry, international data agreements don’t always make it onto the radar.

But every now and then, something big happens that’s worth pausing for, and Global CAPE is one of those somethings.

So let’s break it down. No jargon, no legal waffle, just a clear, narrative explanation of what’s going on and why it matters to schools.

First Things First: What on Earth is Global CAPE?

Global CAPE stands for the Global Cooperation Arrangement for Privacy Enforcement. Think of it like an international support group, but for data protection authorities. It’s a framework that allows privacy regulators from different countries to work together more easily, especially when investigating cross-border data misuse.

Why is this needed? Because in 2024, data doesn’t stay local. A student might use an education app built in California, hosted in Ireland, with customer support in Singapore. If something goes wrong with how that app handles personal information, who’s in charge?

That’s where Global CAPE comes in. It makes it easier for regulators to:

  • Share information securely,
  • Cooperate on investigations, and
  • Support each other when tackling global privacy issues.

In short, it gives watchdogs more teeth and a few more colleagues to back them up.

So, Why Has the ICO Joined?

Earlier this year, the UK’s Information Commissioner’s Office (ICO) officially joined Global CAPE. It’s part of a growing list of data protection authorities who recognise that privacy is no longer a purely domestic issue.

For the ICO, this move means:

  • More muscle in investigating international companies that may misuse UK citizens’ data.
  • A seat at the table when setting expectations for global data handling, especially for emerging tech.
  • Better collaboration with other countries to address risks affecting UK children and families.

As the ICO put it, this step helps ensure that UK citizens remain protected, even when their data travels the world. And let’s be honest, when it comes to edtech and online tools in schools, data is travelling the world.

Who Else Is in This Data Protection Dream Team?

The list of members includes privacy authorities from:

  • The United States (particularly the Department of Commerce)
  • Australia
  • Canada
  • Japan
  • South Korea
  • Mexico
  • The Philippines
  • Singapore
  • And now, the United Kingdom

It’s a truly international group and one that continues to grow.

Why Should Educators Care?

At this point, you might be thinking: Okay, but how does this affect me, the classroom teacher or school leader?

Here’s the link.

As more schools adopt cloud-based learning platforms, communication apps, and AI-powered tools, the question of where data goes and who is accountable when things go wrong is more important than ever.

When you use an app to track student behaviour, does the data stay in the UK? What happens if that company is based abroad and suffers a breach?

Global CAPE means the ICO can now collaborate more effectively with overseas regulators. That adds a layer of reassurance for schools, especially those using international tools and platforms.

And for school leaders, it sends a gentle but important signal that data protection isn’t just a tick-box, it’s a global issue. The choices you make around platforms, permissions, and parental consent really do matter.

In the grand scheme of education, Global CAPE might not change your day-to-day immediately. But it’s part of a wider story: one where governments are finally realising that digital rights, especially for children, need international protection.

And while your focus may rightly stay on helping students grow, learn, and thrive, you can also be confident that the data trail they leave behind is being watched over by more than just your school server.

Because privacy doesn’t stop at the school gate anymore and now, neither does enforcement.