
Let’s be honest: GDPR isn’t the most thrilling part of your job. It’s not going to earn you a gold star or a round of applause in the staffroom. But it will keep your school safe, your students protected, and your inbox free from the dreaded “URGENT: Possible Data Breach” email.

So if you’re an educator juggling lesson plans, safeguarding concerns, and the occasional coffee spill, here’s your friendly reminder that data protection is a daily habit, not a once-a-year training module.

Let’s revisit our trusty trio – Lock it, Mask it, Stash it – and add a few more good habits.

Lock It – Because Screens Don’t Need an Audience

Scenario:
You’re called out of your classroom to deal with a playground incident. In your rush, you leave your laptop open on your desk with a safeguarding report on the screen. A curious student wanders in and sees more than they should. Later, they mention it to a friend, and suddenly sensitive information is circulating.

Good habit:
Hit Windows + L or Ctrl + Command + Q on a Mac every time you step away. Set your device to auto-lock after a few minutes. It’s like closing the door on sensitive data.
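
Pressing the shortcut is the habit; enforcing it is an IT job (the auto-lock timeout itself is set through Settings or Group Policy, not code). For the technically curious, a minimal sketch of the underlying call, assuming Windows:

```python
# Minimal sketch, Windows only: the same lock triggered by Windows + L,
# called from a script via the Win32 API.
import ctypes

ctypes.windll.user32.LockWorkStation()  # locks the current session immediately
```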

 

Mask It – Anonymise Like You’re in a Spy Movie

Scenario:
You’re leading a CPD session and want to showcase a brilliant piece of student work. You project it on the screen… and realise too late that it includes the student’s full name, class, and SEN status. A visiting governor notices and raises concerns.

Good habit:
Before sharing, ask: “Could someone identify this student?” If yes, anonymise. Use initials, remove photos, blur names. Think MI6, but with worksheets.
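
If you handle this in bulk – say, a spreadsheet of work samples – a short script can do the masking before anything leaves your machine. A minimal sketch, assuming a hypothetical CSV with “name” and “comment” columns:

```python
# Sketch: reduce pupil names to initials before a file is shared.
# File name and column names are illustrative assumptions.
import csv

def initials(full_name: str) -> str:
    """Reduce 'Jane Smith' to 'J.S.' so work can be shared safely."""
    return ".".join(part[0].upper() for part in full_name.split()) + "."

with open("student_work.csv", newline="") as src, \
     open("student_work_anonymised.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["name"] = initials(row["name"])  # mask the direct identifier
        writer.writerow(row)
```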

 

Stash It – Store Smart, Not Just Somewhere

Scenario:
You save a spreadsheet of pupil premium data to your desktop “just for now.” Weeks later, your laptop is stolen from your car. The file wasn’t encrypted. Cue panic, paperwork, and a very uncomfortable conversation with leadership.

Good habit:
Use school-approved cloud storage. Encrypt sensitive files. Never store personal data on USB sticks unless they’re encrypted and never leave them in your coat pocket.
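
For files that genuinely must sit outside school systems, encryption is the safety net. A minimal sketch using the widely used Python cryptography package – the filename is illustrative, and real key management belongs with your IT team, not inside a script:

```python
# Sketch of file encryption with the "cryptography" package
# (pip install cryptography). Filename is an invented example.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store this somewhere safe, separate from the file
fernet = Fernet(key)

with open("pupil_premium.xlsx", "rb") as f:
    encrypted = fernet.encrypt(f.read())

with open("pupil_premium.xlsx.enc", "wb") as f:
    f.write(encrypted)

# To read it back later:
# original = Fernet(key).decrypt(open("pupil_premium.xlsx.enc", "rb").read())
```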

 

Clear It – Declutter Your Digital Life

Scenario:
You search your inbox for a parent’s email and stumble across a safeguarding document from three years ago. You don’t need it anymore, but it’s still sitting there, vulnerable. If your account were compromised, that data could be exposed.

Good habit:
Schedule a monthly “data declutter.” Delete what’s outdated, archive what’s essential. GDPR loves a tidy inbox.
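
The same habit applies to shared drives. As a starting point, a hypothetical sketch like this could flag files nobody has touched in three years – candidates for deletion or archiving:

```python
# Sketch: list files untouched for roughly three years. The folder
# path is an invented example; review every candidate before acting.
import time
from pathlib import Path

CUTOFF = time.time() - 3 * 365 * 24 * 60 * 60  # roughly three years ago

for path in Path("H:/My Documents").rglob("*"):
    if path.is_file() and path.stat().st_mtime < CUTOFF:
        print(path)  # candidate for the declutter - delete or archive
```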

 

Speak It – Make Data Protection a Team Sport

Scenario:
You notice a colleague leaving their screen unlocked in the staffroom while they grab a coffee. You hesitate to say anything… but what if a student walks in? Or a visitor?

Good habit:
Speak up. Start a “Data Protection Minute” in staff meetings. Share one tip, once a month. It’s painless, and it builds a culture of care.

 

Check It – Know What You’re Collecting (and Why)

Scenario:
You create a Google Form to collect feedback from students. You ask for full names, dates of birth, and home addresses (none of which are actually needed). A parent complains after seeing the form.

Good habit:
Stick to data minimisation. Only collect what’s necessary. If you don’t need it, don’t ask for it. Less data = less risk.

 

Password It – Strong, Unique, and Not “Password123”

Scenario:
You use the same password for your school email, your personal Netflix account, and your online shopping. One breach, and everything’s exposed. Worse, your school account is used to send phishing emails to parents.

Good habit:
Use a password manager. Enable two-factor authentication. And never write your password on a sticky note stuck to your monitor (yes, we’ve all seen it).
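
If you’d rather generate strong passphrases than invent them, Python’s built-in secrets module is designed for exactly this. A minimal sketch, with a deliberately tiny stand-in wordlist (a real one, such as the EFF diceware list, has thousands of entries):

```python
# Sketch: cryptographically secure passphrase generation with the
# standard-library "secrets" module. WORDS is a stand-in wordlist.
import secrets

WORDS = ["falcon", "marble", "orchid", "tundra", "velvet", "zephyr",
         "copper", "lantern", "meadow", "quartz"]

def passphrase(n_words: int = 4) -> str:
    return "-".join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())  # e.g. "orchid-copper-zephyr-meadow"
```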

 

Log It – Keep Track of What You Share

Scenario:
You email a spreadsheet to a colleague containing student attendance data. A week later, they can’t find it, and you’re unsure who else might have access. You didn’t password-protect it, and now it’s floating around inboxes.

Good habit:
Keep a simple log of what you’ve shared, with whom, and when. Use password protection for sensitive documents and send passwords separately.
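
A log doesn’t need to be fancy – a spreadsheet works, and so does a few lines of code. A minimal sketch, with illustrative file and column choices:

```python
# Sketch of a simple sharing log, appended to each time a sensitive
# document goes out. File name and columns are invented examples.
import csv
from datetime import date

def log_share(document: str, recipient: str, protection: str,
              log_file: str = "sharing_log.csv") -> None:
    with open(log_file, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(),
                                document, recipient, protection])

log_share("attendance_2024.xlsx", "j.bloggs@school.sch.uk",
          "password-protected, password sent by phone")
```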

 

You don’t need to be a tech wizard or a GDPR guru. You just need to be mindful, and act. Every locked screen, anonymised document, and securely stored file is a step toward a safer, smarter school.

So next time you’re tempted to save something “just for now” or leave your laptop unattended, remember: Lock it. Mask it. Stash it. Clear it. Speak it. Check it. Password it. Log it.

Your students deserve it. Your school depends on it. And your future self will thank you.

Following on from our September post on lawful basis, let’s dive in with a deeper look at legitimate interests.

 

You’re sitting at your desk, reviewing your school’s privacy notice, and there it is again:

“We process personal data where necessary for our legitimate interests.”

It’s a phrase you’ve seen in countless templates, DPIAs, and supplier agreements. But since the Data (Use and Access) Act 2025 came into force, you find yourself asking:

What does “legitimate interests” actually mean now? And how do I use it responsibly in a school setting?

Because here’s the truth: the landscape has changed.

What was once a flexible and widely used lawful basis under UK GDPR is now subject to greater scrutiny and clearer expectations.

And you need to be ready.

 

First: A Refresher on Legitimate Interests

At its core, legitimate interests allows you to process personal data without needing consent, as long as:

  1. You have a clear and genuine interest (yours or a third party’s),
  2. The data use is necessary to achieve it,
  3. And your interest doesn’t override the individual’s rights and freedoms.

Simple? Not quite.

Legitimate interest isn’t a free pass. It’s a balancing act. One that now requires more than just good intentions.

 

What the Data Use and Access Act Changes

The 2025 Act doesn’t throw out legitimate interests, but it tightens the rules around it, particularly in public sector settings like schools.

Here’s what’s new:

  1. Recognised Categories of Legitimate Interest

The Act introduces “recognised legitimate interests”: a defined list of purposes where the balancing test is still required, but more likely to pass, especially in regulated environments like education.

These include:

  • Safeguarding and welfare monitoring
  • IT security and misuse prevention
  • Internal data analytics for educational improvement
  • Health and safety incident logging
  • Communications with former pupils (e.g. alumni outreach)

This doesn’t give you automatic permission, but it gives you a stronger footing, provided you still assess necessity and impact.

  2. Mandatory Legitimate Interest Assessments (LIAs)

Whereas under UK GDPR, LIAs were strongly recommended but not mandatory, the new Act requires written documentation of your balancing test for all non-trivial uses of legitimate interest.

This includes:

  • The purpose and justification
  • Why the data is necessary
  • Potential risks to the individual
  • Mitigating actions you’ve taken

In other words: if you can’t show your thinking, it doesn’t count.
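
To make that concrete, here is a hypothetical sketch of an LIA record as structured data. In practice this would live in a form or template rather than code, but the fields mirror the four points above; all values are invented.

```python
# Illustrative structure for a written LIA record - field names and
# example values are invented, not drawn from the Act itself.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LIARecord:
    purpose: str            # the purpose and justification
    necessity: str          # why the data is necessary
    risks: list[str]        # potential risks to the individual
    mitigations: list[str]  # mitigating actions taken
    assessed_on: date = field(default_factory=date.today)

lia = LIARecord(
    purpose="Alumni newsletter to former pupils",
    necessity="Contact details are needed to send the newsletter",
    risks=["Unwanted contact", "Out-of-date address data"],
    mitigations=["Clear opt-out in every email", "Annual data refresh"],
)
```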

  3. More Transparency in Privacy Notices

The Act demands greater clarity in how you explain legitimate interest to data subjects.

Saying “We do this for our legitimate interests” is no longer enough.

Now you must:

  • Describe the actual purpose (“To improve learning outcomes across year groups”)
  • State the lawful basis clearly
  • Explain the individual’s right to object, and how they can do it

This isn’t just legal hygiene, it’s about building trust.

 

Real-Life Scenarios You Might Recognise

Let’s put this into context. Here are a few examples of recognised legitimate interests that now carry clearer boundaries under the Act:

Using Assessment Data for Internal Improvement

You aggregate anonymised pupil data to review trends and improve teaching strategies.
– Recognised legitimate interest
– You still need to ensure no unintended profiling or bias is introduced.

Contacting Alumni About School Events

You keep in touch with former pupils for fundraising or newsletters.
– Now listed as a recognised interest, but you must allow opt-outs and be proportionate.

Monitoring Staff IT Activity

You review system logs to detect inappropriate use of school devices.
– Still valid, but the Act now emphasises proportionality and the need for staff to be informed via policy and training.

Using CCTV for Security

– Covered by recognised interest, but the Act now expects specific signage and policies outlining how footage is used and retained.

 

What You Should Do Now

If you’re relying on legitimate interests in any part of your school’s data processing, take a moment to check:

  • Have you documented your legitimate interest assessments?
  • Is your privacy notice specific and accessible?
  • Have you reviewed whether your interests are listed in the new recognised categories?
  • Have you trained staff on when legitimate interests apply and when they don’t?

And most importantly:

  • Can you explain your decision clearly to a parent, pupil, or regulator?

 

One Final Thought

Legitimate interests used to be the “quiet lawful basis” sitting in the background while consent and legal obligation took the spotlight.

But now, it demands more from you.

More transparency. More justification. More trust.

So the next time you tick that box, ask yourself:

“Would I be comfortable explaining this decision face-to-face with the person it affects?”

If the answer is yes, you’re likely on solid ground. If not? It might be time for a rethink.

Let us help you out of the maze

Let’s be honest: when most of you became educators, you dreamed of inspiring minds, not mastering the fine print of data protection law. But here you are, in a post-GDPR, post-DUAA world, where understanding the “lawful basis for processing” is as crucial as remembering to take the register.

The good news? While the Data (Use and Access) Act 2025 (DUAA) has introduced a few tweaks, the core principles remain intact (just sharpened), especially for education settings. Once you get past the legalese, the lawful bases for processing personal data are more intuitive than they seem and, dare I say, even a little philosophical.

Let’s take a look.

 

The “Why” Before the “How”

Before you collect, use, or store any personal data, from student attendance to parent emails, even the staff lunch allergy list, you must know why you’re doing it. And that “why” must align with one of six lawful bases set out by the UK GDPR.

Think of it like this: every piece of personal data is a guest at a party. You need a legitimate reason for inviting them in, or you risk violating privacy laws (and no one wants to be that school on the front page for a data breach).

Now let’s meet our six lawful invitations, refreshed for 2025.

 

  1. Public Task: The Educator’s Bread and Butter

If your school or university is a public authority, public task remains your primary basis. You’re carrying out functions in the public interest: teaching, safeguarding, administering exams. The new DUAA reinforces this by clarifying that digital education tools and platforms used to fulfil statutory functions also fall under this basis.

Example: Recording attendance or using a digital behaviour management system. You’re not doing it for fun (although some platforms are slick), but because it’s integral to your duties.

Bonus: You still don’t need consent. In fact, the DUAA discourages unnecessary consent requests where a clearer basis applies, to avoid undermining trust or creating confusion.

 

  2. Consent: Still Not Always the Golden Ticket

Everyone loves a good consent form. Until they realise it’s not always the right tool. Under GDPR, consent must be freely given, informed, and withdrawable. Adding to that, under the DUAA, there’s increased emphasis on ensuring consent is genuinely voluntary, particularly when dealing with minors or those in a dependency relationship. That can be tricky in a school setting where there’s a clear power imbalance.

When it works: Using student photos on your school website or social media. This is optional, celebratory, and low risk. The classic consent use case.

When it doesn’t: Collecting data for the school lunch system or learning platforms required by the curriculum. The DUAA 2025 warns against “coerced consent” in core service delivery, especially in education.

 

  3. Legal Obligation: Because the Law Says So

This one’s straightforward. If another law requires you to process data, GDPR says “go ahead.” However, there is now more focus on clarity and specificity.

Example: Disclosing safeguarding concerns to the local authority or providing payroll data to HMRC. No need to ask nicely; the law’s already spoken.

But be careful. Vague references to “policy” aren’t enough. Be ready to cite specific laws (e.g. Education Act 1996, Children Act 1989, Safeguarding Vulnerable Groups Act 2006). The DUAA encourages clearer audit trails for these decisions.

 

  4. Contract: Mind the Fine Print

More relevant to universities and colleges, this applies when data processing is necessary for a contract (e.g., enrolment, accommodation).

Example: A university processing student accommodation requests or managing course enrolment. The student signed up for a degree; you need to process data to deliver on your end of the bargain.

It can also apply to employment contracts.

DUAA update: The Act highlights the growing use of AI and third-party tools in education, requiring contracts with digital vendors to include clear data use terms. Processing data under contract must be strictly necessary, not just convenient.

 

  5. Legitimate Interests: The Wild Card, with Stricter Controls

This basis has always been the most flexible and misunderstood. The DUAA has added extra scrutiny for public authorities like schools when using it.

Example: Using CCTV at an independent school for safety. Your interest (keeping people safe) outweighs the slight intrusion on privacy, as long as it’s well-signposted and proportionate.

But if you’re a public authority, use this one sparingly. The ICO tends to raise an eyebrow when schools over-rely on it.

New under DUAA: Public bodies must not use this basis for tasks where another basis (e.g., public task) is more appropriate. The DUAA also introduces a “Legitimate Interests Assessment (LIA) log” for higher-risk uses: a lightweight but structured way to show you’ve considered the risks and mitigations.

 

  6. Vital Interests: Still for Emergencies Only

This is your break-glass-in-case-of-emergency option. Life-and-death situations only. It’s rarely used, but crucial when needed.

Example: A student collapses, and you need to share medical information with paramedics. GDPR isn’t going to stand in your way.

If you’re using this basis often, your emergency planning may be lacking. The DUAA nudges organisations toward more proactive safeguarding policies rather than relying on this exception.

 

So… How Do You Choose the Right One?

Here’s the secret: don’t overthink it. Start by asking yourself:

  • Why are we collecting this data?
  • Is it part of our statutory duties?
  • Is there a specific law that applies?
  • Would the individual expect this use?
  • Can we achieve the goal without processing personal data?

Once you’ve chosen your lawful basis, stick to it. You can’t switch to “consent” just because someone complains. And yes, you must inform individuals which basis you’re using in your privacy notice and be prepared to justify it. Trust me, having this straight makes life easier when the ICO, or a curious parent, comes knocking, especially if you’re using automated systems or AI-assisted platforms.

Lawful basis isn’t just a box-ticking exercise. It’s a lens through which we examine how and why we handle people’s information. The DUAA may have added a few sharper edges and new expectations, but the heart of data protection remains the same: respect. Whether you’re an early years teacher or a university data officer, understanding this stuff isn’t just compliance.

It’s care.

On 19 June 2025, the Data (Use and Access) Act 2025 (known during its passage as the DUA Bill, or DUAB) officially received Royal Assent, marking a significant update to the UK’s data protection framework. While it doesn’t entirely rewrite UK GDPR, it does bring a number of key changes that educational institutions need to be aware of.

Let’s break it down and explore what this means for colleges, schools, and educational staff in a practical, easy-to-digest way.

Subject Access Requests: The Clock Pauses for Clarification

We know that managing Subject Access Requests (SARs) can be time-consuming, especially when students or staff request a mountain of data.

Here’s what’s changed:

  • If someone submits a SAR and you need clarification, the response deadline is paused until they respond.
  • You can reasonably ask for clarification if you hold a large amount of data about the person.
  • Once clarification is received, you must still perform a reasonable and proportionate search for the requested data.

Takeaway for educators: If a student or staff member submits a vague or broad SAR, you now have the legal room to ask for specifics while the one-month response clock is paused. This helps you stay compliant and focused on what’s actually needed.
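
For anyone who tracks SAR deadlines in a script or spreadsheet, here’s a hedged sketch of the pause logic. The dates, and the one-calendar-month period, are illustrative; always check current ICO guidance for the authoritative way to calculate response deadlines.

```python
# Sketch of a paused SAR clock: days spent waiting for clarification
# do not count towards the deadline. All dates are invented examples.
from datetime import date, timedelta
from dateutil.relativedelta import relativedelta  # pip install python-dateutil

received = date(2025, 9, 1)             # SAR arrives
clarification_sent = date(2025, 9, 3)   # school asks what is actually wanted
clarified = date(2025, 9, 10)           # requester responds, clock resumes

paused_days = (clarified - clarification_sent).days
deadline = received + relativedelta(months=1) + timedelta(days=paused_days)
print(deadline)  # 2025-10-08 rather than 2025-10-01
```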

Complaints Must Be Handled Internally First

The new law gives organisations a chance to fix things before complaints go to the ICO (Information Commissioner’s Office).

Now, if someone raises a data protection issue:

  • You must acknowledge their complaint within 30 days
  • Take appropriate steps to investigate and address it
  • And inform them of the outcome

What this means: Colleges should ensure there’s a clear internal complaint process. This is a great time to check that your staff know how to escalate issues internally, before they become a bigger (and more public) problem.

Automated Decision-Making & AI in Education

With AI becoming a growing tool in education (think automated exam marking or application screening), the Act clarifies how these systems should be handled:

Automated decisions that have legal or significant effects (e.g. admissions, grading) are permitted, as long as:

  • The individual is told their data will be used this way,
  • They are given the option to contest the decision, and
  • Human intervention is available.

Heads up: If your college uses AI to grade tests (like automated multiple-choice scoring), or any automated student decision-making systems, let your data protection lead know. You’ll need to be clear with students about their rights and the role AI plays.

Cookies: A Bit More Flexibility

Let’s talk cookies. No, not the chocolate chip kind.

The Act introduces exemptions to the consent requirement for certain non-essential cookies. That includes cookies used for:

  • Statistical analysis
  • Website performance
  • User preferences
  • Service improvements

Good news for IT teams: If your college’s website uses cookies only for these purposes, you no longer need to obtain user consent.

Legitimate Interests: A Clearer Path Forward

There’s now a list of “recognised legitimate interests”, making life a bit easier for data protection leads. These include:

  • Safeguarding students and staff
  • Fraud prevention
  • Network and system security
  • Intra-group data sharing (for multi-site colleges)
  • Direct marketing

If your college is processing data for any of the above reasons, you no longer need to complete a Legitimate Interests Assessment (LIA).

Bottom line: This streamlines processes and reduces paperwork when handling security or safeguarding-related data.

What’s Next?

While these updates are now law, the government hasn’t yet published education-specific guidance. In the meantime:

  • Review your current policies, especially around SARs, complaints, and automated decision-making
  • Update your internal processes and staff training to reflect the new rights and obligations
  • Monitor for upcoming sector-specific guidance

If you’d like to explore more detailed legal interpretations or dive deeper into how the Act changes things across different industries, click here for further information.

Let’s continue to champion responsible data use in our schools and colleges. These changes aim to strike a better balance between protecting individuals’ rights and reducing unnecessary red tape – and that’s something we can all get behind.

In a quiet primary school on the edge of town, a well-meaning teaching assistant printed a spreadsheet containing student health details. He left it in the staffroom just for a moment while making a cup of tea. When he returned, the document had vanished. It later turned up, crumpled and smudged with jam, in the Year 2 book corner.

No hackers, no malware. Just a simple, very human mistake.

When we hear “data breach,” it’s easy to picture a dramatic cyberattack. Dark rooms filled with glowing screens, stern-faced experts speaking in code, maybe even a ransom note demanding cryptocurrency. But in schools, breaches often take a much quieter, more familiar form. They’re the result of everyday errors and oversights, the kind we don’t always see as serious, even when they are.

In educational settings, personal data flows constantly. From pupil records and safeguarding notes to staff files, medical information, and assessment data – it’s all there, quietly underpinning the day-to-day rhythm of school life. And while schools have come a long way in formalising policies and investing in secure systems, there’s one area where vulnerability lingers: the forgotten, overlooked breaches.

Sometimes, it’s about destruction rather than disclosure. Take the school secretary who accidentally deletes a folder containing behavioural reports during a routine system tidy-up. Or the teacher who loses a handwritten safeguarding log in a spring clean, thinking it was scrap paper. It’s easy to assume that if no one else accessed the data, there’s no breach. But loss and destruction, intentional or not, can have a serious impact on the individuals that data relates to.

Alteration is another blind spot. It might seem harmless when a colleague “tweaks” a student’s medical note for clarity or edits a pastoral record without documenting the change. But even small, unauthorised edits can lead to confusion or worse, misinformed decisions. One school found itself in difficulty after a student’s allergy record was edited to say, “mild intolerance” rather than “anaphylaxis.” A well-meaning change, but a deeply risky one.

And then there’s unauthorised access, often accidental but nonetheless serious. A casual conversation in the corridor mentioning a child’s social services involvement. A screen left unlocked during a lunchtime rush. A well-intentioned parent volunteer glimpsing confidential student data while helping out in the office. No malicious intent, but the boundaries blur all the same.

Loss of data is just as often physical as it is digital. Laptops go missing, USB drives slip between sofa cushions, old filing cabinets are emptied into bins without checking the contents. One secondary school discovered that its entire archive of paper attendance logs had been mistakenly shredded during a storage room clear-out. Years of data, gone in an afternoon.

These are not stories of negligence or malice. They’re stories of busy people juggling multiple responsibilities, making small decisions in a fast-paced environment. But these small moments, multiplied across a school, can create quiet cracks in data protection, cracks where trust can quietly seep away.

So, how do we raise awareness without creating fear?

First, we shift the way we talk about data breaches. Instead of painting them as rare and technical, we frame them as something all of us have a role in preventing. This isn’t about panicking over paperwork, it’s about embedding a culture of care, where data is treated with the same respect as student safety or wellbeing.

Telling real, relatable stories helps. A GDPR workshop might prompt yawns, but a discussion about how a misdirected email affected a vulnerable student lands very differently. It’s not about ticking boxes; it’s about protecting relationships.

Second, we make it safe for staff to report near misses. Too often, people worry about being blamed or embarrassed. But a culture of openness turns mistakes into opportunities to learn and improve. A teacher who admits to accidentally sharing the wrong document helps the whole school get better.

And finally, we remember that leadership sets the tone. When senior staff prioritise data protection not just as compliance but as a matter of integrity, it filters down. When they model good habits – locking screens, questioning poor practices, inviting feedback – it becomes part of the school’s DNA.

In schools, trust is everything. Families trust staff to keep their children safe, not just physically, but emotionally and digitally too. That trust isn’t only built through grand gestures; it’s upheld in the smallest details. How we store, share, and respect the information we hold.

Because in the end, a breach doesn’t have to be loud to be damaging. It can be a file lost, a conversation overheard, a folder carelessly edited. And protecting against those quiet breaches is one of the responsibilities we all share.

In today’s hyperconnected world, it’s not uncommon for children to be more tech-savvy than the adults around them. But behind every cute dance video or viral meme lies a sophisticated system of data collection and recommendation algorithms, many of which are now under scrutiny.

Recently, the UK’s Information Commissioner’s Office (ICO) launched formal investigations into three popular platforms – TikTok, Reddit, and Imgur – over concerns about how they handle the personal data of UK children aged 13 to 17. For educators and parents, this is a timely reminder: understanding the digital environments children are immersed in is no longer optional, it’s essential.

TikTok: Personalised, But At What Cost?

TikTok has become a fixture in many young people’s daily lives. Its algorithm seems almost magical in the way it recommends content. But the ICO is now investigating how that “magic” works when it comes to children’s personal information.

Is TikTok collecting too much data? Are its recommendation systems steering children toward inappropriate or harmful content? These are some of the key questions the ICO aims to answer.

This isn’t the first time TikTok has come under fire. In 2023, it was fined £12.7 million for unlawfully processing children’s data. The platform says it has since improved its safeguards, but the ICO is making sure those promises hold up under inspection.

Reddit and Imgur: How Do They Know a User’s Age?

Unlike TikTok, Reddit and Imgur aren’t in trouble for what they’re showing children, but rather how they determine whether someone is a child in the first place.

Currently, many platforms rely on self-declared age checks, systems that are easy for children to bypass. The ICO is now investigating whether Reddit and Imgur have adequate age verification in place to prevent underage users from accessing potentially inappropriate content.

Reddit has acknowledged the issue and says it plans to implement stronger age verification. Imgur has not yet commented publicly.

The Children’s Code: Why It Matters

These investigations are part of a broader regulatory effort called the Children’s Code, introduced in 2021. The Code sets out 15 standards that online services must meet to ensure children’s personal data is protected.

At its heart, the Code is built on a simple principle: what’s best for the child must come first.

That means:

  • Minimising data collection
  • Avoiding manipulative design
  • Clearly explaining how data is used
  • Ensuring privacy settings are age-appropriate by default

As adults, we play a critical role in helping children navigate digital spaces safely. The ICO’s investigations are an important step toward greater accountability, but regulation alone isn’t enough.

Here are a few ways you can help:

Start the Conversation

Ask children what apps they use and how they feel about them. Make tech a shared topic, not a private one.

Teach Critical Thinking

Encourage young people to question why they’re being shown certain content. What’s the platform hoping they’ll do next: watch more, click something, buy something?

Stay Informed

Keep up with digital safety guidance from trusted sources like the ICO, NSPCC, and Childnet.

Use Tools and Settings

Explore built-in safety and privacy controls on apps your child uses. These can often be customised to offer better protection.

The internet can be a wonderful place for learning, creativity, and connection. But it must also be a safe and respectful space for children. The ICO’s investigations send a clear message to tech companies: if you want to benefit from children’s attention, you must also earn their trust.

Let’s continue working together, as parents, teachers, and guardians, to ensure that the digital world treats our children with the care, respect, and dignity they deserve.

GDPR Sentry can help you fill the knowledge gap

As cyber threats escalate across all sectors, UK schools have become increasingly frequent targets. From ransomware attacks that cripple entire networks to phishing campaigns aimed at staff and suppliers, the education sector is now considered a prime target for cybercriminals. According to the UK’s National Cyber Security Centre (NCSC), the volume and sophistication of attacks on schools have been rising steadily since 2020, with 2024 seeing one of the highest annual surges yet.

For Data Protection Leads (DPLs) and school administrators, the message is clear: safeguarding your community’s data is no longer just a GDPR requirement, it’s a critical frontline defence for education continuity, reputation, and trust.

Why Are Schools in the UK Being Targeted?

Schools manage vast amounts of sensitive data, from student records and safeguarding information to financial details and health data, all governed under the UK GDPR. At the same time, many schools rely on legacy systems, under-resourced IT infrastructure, or lack full-time cybersecurity expertise.

Threat actors know this. They exploit gaps in security awareness, outdated software, and insufficient incident response planning.

In recent months, several UK schools have reported:

  • Ransomware attacks that encrypted entire networks, halting teaching and learning for days
  • Phishing scams impersonating school leadership or DfE officials
  • Data breaches that triggered investigations by the Information Commissioner’s Office (ICO)
  • DDoS attacks during exam periods, disrupting access to remote systems

These aren’t hypothetical risks, they’re happening now.

What Can UK Schools Do to Minimise Risk?

As a DPL or school leader, you’re in a unique position to lead both compliance and culture. Below are seven actionable steps to significantly strengthen your school’s cybersecurity posture:

  1. Embed Cybersecurity into Data Protection Training

Cybersecurity and data protection go hand in hand. Ensure all staff, including teaching assistants and office staff, receive regular, mandatory training on:

  • Identifying phishing emails
  • Secure handling of pupil data
  • Using strong, unique passwords
  • What to do if they click on something suspicious
  2. Implement Multi-Factor Authentication (MFA) Across Systems

If you’re still relying on password-only sign-in for MIS, payroll, or email systems, now is the time to act. MFA drastically reduces the risk of unauthorised access, especially in cloud-based systems like Google Workspace for Education or Microsoft 365.

  3. Keep Software and Devices Updated

Cybercriminals often exploit outdated software with known vulnerabilities. Set systems to automatically install updates where possible. This includes:

  • Operating systems (Windows, macOS, ChromeOS)
  • Web browsers
  • MIS and safeguarding software
  • Antivirus and firewall tools

Work with your IT provider to audit devices used by staff working remotely.

  4. Back Up Data Securely and Test Recovery

Regular, encrypted backups stored separately from your main network can be the difference between recovery and disaster. But backups only help if they’re tested.

Schedule termly backup recovery tests with your IT team or managed service provider.
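
One small, automatable part of that test is checking that a restored file matches the original byte for byte. A minimal sketch, with illustrative paths:

```python
# Sketch: verify a restored file against the original by comparing
# SHA-256 hashes. Both paths are invented examples.
import hashlib
from pathlib import Path

def sha256(path: str) -> str:
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

original = sha256(r"\\server\mis\attendance_2024.xlsx")
restored = sha256(r"D:\restore_test\attendance_2024.xlsx")

print("Restore verified" if original == restored else "MISMATCH - investigate")
```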

  5. Review Third-Party Data Sharing

Many schools use third-party edtech tools. Ensure that suppliers:

  • Comply with UK GDPR
  • Have robust cybersecurity practices
  • Are listed in your Record of Processing Activities (ROPA)

Review contracts and data sharing agreements annually.

  6. Create and Test an Incident Response Plan

If your school is attacked, how will you respond? Who will inform the ICO, parents, or the DfE? Your incident response plan should include:

  • Clear roles and responsibilities (including DPL, Headteacher, IT lead)
  • Communication templates
  • Steps for isolation, containment, and recovery
  • A reporting mechanism to the ICO within 72 hours (as required under UK GDPR)
  7. Promote a ‘Whole School’ Security Culture

Cybersecurity isn’t just an IT issue; it’s an organisational culture issue. Consider adding cybersecurity awareness to staff induction, governor briefings, and safeguarding policies.

Security Is a Shared Responsibility

The NCSC and DfE have made clear that schools must prioritise cyber resilience alongside safeguarding and curriculum planning. For DPLs and school administrators, this means moving beyond compliance and toward proactive, strategic risk management.

The stakes are high, but so is the opportunity to lead. By investing in prevention, awareness, and preparation, your school can protect both its people and its purpose.

Imagine you’re sat in the staffroom, half-eaten biscuit in one hand, half-finished student spreadsheet on the screen. You scroll down a list of names, notes, and numbers, some of which, to your horror, date back to the year David Cameron was still Prime Minister.

You mutter to yourself, Why do we still have this data? Is this even legal?

Well, you’re in luck. Help may be on the horizon – wrapped in bureaucracy, yes, but still help.

The UK’s Data (Use and Access) Bill is the government’s latest attempt to bring data protection into the 21st century without sending educators and administrators into panic-induced paper purges. So, what does it mean for you and your school? Let’s unpack it without the legalese, and with your sanity in mind.

So… What Is This Bill Actually About?

At its core, the Data (Use and Access) Bill (or DUAB, if you’re into acronyms) is about striking a balance between making better use of data and protecting people’s rights, especially children’s.

It’s part of the government’s broader post-Brexit effort to move away from some of the more rigid elements of EU GDPR, while still holding onto the values that matter: transparency, safety, and accountability. Think of it as GDPR’s sensible cousin, still serious, but a little more practical in the school setting.

Why Should Schools Care?

Here’s the thing: schools are absolute treasure troves of personal data. From safeguarding notes and behavioural logs to dinner money apps and biometric attendance systems, they gather data like Year 7s gather Pokémon cards, except there’s legal liability attached.

This Bill is nudging us gently but firmly towards smarter, clearer, and more responsible data use. For instance, it’s placing extra emphasis on how we use children’s data in digital tools and platforms. That means reviewing whether educational software is using personal information appropriately, and not quietly siphoning it off to train some mysterious AI model in the background.

Also under the spotlight? Automated decision-making. If you’ve ever wondered whether a student’s algorithmic “progress tracker” is making assumptions you wouldn’t make as a teacher… well, the DUAB has your back. It demands transparency and human oversight when important decisions are being made based on data. Because let’s face it, no algorithm knows your pupils like you do.

But Wait, There’s More…

One of the big ideas in the DUAB is around data retention. Remember that ancient spreadsheet I mentioned earlier? Under the Bill, keeping data “just in case” won’t cut it anymore. Schools will need clear justifications for how long data is kept and must be able to show they’re not hoarding it unnecessarily. It’s like a spring clean, but for your school server.

The Bill also introduces measures to simplify compliance. For schools, this could mean fewer hoops to jump through when working with third-party apps or local authorities, as long as the data use aligns with the public good and proper protections are in place.

So, What Should We Do Now?

First off, don’t panic. This Bill isn’t a ticking time bomb. It’s more of a nudge to think seriously about how we treat data in our schools and to embed that into our day-to-day decision-making.

It’s a good time to:

  • Talk to your school’s Data Protection Officer (you know, the one who pops up every year reminding everyone about GDPR).
  • Review your school’s data retention schedule – are you keeping stuff longer than necessary?
  • Ask questions about any new edtech platforms you’re trialling. Are they transparent? Safe for students? Do they actually need all the information they’re collecting?

And finally, keep the conversation going. Data protection isn’t just a compliance issue, it’s about trust. Parents trust us with their children. Students trust us with their futures. Managing their data responsibly is part of honouring that trust.

Ultimately, this isn’t just a policy update, it’s a cultural shift. The DUAB reminds us that data is more than a digital asset. It’s personal. It’s powerful. And in education, it’s deeply human.

So next time you open a spreadsheet that hasn’t been touched since the last Ofsted inspection, take a moment. Ask yourself not just “Do we need this?” but also “Is keeping this still respectful to the person behind the data?”

Because in the classroom, in the office, or even in the server room, one truth remains: good education starts with good ethics.

It’s not every day you hear about global privacy treaties in the staffroom. Between lesson plans, playground duties, and wondering why the photocopier only jams when you’re in a hurry, international data agreements don’t always make it onto the radar.

But every now and then, something big happens that’s worth pausing for and, Global CAPE is one of those somethings.

So let’s break it down. No jargon, no legal waffle, just a clear, narrative explanation of what’s going on and why it matters to schools.

First Things First: What on Earth is Global CAPE?

Global CAPE stands for the Global Cooperation Arrangement for Privacy Enforcement. Think of it like an international support group but for data protection authorities. It’s a framework that allows privacy regulators from different countries to work together more easily, especially when investigating cross-border data misuse.

Why is this needed? Because in 2024, data doesn’t stay local. A student might use an education app built in California, hosted in Ireland, with customer support in Singapore. If something goes wrong with how that app handles personal information, who’s in charge?

That’s where Global CAPE comes in. It makes it easier for regulators to:

  • Share information securely,
  • Cooperate on investigations, and
  • Support each other when tackling global privacy issues.

In short, it gives watchdogs more teeth and a few more colleagues to back them up.

So, Why Has the ICO Joined?

Earlier this year, the UK’s Information Commissioner’s Office (ICO) officially joined Global CAPE. It’s part of a growing list of data protection authorities who recognise that privacy is no longer a purely domestic issue.

For the ICO, this move means:

  • More muscle in investigating international companies that may misuse UK citizens’ data.
  • A seat at the table when setting expectations for global data handling, especially for emerging tech.
  • Better collaboration with other countries to address risks affecting UK children and families.

As the ICO put it, this step helps ensure that UK citizens remain protected, even when their data travels the world. And let’s be honest, when it comes to edtech and online tools in schools, data is travelling the world.

Who Else Is in This Data Protection Dream Team?

The list of members includes privacy authorities from:

  • The United States (particularly the Department of Commerce)
  • Australia
  • Canada
  • Japan
  • South Korea
  • Mexico
  • The Philippines
  • Singapore
  • And now, the United Kingdom

It’s a truly international group and one that continues to grow.

Why Should Educators Care?

At this point, you might be thinking: Okay, but how does this affect me, the classroom teacher or school leader?

Here’s the link.

As more schools adopt cloud-based learning platforms, communication apps, and AI-powered tools, the question of where data goes and who is accountable when things go wrong is more important than ever.

When you use an app to track student behaviour, does the data stay in the UK? What happens if that company is based abroad and suffers a breach?

Global CAPE means the ICO can now collaborate more effectively with overseas regulators. That adds a layer of reassurance for schools, especially those using international tools and platforms.

And for school leaders, it sends a gentle but important signal that data protection isn’t just a tick-box, it’s a global issue. The choices you make around platforms, permissions, and parental consent really do matter.

In the grand scheme of education, Global CAPE might not change your day-to-day immediately. But it’s part of a wider story: one where governments are finally realising that digital rights, especially for children, need international protection.

And while your focus may rightly stay on helping students grow, learn, and thrive, you can also be confident that the data trail they leave behind is being watched over by more than just your school server.

Because privacy doesn’t stop at the school gate anymore – and now, neither does enforcement.

You’ve probably heard it by now, EdTech is booming. From lesson-planning AI to real-time behaviour tracking, schools across the UK are embracing technology faster than ever. Whether it’s a new learning app or a full-blown Management Information System (MIS), the promise is the same: smarter classrooms, less admin, and happier teachers.

But here’s the thing, every time we bring in a new tool, we also bring in new responsibilities, especially when it comes to data protection. For schools, ensuring any technology used complies with the UK General Data Protection Regulation (UK GDPR) is not just good practice; it’s a legal obligation.

EdTech solutions, especially those powered by AI, often rely on vast quantities of pupil data to function effectively. This may include:

  • Personal identifiers (e.g., name, date of birth, student ID)
  • Behavioural data (e.g., clicks, interactions)
  • Academic records and performance metrics
  • Special educational needs (SEN) information

If not properly safeguarded, the processing of such data can expose schools to legal, reputational, and ethical risks.

Let’s walk through what this means in practice, and how school leaders and Data Protection Officers (DPOs) can make sure their school stays compliant with UK GDPR.

A Quick Story: “We’re Getting a New MIS!”

Imagine this:

A secondary school in Manchester is rolling out a shiny new MIS platform. It promises everything: attendance tracking, timetabling, safeguarding notes, SEND support, and even parent communications, all under one digital roof.

Everyone’s excited. The SLT’s impressed. The IT manager loves the interface. Staff are dreaming of fewer spreadsheets. But then the DPO raises a hand:

“Have we done a data protection impact assessment yet?”

Cue the room going quiet.

This scenario plays out more often than you’d think. New tech comes in fast, but data protection often lags behind, or worse, gets missed entirely. So how do we avoid that?

Step 1: Start with the Right Questions

Before rolling out any new EdTech or AI tool, ask:

  • What kind of data will this tool collect?
  • Where will that data be stored, and for how long?
  • Has the supplier given us a clear privacy notice?
  • Do we need a Data Protection Impact Assessment (DPIA)?

(Hint: if the system processes special category data or monitors students at scale, as most MIS platforms do, the answer is almost certainly yes.)

Step 2: Pre-Vetting Checks – Your EdTech Compliance Toolkit

Whether you’re reviewing a new reading app or a full MIS, these checks will help you make sure the supplier is up to standard:

Data Processing Agreement (DPA)

Every third-party supplier must sign a DPA with your school. It should clearly lay out:

  • What data is being processed and why
  • Who is responsible for what
  • How long the data is kept
  • What happens at the end of the contract

Lawful Basis

Can the supplier justify why they’re processing pupil data? Schools usually rely on public task, but some EdTech tools, especially optional ones, may need consent. Be wary if it’s not clear.

Data Minimisation

Does the tool only collect what it needs? Or is it asking for extra fields “just in case”? Push back on anything that feels excessive.

Hosting and Security

Is the data stored in the UK or a country with an adequacy decision? Ask if they have:

  • Encryption at rest and in transit
  • Access controls
  • ISO 27001 or equivalent certifications
  • A breach response process

Transparency for Pupils and Parents

Can parents understand what data is collected and why? Suppliers should provide plain-English privacy policies, and so should your school.

Rights and Deletion

Can users (or the school) delete data easily if needed? Are retention periods clearly set out?

Step 3: Don’t Forget AI-Specific Risks

AI tools in EdTech often involve profiling or automated decision-making. Before using them:

  • Ask how the algorithms work (and whether human oversight is possible)
  • Check whether the tool could make significant decisions about students, like predicting attainment levels, or flagging safeguarding risks
  • Make sure pupils’ rights under Article 22 (automated decision-making) are respected

Step 4: Review Existing Tools Too

It’s not just about new tech. Many schools have tools they’ve used for years that may no longer meet today’s standards. Schedule regular audits to:

  • Check for feature creep (new functions = new risks)
  • Revisit supplier agreements
  • Reassess DPIAs
  • Make sure any changes to data use are reflected in your privacy notices

Let’s Get the Balance Right

We all want to give our pupils the best experience, and sometimes that means embracing innovation. But good data protection isn’t about blocking progress. It’s about asking the right questions before a breach or complaint happens.

As a DPO or senior leader, you don’t have to say no to every new tool. You just need to make sure the supplier (and the school) are doing things properly, within the law, ethically, and with children’s best interests in mind.

Remember: If in doubt, ask. Talk to your local authority, your MAT data protection lead, or a privacy professional. Protecting pupil data is everyone’s responsibility and with a little due diligence, your school can be both innovative and compliant.