Posts

GDPR Sentry can help you fill the knowledge gap

As cyber threats escalate across all sectors, UK schools have become increasingly frequent targets. From ransomware attacks that cripple entire networks to phishing campaigns aimed at staff and suppliers, the education sector is now considered a prime target for cybercriminals. According to the UK’s National Cyber Security Centre (NCSC), the volume and sophistication of attacks on schools have been rising steadily since 2020, with 2024 seeing one of the highest annual surges yet.

For Data Protection Leads (DPLs) and school administrators, the message is clear: safeguarding your community’s data is no longer just a GDPR requirement; it’s a critical frontline defence for education continuity, reputation, and trust.

Why Are Schools in the UK Being Targeted?

Schools manage vast amounts of sensitive data, from student records and safeguarding information to financial details and health data, all governed under the UK GDPR. At the same time, many schools rely on legacy systems, under-resourced IT infrastructure, or lack full-time cybersecurity expertise.

Threat actors know this. They exploit gaps in security awareness, outdated software, and insufficient incident response planning.

In recent months, several UK schools have reported:

  • Ransomware attacks that encrypted entire networks, halting teaching and learning for days
  • Phishing scams impersonating school leadership or DfE officials
  • Data breaches that triggered investigations by the Information Commissioner’s Office (ICO)
  • DDoS attacks during exam periods, disrupting access to remote systems

These aren’t hypothetical risks; they’re happening now.

What Can UK Schools Do to Minimise Risk?

As a DPL or school leader, you’re in a unique position to lead both compliance and culture. Below are seven actionable steps to significantly strengthen your school’s cybersecurity posture:

  1. Embed Cybersecurity into Data Protection Training

Cybersecurity and data protection go hand in hand. Ensure all staff, including teaching assistants and office staff, receive regular, mandatory training on:

  • Identifying phishing emails
  • Secure handling of pupil data
  • Using strong, unique passwords
  • What to do if they click on something suspicious

  2. Implement Multi-Factor Authentication (MFA) Across Systems

If you’re still relying on password-only logins for MIS, payroll, or email systems, now is the time to act. MFA drastically reduces the risk of unauthorised access, especially in cloud-based systems like Google Workspace for Education or Microsoft 365.

  3. Keep Software and Devices Updated

Cybercriminals often exploit outdated software with known vulnerabilities. Set systems to automatically install updates where possible. This includes:

  • Operating systems (Windows, macOS, ChromeOS)
  • Web browsers
  • MIS and safeguarding software
  • Antivirus and firewall tools

Work with your IT provider to audit devices used by staff working remotely.

  4. Back Up Data Securely and Test Recovery

Regular, encrypted backups, stored separately from your main network, can be the difference between recovery and disaster. But backups only help if they’re tested.

Schedule termly backup recovery tests with your IT team or managed service provider.

  5. Review Third-Party Data Sharing

Many schools use third-party edtech tools. Ensure that suppliers:

  • Comply with UK GDPR
  • Have robust cybersecurity practices
  • Are listed in your Record of Processing Activities (ROPA)

Review contracts and data sharing agreements annually.

  6. Create and Test an Incident Response Plan

If your school is attacked, how will you respond? Who will inform the ICO, parents, or the DfE? Your incident response plan should include:

  • Clear roles and responsibilities (including DPL, Headteacher, IT lead)
  • Communication templates
  • Steps for isolation, containment, and recovery
  • A reporting mechanism to the ICO within 72 hours (as required under UK GDPR)

  7. Promote a ‘Whole School’ Security Culture

Cybersecurity isn’t just an IT issue; it’s an organisational culture issue. Consider adding cybersecurity awareness to staff induction, governor briefings, and safeguarding policies.

Security Is a Shared Responsibility

The NCSC and DfE have made clear that schools must prioritise cyber resilience alongside safeguarding and curriculum planning. For DPLs and school administrators, this means moving beyond compliance and toward proactive, strategic risk management.

The stakes are high, but so is the opportunity to lead. By investing in prevention, awareness, and preparation, your school can protect both its people and its purpose.

12 Months, 30+ Enhancements, and One Seriously Upgraded System – Here’s Sentry’s Year in Review

Over the past year, we’ve been busy building, refining, and supercharging your system with over 30 significant improvements, each one designed to make your experience more powerful, more intuitive, and, dare we say, a little more enjoyable. Whether you’ve spotted the subtle tweaks or just logged in one day to find everything working a bit more beautifully, the last 12 months have been all about taking your feedback and transforming it into action.

Here are some of the headline acts from this year’s development setlist:

‘Key Statistics’ Feature – Consider this your new go-to dashboard. Real-time, at-a-glance insights that turn your data into clear graphs without the spreadsheet headache.

New Audit Log – Ever wondered who changed what and when? Wonder no more. Our new audit log feature provides a crystal-clear trail of new entries, so you can track activity like a pro.

Multi-Factor Authentication (MFA) – Because security shouldn’t be optional. You can now enable MFA for that extra layer of protection. Peace of mind, powered by best practice.

System User Manual Integration – The latest guidance is now right where you need it: inside the system. No more digging through emails or outdated PDFs. If it’s changed, you’ll know it.

System-Wide Refinements – We’ve fine-tuned just about everything: updated reporting options, smarter selection fields, improved user management tools, and enhancements across every corner of the system. Cleaner layouts, better flows, and more “oh-that’s-useful” moments.

These updates are more than just tweaks; they’re part of our ongoing commitment to giving you a system that grows with your needs and evolves with the times.

To those of you who’ve sent feedback, asked the tough questions, or pointed out the things that could be even better – thank you. You’ve helped shape every one of these updates, and we’re excited for what’s coming next.

So here’s to progress, partnership, and platforms that just keep getting better.

You’ve probably heard it by now: EdTech is booming. From lesson-planning AI to real-time behaviour tracking, schools across the UK are embracing technology faster than ever. Whether it’s a new learning app or a full-blown Management Information System (MIS), the promise is the same: smarter classrooms, less admin, and happier teachers.

But here’s the thing: every time we bring in a new tool, we also bring in new responsibilities, especially when it comes to data protection. For schools, ensuring any technology used complies with the UK General Data Protection Regulation (UK GDPR) is not just good practice; it’s a legal obligation.

EdTech solutions, especially those powered by AI, often rely on vast quantities of pupil data to function effectively. This may include:

  • Personal identifiers (e.g., name, date of birth, student ID)
  • Behavioural data (e.g., clicks, interactions)
  • Academic records and performance metrics
  • Special educational needs (SEN) information

If not properly safeguarded, the processing of such data can expose schools to legal, reputational, and ethical risks.

Let’s walk through what this means in practice, and how school leaders and Data Protection Officers (DPOs) can make sure their school stays compliant with UK GDPR.

A Quick Story: “We’re Getting a New MIS!”

Imagine this:

A secondary school in Manchester is rolling out a shiny new MIS platform. It promises everything: attendance tracking, timetabling, safeguarding notes, SEND support, and even parent communications, all under one digital roof.

Everyone’s excited. The SLT’s impressed. The IT manager loves the interface. Staff are dreaming of fewer spreadsheets. But then the DPO raises a hand:

“Have we done a data protection impact assessment yet?”

Cue the room going quiet.

This scenario plays out more often than you’d think. New tech comes in fast, but data protection often lags behind, or worse, gets missed entirely. So how do we avoid that?

Step 1: Start with the Right Questions

Before rolling out any new EdTech or AI tool, ask:

  • What kind of data will this tool collect?
  • Where will that data be stored, and for how long?
  • Has the supplier given us a clear privacy notice?
  • Do we need a Data Protection Impact Assessment (DPIA)?

(Hint: if the system processes special category data or monitors students at scale, as most MIS platforms do, the answer is almost certainly yes.)

Step 2: Pre-Vetting Checks – Your EdTech Compliance Toolkit

Whether you’re reviewing a new reading app or a full MIS, these checks will help you make sure the supplier is up to standard:

Data Processing Agreement (DPA)

Every third-party supplier must sign a DPA with your school. It should clearly lay out:

  • What data is being processed and why
  • Who is responsible for what
  • How long the data is kept
  • What happens at the end of the contract

Lawful Basis

Can the supplier justify why they’re processing pupil data? Schools usually rely on public task, but some EdTech tools, especially optional ones, may need consent. Be wary if it’s not clear.

Data Minimisation

Does the tool only collect what it needs? Or is it asking for extra fields “just in case”? Push back on anything that feels excessive.

Hosting and Security

Is the data stored in the UK or a country with an adequacy decision? Ask if they have:

  • Encryption at rest and in transit
  • Access controls
  • ISO 27001 or equivalent certifications
  • A breach response process

Transparency for Pupils and Parents

Can parents understand what data is collected and why? Suppliers should provide plain-English privacy policies, and so should your school.

Rights and Deletion

Can users (or the school) delete data easily if needed? Are retention periods clearly set out?

Step 3: Don’t Forget AI-Specific Risks

AI tools in EdTech often involve profiling or automated decision-making. Before using them:

  • Ask how the algorithms work (and whether human oversight is possible)
  • Check whether the tool could make significant decisions about students, like predicting attainment levels, or flagging safeguarding risks
  • Make sure pupils’ rights under Article 22 (automated decision-making) are respected

Step 4: Review Existing Tools Too

It’s not just about new tech. Many schools have tools they’ve used for years that may no longer meet today’s standards. Schedule regular audits to:

  • Check for feature creep (new functions = new risks)
  • Revisit supplier agreements
  • Reassess DPIAs
  • Make sure any changes to data use are reflected in your privacy notices

Let’s Get the Balance Right

We all want to give our pupils the best experience, and sometimes that means embracing innovation. But good data protection isn’t about blocking progress. It’s about asking the right questions before a breach or complaint happens.

As a DPO or senior leader, you don’t have to say no to every new tool. You just need to make sure the supplier (and the school) are doing things properly, within the law, ethically, and with children’s best interests in mind.

Remember: If in doubt, ask. Talk to your local authority, your MAT data protection lead, or a privacy professional. Protecting pupil data is everyone’s responsibility and with a little due diligence, your school can be both innovative and compliant.

Let’s face it, school leaders today wear a lot of hats.

One minute, you’re supporting staff wellbeing; the next, you’re signing off on an EdTech contract, responding to a Subject Access Request, or checking if the new Wi-Fi rollout has encryption (whatever that means, right?).

In today’s increasingly digital world, two terms often crop up: Information Management (or data protection) and Information Security.

They sound similar, and they are closely connected, but they’re not the same. Both are essential in keeping your school’s data safe, legal, and well-managed. Understanding the difference can help you ask the right questions, delegate responsibilities wisely, and build a strong culture of trust and compliance across your school.

Let’s unpack what each one means, and why both matter equally.

What’s the Difference?

Information Management (Data Protection)

This is about how personal data is collected, used, stored, shared and deleted in line with laws like the UK GDPR and the Data Protection Act 2018. It’s focused on the rights of individuals (like pupils, parents and staff) and ensuring their personal data is treated fairly and lawfully.

Think of it as the “legal and ethical brain” behind how information flows through your school.

Examples:

  • Making sure parental consent is collected for use of a pupil’s photo
  • Responding to Subject Access Requests within the required timeframe
  • Having a clear retention schedule (so you’re not holding on to pupil data for 25 years “just in case”)
  • Ensuring only authorised staff can access safeguarding notes or health records

Lead roles: Usually the Data Protection Officer (DPO) or a senior leader with compliance responsibilities.

Information Security

Information security is about the technical and organisational measures you take to protect information from loss, damage, unauthorised access, or theft, whether it’s stored on paper, a laptop, or in the cloud.

It’s the “digital and physical shield” that keeps your systems and data safe.

Examples:

  • Encrypting devices and backing up files
  • Using strong passwords and locking screens
  • Preventing ransomware attacks
  • Ensuring staff don’t email personal data to the wrong recipient

Lead roles: Typically your IT manager, network team, or a designated security officer, often working closely with the DPO.

Aren’t They Completely Separate?

Not quite.

While information management and information security are distinct disciplines with different focuses, they are both key components of compliance under the UK GDPR.

In fact, the UK GDPR specifically requires organisations (including schools) to:

  • Process personal data lawfully, fairly and transparently (that’s information management)
  • Implement appropriate technical and organisational measures to keep data secure (that’s information security)
  • Be able to demonstrate accountability across both areas

So while they each require different expertise, they’re two sides of the same coin when it comes to protecting personal data.

If you have a great privacy policy but your systems are wide open to cyber threats, you’re not GDPR compliant. And vice versa: even bulletproof IT security can’t cover for poor practices around data sharing, consent, or retention.

Why You Need Both

Let’s say that your school introduces a new wellbeing platform for pupils.

  • You’ve reviewed its privacy notice
  • You’ve completed a DPIA
  • You’ve told parents how the data will be used

But…

  • Staff are accessing it using shared logins
  • The password is “admin123”
  • You haven’t enabled two-factor authentication

You’ve done your information management well but failed on information security. That could still result in a data breach.

On the flip side, imagine the platform is highly secure: encrypted, password-protected, and hosted in the UK. But the school didn’t check the legal basis for processing or review the contract terms.

Now you’ve got a data protection problem.

Bottom line?
You need both working together to meet your responsibilities, legally and ethically.

 

So What Does Good Practice Look Like in a School?

Here’s a blended checklist of best practices to help keep your school safe, compliant and prepared:

Do Regular Data Audits

  • Know what personal data you hold, where it’s stored, why you need it, and how long you’re keeping it.
  • Review systems, spreadsheets, email lists, and apps, not just paper records.

Train Staff in Both Areas

  • Teach all staff the basics of data protection and information security, from recognising phishing emails to understanding how to respond to a Subject Access Request.
  • Tailor training for higher-risk roles (e.g. safeguarding, admin, SEND).

Lock It Down

  • Use strong passwords, screen locks, and encrypted devices.
  • Remove access for staff who no longer need it (or have left the school).
  • Consider multi-factor authentication for sensitive systems like MIS or safeguarding platforms.

Review and Share Clear Policies

  • Acceptable use, email and internet use, breach reporting, retention and disposal policies: these shouldn’t be buried on your intranet.
  • Keep them short, practical, and jargon-free.

Don’t Keep Data “Just in Case”

  • Apply your retention schedule and securely delete or archive data once it’s no longer needed.
  • Shred paper records and securely wipe devices.

Be Breach-Aware

  • Know what a breach is (it’s not just hacking; sending data to the wrong person counts too!)
  • Have a simple breach reporting process that all staff understand
  • Keep a breach log and review it regularly with SLT and your DPO

Shared Responsibility

Data protection isn’t just the DPO’s job. And security isn’t just for the IT team.

Every member of staff has a part to play, from the headteacher to the lunchtime supervisor. By making these two areas part of your everyday school culture, you create a safer environment for your staff, your students, and your wider community.

If you’re not sure where your school stands on data protection or security, you’re not alone. Many schools benefit from a joint review with their DPO, IT team, and SLT, looking at risks, roles, and readiness.

If you’re looking to strengthen both areas, consider:

  • Running a joint INSET session on “Data Protection + Cyber Hygiene”
  • Reviewing your breach log together with IT and DPO staff

  • Booking an external audit of your information governance and security setup

Over the past 12 months, we’ve been busy behind the scenes, rolling out exciting new features, fine-tuning your favourite tools, and delivering enhancements that make your day-to-day work a little bit smoother (and a lot more efficient). Whether it’s a shiny new module or subtle improvements to existing systems, we’ve been listening to your feedback and turning it into action.

So, what’s new in our tech treasure trove?

  • A brand-new User Manual to guide you through every step
  • An incredible new Complaints module to streamline your processes
  • New data fields in the reporting of SARs, Breaches, and FOIs to give you even more control and clarity

Oh, and if you’ve ever wished for a cleaner, more intuitive layout, you’ll love the updated Group Summary section.

Plus, for those “I need help right now” moments, we’ve made sure you’re covered with fully comprehensive help sections that are always just a click away.

Intrigued? Want to know more?

Existing customers can contact their dedicated Customer Success Manager for more information on all of the latest updates, or even a “show and tell”.

If you’re new to Sentry, why not book a demo of our system? It’s quick and simple: just click on “book a demo” on the homepage and choose a date and time that suits you.

There’s a certain kind of email that arrives in a school inbox that immediately raises eyebrows. It starts with something like:

“Exciting news! We’re trialling new biometric scanners in the canteen to speed up lunch queues!”

It’s followed by promises of efficiency, reduced lunch line chaos, and fewer forgotten PINs. On the surface, it sounds brilliant. Who wouldn’t want a futuristic solution to an age-old problem?

But here’s the thing: before you ask a group of eleven-year-olds to hand over their fingerprints for a chicken nugget, you need to stop and ask a bigger question… Have we done a Data Protection Impact Assessment (DPIA)?

You may wonder why it is so important. A DPIA isn’t just some bureaucratic hoop to jump through. It’s a vital safeguard designed to help schools understand how a new system or process might affect people’s privacy, especially when you’re dealing with sensitive or high-risk data.

In schools, we hold data about children who are arguably some of the most vulnerable individuals in society. Introducing new tech that collects biometric data (like fingerprints or facial recognition) raises serious privacy concerns. Biometric data is classed as “special category data” under the UK GDPR, which means it requires extra care and justification.

A DPIA helps you figure out: what data is being collected, why you need it, what risks it poses to individuals, and how to mitigate those risks. Even more crucially, it helps you decide whether the shiny new system is really necessary in the first place.

Let’s return to that canteen scanner idea. The supplier promises that fingerprinting pupils will slash queue times and reduce cash handling. Sounds efficient, right?

But have we asked:

  • Do we really need biometric data for this?
  • Could a swipe card or QR code achieve the same result with less risk?
  • What happens if a student refuses to give their fingerprint?
  • How securely will this data be stored, and who can access it?

Without a DPIA, these questions may never even surface.

Or take another example: your school is rolling out a new online safeguarding tool that uses artificial intelligence to flag potential risks based on student writing. Impressive? Maybe. Intrusive? Potentially. A DPIA would help you assess whether the tool’s benefits outweigh the privacy implications, and what safeguards should be in place.

Remember… behind every “data point” is a real child. Their birthday. Their behaviour record. Their image. Their fingerprint.

A DPIA isn’t about red tape. It’s about respecting the trust families place in us. It’s about making thoughtful, informed choices, not just because it’s the law, but because it’s the right thing to do.

And honestly, it’s also about protecting your school. If things go wrong, if a data breach happens, or parents push back, a completed DPIA shows you took privacy seriously. It shows you were proactive, not reactive.

A Culture Shift, Not a Paper Exercise

The best schools aren’t just doing DPIAs to tick a box. They’re building a culture where people ask early on:

“Could this new system affect how we handle personal data?”

“Do we need to speak to the Data Protection Officer before we go ahead?”

“Have we thought this through, not just for us, but for our students?”

That’s where real digital responsibility begins. Not in a policy document, but in everyday conversations.

So next time someone suggests a new app, platform, or process… pause. Before you roll it out, before the training sessions and the excited emails, check whether a DPIA is needed.

Because in a world where data is power, doing a DPIA is how we wield that power wisely. Not to impress with tech, not to dazzle with dashboards, but to protect, to consider, and to educate with integrity.

Anyone involved in last year’s exam grade saga probably harbours a level of resentment against algorithms. 

The government formula was designed to standardise grades across the country. Instead, it affected students disproportionately, raising grades for students in smaller classes and more affluent areas. Conversely, students in poorer-performing schools had their grades reduced, based on their schools’ results in previous years.

Most of us are well versed in the chaos that followed. Luckily, the government have already confirmed that this year’s results will be mercifully algorithm-free.  

We touched on the increased use of AI in education in an article last year. Simple algorithms are already used to mark work in online learning platforms. Other systems can trawl through the websites people visit and the things that they write, looking for clues about poor mental health or radicalisation. Even these simple systems can create problems, but the future brings machine learning algorithms designed to support detailed decision-making with major impacts on people’s lives. Many see machine learning as an incredible opportunity for efficiency, but it is not without its controversies.

Image-generation algorithms have been the latest to cause issues. A new study from Carnegie Mellon University and George Washington University found that unsupervised machine learning led to ‘baked-in biases’. Namely, the assumption that women simply prefer not to wear clothes. When researchers fed the algorithm pictures of a man cropped below his neck, 43% of the time the image was auto-completed with the man wearing a suit. When they fed it similarly cropped photographs of women, 53% of the time it auto-completed with a woman in a bikini or a low-cut top.

In a more worrying example of machine-learning bias, a man in Michigan was arrested and held for 30 hours after a false-positive facial recognition match. Facial recognition software has been found to be mostly accurate for white males, but for other demographics it is woefully inadequate.

Where it all goes wrong:

These issues arise because of one simple problem: garbage in, garbage out. Machine learning engines take mountains of previously collected data and trawl through them to identify patterns and trends. They then use those patterns to predict or categorise new data. However, feed an AI biased data, and it will spit out a biased response.

An easy way to understand this is to imagine you take German lessons twice a week and French lessons every other month. Should someone talk to you in German, there’s a good chance you’ll understand and be able to form a sensible reply. However, should someone ask you a question in French, you’re a lot less likely to understand, and your answer is more likely to be wrong. Facial recognition algorithms are often trained on predominantly white datasets. The lack of diversity means that when the algorithm comes across data from another demographic, it can’t make an accurate prediction.

Coming back to image generation, the reality of the internet is that images of men are a lot more likely to be ‘safe for work’ than those of women. Feed that to an AI, and it’s easy to see how it would assume women just don’t like clothes.  

AI in Applications:

While there’s no denying that being wrongfully arrested would have quite an impact on your life, it’s not something you see every day. However, most people will experience the job application process. Algorithms are shaking things up here too.  

Back in 2018, Reuters reported that Amazon’s machine learning specialists scrapped their recruiting engine project. Designed to rank hundreds of applications and spit out the top five or so applicants, the engine was trained to detect patterns in résumés from the previous ten years.  

In an industry dominated by men, most résumés came from male applicants. Amazon’s algorithm therefore copied the pattern, learning to lower the ratings of CVs that included the word “women’s”. Should someone mention that they captain a women’s debating team or play on a women’s football team, their résumé would automatically be downgraded. Amazon ultimately ended the project, but individuals within the company have stated that Amazon recruiters did look at the generated recommendations when hiring new staff.
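
To make that pattern-copying concrete, here is a minimal, hypothetical sketch in Python (using scikit-learn) of a toy CV screener trained on deliberately skewed, invented data. The CVs, the labels, and the tell-tale keyword are all made up purely for illustration; this is not Amazon’s system, just a demonstration of how bias in historical decisions carries straight through into a model’s scores.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented "historical" hiring decisions that happen to penalise CVs
# mentioning women's activities (all data here is fictional).
cvs = [
    "captain of men's football team, python developer",
    "led men's chess club, experienced engineer",
    "python developer, cloud certifications, men's rowing club",
    "captain of women's debating team, python developer",
    "led women's chess club, experienced engineer",
    "python developer, cloud certifications, women's rowing club",
]
hired = [1, 1, 1, 0, 0, 0]  # biased outcomes baked into the training data

# Train a simple bag-of-words classifier on that biased history.
vectoriser = CountVectorizer()
model = LogisticRegression().fit(vectoriser.fit_transform(cvs), hired)

# Two new CVs that differ by a single word get very different scores:
# the model has simply copied the bias it was shown.
new_cvs = [
    "python developer, captain of men's debating team",
    "python developer, captain of women's debating team",
]
scores = model.predict_proba(vectoriser.transform(new_cvs))[:, 1]
for cv, score in zip(new_cvs, scores):
    print(f"predicted 'hire' probability {score:.2f} for: {cv}")

With such a tiny, lopsided dataset the gap between the two scores is stark, but the mechanism is the same one that plays out, more subtly, at real-world scale.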

Algorithms are already in use for recruitment. Some sift through CVs looking for keywords. Others analyse facial expressions and mannerisms during interviews.

Protection from Automated Processing:

Amazon’s experimental engine clearly illustrated how automated decision making can drastically affect the rights and freedoms of individuals. It’s why the GDPR includes specific safeguards against automated decision-making.  

Article 22 states that (apart from a few exceptions) an individual has the right not to be subject to a decision based solely on automated processing. Individuals have the right to obtain human intervention should they contest the decision made, and in most cases an individual’s explicit consent should be gathered before using any automated decision-making.

This is becoming increasingly important to remember as technology continues to advance. Amazon’s experiment may have fallen through, but there are still AI-powered hiring products on the market. Companies such as Modern Hire and HireVue provide interview analysis software, automatically generating ratings based on an applicant’s facial expressions and mannerisms. Depending on the datasets these products were trained on, these machines may also be brimming with biases.

As Data Controllers, we must keep assessing the data protection impact of every product and every process. Talking to wired.co.uk, Ivana Bartoletti (Technical Director–Privacy at consultancy firm Deloitte) stated that she believed the Covid-19 pandemic would push employers to implement AI-based recruitment processes at “rocket speed”, and that these automated decisions can “lock people out of jobs”.

Battling Bias:

We live in a world where conscious and unconscious bias affects the lives and chances of many individuals. If we teach AI systems based on the world we have now, it’s little wonder that the results end up the same. With the mystique of a computer-generated answer, people are less likely to question it.

As sci-fi fantasy meets workplace reality (and it’s going to reach recruitment in schools and colleges first), it is our job to build in safeguards and protections. Building in a human-based check, informing data subjects, and completing Data Protection Impact Assessments are all tools to protect rights and freedoms in the battle against biased AI.

Heavy stuff. It seems only right to finish with a machine learning joke: 

A machine learning algorithm walks into a bar… 

The bartender asks, “What will you have?” 

The algorithm immediately responds, “What’s everyone else having?”

 

The technologies used to process personal data are becoming more sophisticated all the time.

This is the first article of an occasional series where we will examine the impact of emerging technology on Data Protection. Next time, we’ll be looking at new technologies in the area of remote learning.