You’ve probably heard it by now: EdTech is booming. From lesson-planning AI to real-time behaviour tracking, schools across the UK are embracing technology faster than ever. Whether it’s a new learning app or a full-blown Management Information System (MIS), the promise is the same: smarter classrooms, less admin, and happier teachers.

But here’s the thing: every time we bring in a new tool, we also bring in new responsibilities, especially when it comes to data protection. For schools, ensuring any technology used complies with the UK General Data Protection Regulation (UK GDPR) is not just good practice; it’s a legal obligation.

EdTech solutions, especially those powered by AI, often rely on vast quantities of pupil data to function effectively. This may include:

  • Personal identifiers (e.g., name, date of birth, student ID)
  • Behavioural data (e.g., clicks, interactions)
  • Academic records and performance metrics
  • Special educational needs (SEN) information

If not properly safeguarded, the processing of such data can expose schools to legal, reputational, and ethical risks.

Let’s walk through what this means in practice, and how school leaders and Data Protection Officers (DPOs) can make sure their school stays compliant with UK GDPR.

A Quick Story: “We’re Getting a New MIS!”

Imagine this:

A secondary school in Manchester is rolling out a shiny new MIS platform. It promises everything: attendance tracking, timetabling, safeguarding notes, SEND support, and even parent communications, all under one digital roof.

Everyone’s excited. The SLT’s impressed. The IT manager loves the interface. Staff are dreaming of fewer spreadsheets. But then the DPO raises a hand:

“Have we done a data protection impact assessment yet?”

Cue the room going quiet.

This scenario plays out more often than you’d think. New tech comes in fast, but data protection often lags behind, or worse, gets missed entirely. So how do we avoid that?

Step 1: Start with the Right Questions

Before rolling out any new EdTech or AI tool, ask:

  • What kind of data will this tool collect?
  • Where will that data be stored, and for how long?
  • Has the supplier given us a clear privacy notice?
  • Do we need a Data Protection Impact Assessment (DPIA)?

(Hint: if the system processes special category data or monitors students at scale, as most MIS platforms do, the answer is almost certainly yes.)

Step 2: Pre-Vetting Checks – Your EdTech Compliance Toolkit

Whether you’re reviewing a new reading app or a full MIS, these checks will help you make sure the supplier is up to standard:

Data Processing Agreement (DPA)

Every third-party supplier that processes pupil data on your school’s behalf must sign a DPA with your school. It should clearly set out:

  • What data is being processed and why
  • Who is responsible for what
  • How long the data is kept
  • What happens at the end of the contract

Lawful Basis

Is there a clear justification for processing pupil data through this tool? As the data controller, the school must identify the lawful basis; schools usually rely on the public task basis, but some EdTech tools, especially optional ones, may need consent. Be wary if the supplier can’t explain what they process and why.

Data Minimisation

Does the tool only collect what it needs? Or is it asking for extra fields “just in case”? Push back on anything that feels excessive.

Hosting and Security

Is the data stored in the UK, or in a country covered by UK adequacy regulations? Ask if they have:

  • Encryption at rest and in transit
  • Access controls
  • ISO 27001 or equivalent certifications
  • A breach response process

Transparency for Pupils and Parents

Can parents understand what data is collected and why? Suppliers should provide plain-English privacy policies, and so should your school.

Rights and Deletion

Can users (or the school) delete data easily if needed? Are retention periods clearly set out?

Step 3: Don’t Forget AI-Specific Risks

AI tools in EdTech often involve profiling or automated decision-making. Before using them:

  • Ask how the algorithms work (and whether human oversight is possible)
  • Check whether the tool could make significant decisions about students, such as predicting attainment levels or flagging safeguarding risks
  • Make sure pupils’ rights under UK GDPR Article 22 (automated individual decision-making, including profiling) are respected

Step 4: Review Existing Tools Too

It’s not just about new tech. Many schools have tools they’ve used for years that may no longer meet today’s standards. Schedule regular audits to:

  • Check for feature creep (new functions = new risks)
  • Revisit supplier agreements
  • Reassess DPIAs
  • Make sure any changes to data use are reflected in your privacy notices

Let’s Get the Balance Right

We all want to give our pupils the best experience, and sometimes that means embracing innovation. But good data protection isn’t about blocking progress. It’s about asking the right questions before a breach or complaint happens.

As a DPO or senior leader, you don’t have to say no to every new tool. You just need to make sure the supplier (and the school) are doing things properly, within the law, ethically, and with children’s best interests in mind.

Remember: if in doubt, ask. Talk to your local authority, your MAT data protection lead, or a privacy professional. Protecting pupil data is everyone’s responsibility, and with a little due diligence, your school can be both innovative and compliant.