Posts

Picture this: a student who left school years ago reaches out with a request. They’d like their personal data removed from your records. Not just from the admissions system, but everything: emails, reports, maybe even that long-forgotten incident log from Year 9. They’ve moved on and want their past to do the same.

It’s a request that might feel particularly familiar as we approach the end of the academic year. Leavers are saying their goodbyes, staff are wrapping up reports, and across schools and universities, inboxes are starting to fill with data requests of all kinds. One that stands out: the Right to Be Forgotten.

Under the UK GDPR, individuals have the “Right to Be Forgotten,” meaning they can request the deletion of their personal data when it’s no longer necessary. On the surface, it’s empowering. People should have a say in how long their data is kept, and it makes sense that someone might want to shed parts of their past as they grow older.

But in education, where records aren’t just about individuals but about safeguarding, progress, accountability, and shared experiences, it’s not always a straightforward decision.

When Forgetting Isn’t That Simple

Educators are natural record-keepers. Whether it’s for safeguarding, special educational needs, or even just tracking progress over time, information often needs to be preserved for years, sometimes decades. Deleting a student’s file isn’t like clearing out an old inbox: it could mean losing context, closing the door on long-term support, or even creating gaps that matter later on.

Take safeguarding, for instance. A school may need to retain certain records well into adulthood because they could one day form part of a disclosure or investigation. It’s not about holding on to the past unnecessarily, it’s about being prepared for the future.

So when a deletion request lands, it’s not simply a matter of yes or no. It’s about finding the balance between respecting someone’s right to move on, and the very real need to preserve records for legal, ethical, or pastoral reasons.

The Growing Digital Trail

Another challenge? The sheer amount of digital space education now occupies. Emails, learning platforms, CCTV, behaviour tracking software: it all adds up. And each of those systems might hold fragments of someone’s personal story.

In practice, this means schools and universities need to have a clear idea of where data lives, how long it needs to be kept, and when it can safely be deleted. That’s easier said than done, especially when older systems weren’t designed with the right to erasure in mind.

Still, we’re seeing encouraging progress. A number of schools have created systems for reviewing and safely deleting old data, ensuring they don’t keep more than they need to. Some universities have designed workflows that let them honour deletion requests without compromising core records like transcripts or misconduct reports. What’s key is having a clear, fair process and communicating it openly.

Conversations, Not Just Policies

The most effective responses to Right to Be Forgotten (RTBF) requests tend to start with a conversation, not a policy document. The best examples we’ve seen involve staff explaining clearly why some records must be kept, while also doing everything they can to remove unnecessary data. It’s about approaching each request with care and respect, rather than defensiveness or bureaucracy.

That human touch really matters. After all, people often ask to be “forgotten” not just out of privacy concerns, but because they want to close a chapter, sometimes after difficult experiences. Even when we can’t grant a full erasure, we can usually find ways to meet the spirit of the request with kindness and clarity.

So, Where Does That Leave Us?

The Right to Be Forgotten is both powerful and complex. It asks important questions about memory, responsibility, and care. And while it might feel tricky to navigate, it’s also a chance for schools and universities to reflect on the data they hold and why they hold it.

Are we keeping things because we need to? Or because we always have? Are we making space for people to move on? Or unintentionally anchoring them to a version of themselves they’ve long outgrown?

There are no perfect answers, and each case will be a little different. But what’s clear is that thoughtful, transparent processes and a bit of empathy go a long way. Forgetting, when it’s done well, can be an act of respect. And remembering, when necessary, can be an act of care.

If nothing else, it’s a good reminder that every bit of data we collect today might become part of someone’s story tomorrow. How we hold that story, and when we let it go, matters more than ever.

“Miss, what’s GDPR? Is it a new exam board?”

That question might have raised a chuckle back in 2018, when GDPR first arrived with its bundle of acronyms, policy updates, and general sense of urgency. But here we are in 2025, seven years on, and the UK GDPR is no longer the new kid on the block. It’s settled in, taken its place alongside safeguarding, SEND, and all the other core responsibilities that shape everyday life in education.

The big question is: have we settled in with it?

Looking Back (and Forward)

When the GDPR first landed, there was a flurry of activity: privacy notices were redrafted, training sessions booked, and data audits launched with admirable enthusiasm. Since then, many schools, colleges and universities have found their rhythm with data protection. It’s become part of the background noise of school life: necessary, not always exciting, but undoubtedly important.

Still, while the panic may have subsided, that doesn’t mean the pressure has. In fact, the expectations have only grown. With the increasing use of digital tools in classrooms, the rise of online learning, and greater public awareness of privacy issues, it’s no longer enough to simply tick the GDPR box and carry on. The way we manage personal data has become part of how our communities judge our professionalism and trustworthiness.

And rightly so. In education, we hold some of the most sensitive information people will ever share: from learning needs and medical information to safeguarding records and home circumstances. Protecting that data is part of the duty of care we owe to every pupil, student, parent, and colleague.

In recent years, we’ve seen both missteps and good practice across the sector. One well-known case involved a phishing email that led to a serious breach at a large multi-academy trust. Despite having the right policies on paper, the real problem turned out to be a lack of practical staff training. It was a simple mistake, but one with far-reaching consequences.

On the other hand, many institutions have shown what good looks like. Several universities, for example, now include GDPR awareness as part of induction for all staff, and make regular updates part of their professional development cycle. One even ran a student-led privacy campaign, helping young people understand their own rights while building a culture of shared responsibility. The message was clear: data protection isn’t just admin, it’s part of how we show care and respect.

What GDPR Means in 2025

We’re now working in a digital-first education landscape. Learning platforms, behaviour tracking systems, AI-driven learning tools, they all collect and process data in increasingly complex ways. GDPR hasn’t stood still either; the principles remain the same, but the questions we need to ask have evolved.

Are we being transparent with families and students about how their data is used? Are we confident the apps and platforms we rely on are genuinely secure and compliant? Are we sure that only the right people in our organisations can access sensitive information?

These aren’t questions for data managers alone. They’re questions for senior leaders, teachers, support staff – everyone who touches information in any form. Because GDPR is no longer just a legal requirement. In 2025, it’s part of how we show we’re trustworthy professionals.

Seven Years In: What’s Changed?

What’s changed, more than anything, is awareness. Students are more privacy-savvy. Parents are asking sharper questions. Staff are more alert to the risks and responsibilities of handling personal data.

And that’s a good thing.

It means we can shift the conversation from compliance to confidence. When GDPR is built into our culture, not just our policies, it becomes part of a wider approach to doing things well. Much like safeguarding, it becomes part of how we think, plan and care.

So, whether you’re updating a digital platform, emailing student records, or printing off that spreadsheet for a meeting, it’s worth pausing to reflect. Not in fear, but in thoughtfulness. Is this the safest way to handle this information? Do I need to do anything differently? Would I feel comfortable explaining this decision to a parent?

There’s no denying that GDPR doesn’t always feel urgent (until something goes wrong). But it’s one of those quiet responsibilities that says a lot about who you are as educators. It speaks to the trust placed in you, and the way you uphold it day after day.

Seven years on, we’ve come a long way. And with thoughtful leadership, practical systems, and a bit of shared awareness, we’ll keep moving in the right direction.

After all, data protection is really about people, and education has always been good at putting people first.

There’s something unmistakable in the air when an inspection is on the horizon. You can feel the hum of preparation in every corridor; policies being printed, classroom displays getting a refresh, and colleagues exchanging knowing glances over the photocopier. It’s all hands on deck, and every detail matters.

But as walls are re-pinned and cupboards reorganised, another question often arises quietly in the background: Are we still within the bounds of data protection law?

The answer isn’t always as straightforward as we’d like. In fact, GDPR considerations often become most visible in the very things we proudly display on classroom walls, in shared corridors, or on digital screens. And during inspection season, those questions only feel more urgent.

Let’s consider a familiar example: a colourful “Star of the Week” board, complete with names, photos, and personal achievements. It’s a lovely way to celebrate success, but what if one of those pupils has a parent who didn’t give consent for photos? Or a safeguarding concern that makes public identification risky? Even the most well-intentioned display can inadvertently stray into problematic territory.

The same applies to medical or allergy information. Many schools use posters or visual aids to make staff aware of pupil needs, particularly in lunch halls or near staff kitchens. But if that information includes photos, names, and medical conditions in areas accessed by other pupils or visitors, it crosses into “special category data” under UK GDPR. That kind of information requires extra care.

Digital spaces are no less important. In the rush to prepare documents, it’s easy to leave a screen open or a shared drive exposed. But if personal pupil data is left visible or accessible to those without a legitimate reason to view it, the school could find itself facing not only a privacy concern, but potentially a reportable data breach.

None of this means schools must remove every trace of student celebration or wrap the walls in plain paper. GDPR doesn’t ask us to stop recognising achievement, it asks us to think critically about how we do it.

One school I worked with ran what they called a “privacy walk” in the days leading up to an inspection. Staff took ten minutes to walk through shared spaces with fresh eyes, asking themselves: Can visitors see anything they shouldn’t? Are personal details on show unnecessarily? Are we being mindful with how we display medical or safeguarding information? It was a quick, simple exercise that made a measurable difference to their overall compliance, and their confidence, on inspection day.

Similarly, one teacher I spoke to found an elegant solution to a parent’s concern over public displays of progress. Instead of using names on her reward chart, she assigned each child an animal symbol; “Team Owl,” “Team Fox,” and so on. It respected privacy, kept parents satisfied, and the pupils embraced it wholeheartedly.

What matters most is that schools are able to demonstrate thoughtful, proportionate decision-making. Inspectors don’t expect perfection, but they do expect to see that staff understand the principles of data protection and have taken reasonable steps to comply.

If you’re unsure about a display, a chart, or a staffroom noticeboard, ask yourself: Is it necessary? Is it proportionate? And have we obtained the right consent, where needed? If you’re still unsure, speak to your data protection lead or DPO. Getting the answer right before inspection day is far easier than addressing concerns after the fact.

Ultimately, compliance isn’t about red tape, it’s about respect. Respecting pupils’ rights, respecting families’ expectations, and respecting the trust placed in schools to safeguard not only children, but their information.

So as inspection season gathers pace, take a moment to review the little things. The name tags, the photo walls, the charts with more detail than needed. Because when the inspector walks in and the questions start, you’ll be glad you did.

It’s that time of year again.

The weather’s warming up (at least in theory), the final exam papers are piling up in the staffroom, and the Year 11s, full of nervous energy, optimism, and just a hint of mischief, are preparing to say their goodbyes. It’s the season of leavers’ assemblies, nostalgic slide shows, and of course, the all-important question: “Can we get hoodies with all our names on them?”

Cue the GDPR panic.

Every year, in schools up and down the country, a familiar scenario plays out. A well-meaning teacher or member of the PTA offers to organise personalised hoodies or put together a yearbook featuring class photos and messages. And then, someone asks the question that stops the printer in its tracks: “Wait… is this even allowed under GDPR?”

Let’s unravel this together, because despite what some may believe, GDPR isn’t the fun police. It doesn’t mean you have to cancel prom or produce an anonymised yearbook with stick figures instead of class photos.

The truth is, most of these cherished school traditions can go ahead, so long as they’re handled with care, clarity, and a bit of common sense.

Take the humble leavers’ hoodie. It’s one of those rites of passage that students will cling to years after they’ve grown out of it. Names printed on the back, sometimes nicknames, sometimes surnames, sometimes the dreaded full first-middle-last-name combo. From a data protection point of view, names are indeed personal data. But does that mean you can’t print them?

Absolutely not. You just need a lawful basis to do it, and in most cases that’s as simple as getting consent.

Whether it’s for hoodies, a yearbook, or a slideshow featuring baby photos, if you’re collecting and sharing personal data outside of core educational purposes, it’s best practice to ask students (or their parents, depending on age) for permission. A simple form will do the trick. The key is to be clear about what data you’re using, where it will appear, and who will see it. No tricks, no fine print in size six font.

And yes, you can still include photos. There’s no secret clause in the UK GDPR that says a picture of Year 11 on the school field at lunchtime is forbidden. If it’s for a yearbook, prom night collage, or school website tribute, the same principle applies; be transparent, get the appropriate permissions, and store images securely. That’s it.

There’s a myth that GDPR somehow outlawed all joy in schools, but the reality is it just asked us to stop being sloppy with data. It’s about respect, not restriction.

Then there’s the classic signing shirts. The ink-stained rite of passage, where uniforms are transformed into messy tributes of inside jokes and hastily scrawled farewells. A few educators have raised their eyebrows at this tradition, worrying it could constitute “uncontrolled data sharing.”

Realistically, if a student voluntarily hands their shirt to a friend and says, “write something embarrassing on my back,” this isn’t a data protection issue, it’s a social one. GDPR doesn’t govern private, student-to-student interactions unless the school is actively collecting and publishing that content. So, you don’t need to enforce a shirt-signing ban (and if you tried, good luck…).

Now, about prom. Some schools host their own; others let parents or external companies run the show. Either way, collecting names for tickets, dietary needs, or emergency contact details is fine, just make sure you’re only collecting what you actually need, and that the data isn’t floating around on someone’s USB stick or unprotected spreadsheet. The golden rule? If you wouldn’t want your own teen’s info handled that way, don’t do it to someone else’s.

So here’s the bottom line: don’t let GDPR myths steal the spotlight from your leavers’ celebrations. Data protection doesn’t mean you can’t celebrate your students. It just asks that you do it with intention.

After all, what better way to send off the next generation than by teaching them that privacy and parties can coexist?

Let them have their yearbooks. Let them wear hoodies emblazoned with names they’ll cringe at later. Let them dance the awkward final dance at prom. And let them remember that their school cared enough to protect their memories without over-policing their goodbyes.

There’s a certain kind of email that arrives in a school inbox that immediately raises eyebrows. It starts with something like:

“Exciting news! We’re trialling new biometric scanners in the canteen to speed up lunch queues!”

It’s followed by promises of efficiency, reduced lunch line chaos, and fewer forgotten PINs. On the surface, it sounds brilliant. Who wouldn’t want a futuristic solution to an age-old problem?

But here’s the thing: before you ask a group of eleven-year-olds to hand over their fingerprints for a chicken nugget, you need to stop and ask a bigger question… Have we done a Data Protection Impact Assessment (DPIA)?

You may wonder why it is so important. A DPIA isn’t just some bureaucratic hoop to jump through. It’s a vital safeguard designed to help schools understand how a new system or process might affect people’s privacy, especially when you’re dealing with sensitive or high-risk data.

In schools, we hold data about children who are arguably some of the most vulnerable individuals in society. Introducing new tech that collects biometric data (like fingerprints or facial recognition) raises serious privacy concerns. Biometric data is classed as “special category data” under the UK GDPR, which means it requires extra care and justification.

A DPIA helps you figure out what data is being collected, why you need it, what risks it poses to individuals, and how to mitigate those risks. Even more crucially, it helps you decide whether the shiny new system is really necessary in the first place.

Let’s return to that canteen scanner idea. The supplier promises that fingerprinting pupils will slash queue times and reduce cash handling. Sounds efficient, right?

But have we asked:

  • Do we really need biometric data for this?
  • Could a swipe card or QR code achieve the same result with less risk?
  • What happens if a student refuses to give their fingerprint?
  • How securely will this data be stored, and who can access it?

Without a DPIA, these questions may never even surface.

Or take another example: your school is rolling out a new online safeguarding tool that uses artificial intelligence to flag potential risks based on student writing. Impressive? Maybe. Intrusive? Potentially. A DPIA would help you assess whether the tool’s benefits outweigh the privacy implications, and what safeguards should be in place.

Remember… behind every “data point” is a real child. Their birthday. Their behaviour record. Their image. Their fingerprint.

A DPIA isn’t about red tape. It’s about respecting the trust families place in us. It’s about making thoughtful, informed choices, not just because it’s the law, but because it’s the right thing to do.

And honestly, it’s also about protecting your school. If things go wrong, if a data breach happens, or parents push back, a completed DPIA shows you took privacy seriously. It shows you were proactive, not reactive.

A Culture Shift, Not a Paper Exercise

The best schools aren’t just doing DPIAs to tick a box. They’re building a culture where people ask early on:

“Could this new system affect how we handle personal data?”

“Do we need to speak to the Data Protection Officer before we go ahead?”

“Have we thought this through, not just for us, but for our students?”

That’s where real digital responsibility begins. Not in a policy document, but in everyday conversations.

So next time someone suggests a new app, platform, or process… pause. Before you roll it out, before the training sessions and the excited emails, check whether a DPIA is needed.

Because in a world where data is power, doing a DPIA is how we wield that power wisely. Not to impress with tech, not to dazzle with dashboards but, to protect, to consider, and to educate with integrity.

It’s opening night. The school hall smells faintly of paint and papier mâché. There’s a Year 5 pupil with a cardboard crown that’s just a little too large for their head, nervously adjusting their costume backstage. Parents are streaming in, phones at the ready, clinging to the best seats like it’s Glastonbury. You’ve made it to the school play and so has the annual data protection dilemma.

Because as predictable as last-minute prop malfunctions and forgotten lines, come the whispered queries: “Can I film this?” “What if someone else’s child is in the shot?” “Are we even allowed to take photos anymore?”

Ah yes, welcome to the wonderfully confusing world of data protection and school performances. Where nativity scenes meet nuanced legislation, and Mary’s not the only one cradling something precious.

Let’s start with the basics. Parents taking photos or videos for personal use? Absolutely fine. UK GDPR isn’t interested in mums and dads snapping a picture of their little star as the third shepherd from the left. That’s considered a “purely personal or household activity,” and data protection laws don’t apply. Parents can cheer, film, and Instagram away, within reason.

But let’s say a parent asks for a copy of the school’s official video of the play. Now we’ve stepped into a different category. If the school is recording or photographing the event, it’s processing personal data. That means GDPR applies. The school must be clear about what it’s capturing, why, and how that footage will be used or shared.

It’s here that things can get thorny.

For instance, imagine you’ve got a pupil in Year 4 whose parent has specifically requested their child not be photographed, perhaps due to safeguarding concerns. If that child ends up in the wide-angle shot of the final scene, and the video is later shared on the school’s website, that’s not just a mistake, it’s a potential data breach.

So schools have to tread carefully. It means thinking ahead. It means letting parents know in advance what will be filmed, how long the footage will be kept, and getting clear consent for public use, especially if the content might be shared beyond the school community.

Then there’s the grey area of social media. Suppose a proud grandparent posts a clip of the school play on Facebook, featuring multiple children in the background. No malice, no agenda, just pride. Still, if that video ends up widely circulated or accessible to people outside the immediate circle, concerns can start to surface. And suddenly, the school may get complaints from parents who hadn’t realised their child might appear in someone else’s family montage.

Educators often find themselves caught between celebrating achievements and navigating consent. You want to showcase the joy, the creativity, the culmination of weeks of rehearsal. But you also don’t want to inadvertently violate someone’s privacy or their trust.

So what can be done?

Communication, as always, is your best friend. Set expectations early. Let families know what the school’s policy is on filming and photography. Provide opportunities for opt-outs and be clear that personal recordings must not be posted publicly without consent from all those featured.

And if you’re recording the event as a school, make sure your privacy notices are up to date, your consents are meaningful, and your editing software is ready just in case someone needs to be cropped or blurred.

One school I worked with handled it beautifully: before the play, the headteacher gave a warm, informal announcement. “We know you’ll want to remember tonight,” she said. “Feel free to take photos of your own child, but please be mindful of others. Let’s celebrate the magic without forgetting that we all have different comfort levels.”

The audience appreciated the reminder. Phones were out, but respectfully so. And not a single complaint followed.

Ultimately, the aim isn’t to dampen the occasion, it’s to protect the people in it. Children deserve to shine on stage without worrying about where that footage might end up. And parents deserve clarity about how their children’s images are being used.

So as the lights dim and the narrator clears their throat, take a breath. You’ve got the play under control. And with a little forethought, you’ve got the data protection side covered too.

Break a leg, and maybe set your camera to “portrait mode.”

Picture this: You’re clearing out a dusty old cupboard in the staffroom and stumble across a stack of paper files labelled “Year 11 – 2009”. A mix of test scores, behavioural logs, and, oddly, a permission slip for a trip to Alton Towers.

Your first thought? How did this survive the last clear-out?

Your second? Should we even still have this?

If you’ve ever found yourself asking those questions, you’re not alone. But when it comes to managing personal data in schools, it’s not just about tidiness or storage space, it’s about legal responsibility, privacy, and respect for the individuals behind the information.

Let’s talk about data retention, and why getting it right is more than just best practice, it’s the law.

Data Isn’t Just Data. It’s Someone’s Life Story

In education, we collect a lot of data: names, addresses, medical notes, academic records, safeguarding files, staff performance reviews, you name it. And it all serves a purpose… for a time.

But once that purpose is fulfilled? Keeping it longer than necessary can be a breach of the UK GDPR and the Data Protection Act 2018 (DPA).

The UK GDPR (the General Data Protection Regulation as retained in UK law post-Brexit) tells us that personal data must be:

  • Accurate
  • Kept up to date
  • Not kept longer than necessary

In other words: Just because you have it, doesn’t mean you should still keep it.

“Just In Case” Isn’t a Policy

One of the most common phrases you’ll hear in schools when asking why old data still exists is:
“We might need it one day.” But the law says otherwise.

Every piece of personal data must have a defined retention period based on its purpose. These periods should be recorded in your data retention policy or information asset register, which should be reviewed regularly.

Let’s look at a few examples:

  • Safeguarding records? Kept until the child is 25 (or 6 years after the last entry if the child was not looked after).
  • Recruitment records for unsuccessful applicants? Typically 6 months.
  • Staff employment files? Usually retained for 6 years after employment ends.

These timeframes aren’t arbitrary. They’re based on legal, educational, and best practice guidance (such as from the IRMS toolkit for schools).
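For schools that track retention digitally, a schedule like the one above can be expressed as a simple lookup. The sketch below is purely illustrative, in Python: the record-type names and periods are assumptions for the example, and any real schedule should come from the IRMS toolkit and your own retention policy, not from this code.

```python
from datetime import date, timedelta

# Illustrative retention periods only; confirm against the IRMS
# Records Management Toolkit and your own retention policy.
RETENTION = {
    "recruitment_unsuccessful": timedelta(days=182),   # roughly 6 months
    "staff_employment_file": timedelta(days=6 * 365),  # ~6 years after leaving
    "pupil_safeguarding": timedelta(days=25 * 365),    # until the child is ~25
}

def deletion_due(record_type: str, trigger: date) -> date:
    """Earliest date a record may be reviewed for deletion.

    `trigger` is the event the period runs from: date of birth for
    safeguarding records, leaving date for staff files, and so on.
    """
    return trigger + RETENTION[record_type]

def overdue(record_type: str, trigger: date, today: date) -> bool:
    """True if the record has outlived its retention period."""
    return today > deletion_due(record_type, trigger)
```

Even a rough check like this makes the annual “digital spring clean” easier: run it over an information asset register and you get a list of files that are due for review rather than relying on someone remembering.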

Imagine if a former pupil, now 30, asked for all the information you still held about them. Could you confidently justify why you still have that Year 8 report from 2006?

Or worse, what if their data was part of a breach and it turned out it should have been deleted a decade ago?

This isn’t just theoretical. Schools have been fined for poor data management practices, including keeping data for far longer than necessary.

So what can schools do to stay compliant and responsible?

Have a Clear Retention Schedule

Refer to sector-specific guidance (like the IRMS Records Management Toolkit) and document retention periods in your data protection policy.

Build a Culture of Data Hygiene

Make data deletion as routine as fire drills. Annual “digital spring cleans” can be helpful for reminding staff to review and remove old files.

Use Technology Wisely

Modern MIS and HR systems often allow automated data archiving or deletion after a set period. Use these tools, but make sure they’re configured correctly.

Train Your Staff

Teachers and admin staff are on the frontlines of data processing. Make sure everyone knows why data retention matters, and what they’re responsible for.

Final Thoughts: Respecting the Past Without Hoarding It

Data retention might not be the most glamorous part of running a school, but it is one of the most important for protecting your pupils, your staff, and your reputation.

Ultimately, it comes down to this: Respect the data as you would respect the person it belongs to.

You wouldn’t keep old student essays or report cards pinned to a noticeboard for years, so why keep their digital (or paper) equivalents indefinitely?

Managing data well isn’t just about compliance, it’s about ethics, trust, and good governance.

So next time you come across a file from five headteachers ago, ask yourself: Why do we still have this? And if there’s no good answer, it might be time to let it go.

It’s a typical Tuesday morning in the staffroom. Someone’s burnt their toast, the last tea bag has mysteriously vanished, and your inbox flashes up with a reminder: “Mandatory GDPR Refresher – 20 minutes.” There’s a quiet groan. Not because anyone doubts its importance, but because, for many, data protection training sits firmly in the category of necessary but dry.

And yet, in schools, the relevance of GDPR couldn’t be more real. Far from being a background compliance exercise, it’s something woven into nearly every task we undertake, whether we realise it or not. It’s in the way we send emails to parents, the way we store SEN reports, or how we display pupil names on classroom walls.

The truth is, GDPR awareness isn’t a one-off event. It’s a practice. And like all good practice, it requires routine reflection, updated understanding, and yes, refreshers.

Take, for example, a school that proudly circulated a birthday list to families in a class newsletter. A small act of celebration, warmly intended. But one child on the list was under a court order that required their identity to be protected. The result wasn’t malicious, but it did amount to a serious lapse in data handling, one that could have been avoided with more regular, scenario-based reminders.

Every member of staff in a school (teachers, support staff, lunchtime supervisors, even volunteers) comes into contact with personal data. That might be in the form of a safeguarding note, an attendance register, or a photo taken during a school trip. It’s not the presence of data that’s the issue, but how thoughtfully and lawfully it is used.

Regular GDPR training and awareness sessions provide the confidence and clarity staff need to navigate this landscape. They help reinforce the day-to-day decisions, like locking screens, avoiding personal email use, or checking consent for photographs, that protect children’s rights and safeguard the school from reputational and legal risk.

Some schools are rethinking the format of these refreshers. One primary school incorporated short GDPR tips into their weekly staff briefings: “This week’s reminder is about using BCC in group emails.” It was informal, quick, and incredibly effective at keeping privacy principles front of mind without overwhelming staff.

Others have taken a more reflective approach, using anonymised real-life incidents from within the school to frame learning: “Remember when a report was accidentally emailed to the wrong parent?” These moments serve as powerful learning tools. They aren’t theoretical, they’re rooted in the real and immediate experience of the staff team.

In a world of competing priorities, it’s easy for GDPR to feel like a tick-box activity. But when an incident happens, be it a data breach, a complaint, or a safeguarding issue, it instantly becomes urgent and central. At that point, it’s not just about compliance. It’s about trust.

GDPR, at its core, is about respecting people, their privacy, their safety, their dignity. Educators are entrusted with not only children’s learning, but their stories, their vulnerabilities, and their personal details. That trust deserves care and vigilance, not just once a year, but as part of our professional mindset.

So, the next time a GDPR refresher request lands in your inbox, perhaps see it for what it is, a professional check-in that helps you protect your pupils, your school, and yourself. It’s not about ticking a box, it’s about reinforcing a culture of thoughtful, respectful data handling.

Because good data protection practice in schools isn’t about fear. It’s about professionalism, empathy, and safeguarding, both online and offline.

GDPR Sentry can help you fill the knowledge gap

Anyone involved in last year’s exam grade saga probably harbours a level of resentment against algorithms. 

The government’s formula was designed to standardise grades across the country. Instead, it affected students disproportionately, raising grades for students in smaller classes and more affluent areas. Conversely, students in poorer-performing schools had their grades reduced, based on their schools’ results in previous years.

Most of us are well versed in the chaos that followed. Luckily, the government have already confirmed that this year’s results will be mercifully algorithm-free.  

We touched on the increased use of AI in education in an article last year. Simple algorithms are already used to mark work in online learning platforms. Other systems can trawl through the websites people visit and the things that they write, looking for clues about poor mental health or radicalisation. Even these simple systems can create problems, but the future brings machine learning algorithms designed to support detailed decision making with major impacts on people’s lives. Many see machine learning as an incredible opportunity for efficiency, but it is not without its controversies.

Image-generation algorithms have been the latest to cause issues. A new study from Carnegie Mellon University and George Washington University found that unsupervised machine learning led to ‘baked-in biases’: namely, the assumption that women simply prefer not to wear clothes. When researchers fed the algorithm pictures of a man cropped below his neck, 43% of the time the image was auto-completed with the man wearing a suit. When they fed it similarly cropped photographs of women, 53% of the time it auto-completed with a woman in a bikini or a low-cut top.

In a more worrying example of machine-learning bias, a man in Michigan was arrested and held for 30 hours after a false positive facial recognition match. Facial recognition software has been found to be mostly accurate for white males but, for other demographics, it is woefully inadequate.


Where it all goes wrong:

These issues arise because of one simple problem: garbage in, garbage out. Machine learning engines take mountains of previously collected data and trawl through them to identify patterns and trends. They then use those patterns to predict or categorise new data. Feed an AI biased data, however, and it’ll spit out a biased response.

An easy way to understand this is to imagine you take German lessons twice a week and French lessons every other month. Should someone talk to you in German, there’s a good chance you’ll understand and be able to form a sensible reply. Should someone ask you a question in French, however, you’re a lot less likely to understand, and your answer is more likely to be wrong. Facial recognition algorithms are often trained on a white-leaning dataset. The lack of diversity means that when the algorithm comes across data from another demographic, it can’t make an accurate prediction.

Coming back to image generation, the reality of the internet is that images of men are a lot more likely to be ‘safe for work’ than those of women. Feed that to an AI, and it’s easy to see how it would assume women just don’t like clothes.  
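This pattern-copying is simple enough to sketch in a few lines. The toy model below (all labels and counts are invented, loosely echoing the percentages above) just predicts whichever completion was most common for a given subject in its training data; skew the data, and the “decision” is skewed with it.

```python
from collections import Counter

# Invented training data that loosely mirrors the skew described above:
# "suit" is the single most common completion for men, "swimwear" for women.
training = (
    [("man", "suit")] * 43 + [("man", "t-shirt")] * 30 + [("man", "jumper")] * 27
    + [("woman", "swimwear")] * 53 + [("woman", "suit")] * 25 + [("woman", "t-shirt")] * 22
)

def predict(subject):
    # The "model" is just a majority vote over past examples for this subject.
    labels = Counter(label for subj, label in training if subj == subject)
    return labels.most_common(1)[0][0]

print(predict("man"))    # "suit"
print(predict("woman"))  # "swimwear"
```

Real systems are vastly more complex, but the underlying failure mode is the same: the model has no notion of what women ‘prefer’, only of what its training data happened to contain.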

AI in Applications:

While there’s no denying that being wrongfully arrested would have quite an impact on your life, it’s not something you see every day. However, most people will experience the job application process. Algorithms are shaking things up here too.  

Back in 2018, Reuters reported that Amazon’s machine learning specialists scrapped their recruiting engine project. Designed to rank hundreds of applications and spit out the top five or so applicants, the engine was trained to detect patterns in résumés from the previous ten years.  

In an industry dominated by men, most résumés came from male applicants. Amazon’s algorithm therefore copied the pattern, learning to lower the ratings of CVs including the word “women’s”. Should someone mention they captained a women’s debating team, or played on a women’s football team, their résumé would automatically be downgraded. Amazon ultimately ended the project, but individuals within the company have stated that Amazon recruiters did look at the generated recommendations when hiring new staff.

[Image: a white robotic hand selecting one applicant’s polaroid photograph from a row of four.]

Algorithms are already in use for recruitment. Some sift through CVs looking for keywords. Others analyse facial expressions and mannerisms during interviews.
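A keyword-based scorer of this kind can be sketched in miniature. The model below is hypothetical (the CVs and the word-weighting scheme are invented for illustration, not Amazon’s actual method): it learns a weight for each word from historical hired and rejected CVs, so a phrase like “women’s”, which appears only in rejected CVs here, picks up a penalty.

```python
from collections import Counter
from math import log

# Invented historical data mirroring the skew described above: past hires
# were overwhelmingly male, so "women's" appears only in rejected CVs.
hired = ["captain chess club", "mens football team", "led project delivery"] * 10
rejected = ["women's debating team captain", "women's football team"] * 3 + [
    "no relevant experience"] * 4

def learn_weights(hired, rejected):
    h = Counter(word for cv in hired for word in cv.split())
    r = Counter(word for cv in rejected for word in cv.split())
    # Naive log-odds with add-one smoothing: positive means "looks like a hire".
    return {w: log((h[w] + 1) / (r[w] + 1)) for w in set(h) | set(r)}

weights = learn_weights(hired, rejected)

def score(cv):
    return sum(weights.get(word, 0.0) for word in cv.split())

print(score("captain football team"))
print(score("captain women's football team"))  # lower: "women's" is penalised
```

Because “women’s” never appears in a successful CV, its learned weight is negative, and the scorer mechanically downgrades any applicant who mentions it, much like the behaviour Reuters described.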

Protection from Automated Processing:

Amazon’s experimental engine clearly illustrated how automated decision making can drastically affect the rights and freedoms of individuals. It’s why the GDPR includes specific safeguards against automated decision-making.  

Article 22 states that, apart from a few exceptions, an individual has the right not to be subject to a decision based solely on automated processing. Individuals have the right to obtain human intervention should they contest the decision made, and in most cases an individual’s explicit consent should be gathered before using any automated decision making.

This is becoming increasingly important to remember as technology continues to advance. Amazon’s experiment may have fallen through, but there are still AI-powered hiring products on the market. Companies such as Modern Hire and HireVue provide interview analysis software, automatically generating ratings based on an applicant’s facial expressions and mannerisms. Depending on the datasets these products were trained on, these machines may also be brimming with biases.

As Data Controllers, we must keep assessing the data protection impact of every product and every process. Talking to wired.co.uk, Ivana Bartoletti (Technical Director, Privacy, at consultancy firm Deloitte) stated that she believed the Covid-19 pandemic would push employers to implement AI-based recruitment processes at “rocket speed”, and that these automated decisions can “lock people out of jobs”.

Battling Bias:

We live in a world where conscious and unconscious bias affects the lives and chances of many individuals. If we teach AI systems based on the world we have now, it’s little wonder that the results end up the same. And with the mystique of a computer-generated answer, people are less likely to question it.

As sci-fi fantasy meets workplace reality (and it’s going to reach recruitment in schools and colleges first), it is our job to build in safeguards and protections. Building in a human review step, informing data subjects, and completing Data Protection Impact Assessments are all tools to protect rights and freedoms in the battle against biased AI.

Heavy stuff. It seems only right to finish with a machine learning joke: 

A machine learning algorithm walks into a bar… 

The bartender asks, “What will you have?” 

The algorithm immediately responds, “What’s everyone else having?”

 

The technologies used to process personal data are becoming more sophisticated all the time.

This is the first article of an occasional series where we will examine the impact of emerging technology on Data Protection. Next time, we’ll be looking at new technologies in the area of remote learning.

(and can be costly too!)

 

GDPR is not normally associated with parties, but recently I heard the end of a conversation about an office Christmas party and it set me thinking about the impact that a misplaced sentence can have. Friendships and working relationships can be badly damaged, in some cases irreparably.

If I choose to pass on my unvarnished opinion about a colleague during the Christmas bash, then I can find myself in a lot of trouble. If on the other hand, I whisper information that has come from the data controller then not only am I in hot water, but I’ve also given the extra present of a data breach.

Paragraph 4, Article 32 of the GDPR says:

“The controller and processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller, unless he or she is required to do so by Union or Member State law.”

Put more simply, you must ensure that people are given clear guidance about what they can and can’t do with personal data and you must ensure they stick to those rules.

Bear in mind that it doesn’t matter how information is disclosed for it to be a breach. Whether you’ve been hacked, sent an email to the wrong person, lost a paper file or repeated information to someone who shouldn’t know it, a breach has occurred.

With verbal disclosure the situation is often made worse by the fact that our natural desire is to share more ‘interesting’ information, which is also usually more confidential and leads to greater upset.

We’ve seen examples where incidents have been dealt with from a disciplinary standpoint but have gone unrecognised as a data breach. If you then need to report the breach to the ICO, you’ll have to explain why you missed the 72-hour reporting deadline. It is difficult to claim a sound data protection regime while missing such a high-profile target.

What steps should you take to avoid these issues?

Training
  • All your staff need to know about the risks of verbal disclosure. Include it in your normal GDPR training but you may need to provide a special briefing. As well as knowing that they need to notify your DPO or GDPR lead, it’s a great time to remind people of the perils of letting information slip.
Easy reporting
  • Take away any barriers that prevent staff from alerting you to an issue. Have an email address just for staff to alert you of issues or consider an online form.
A response procedure
  • If people do report issues then you need to have a well-established procedure to deal with them. Get it recorded, and you can even practise to make sure the 72-hour deadline can be met.
Joined up processes
  • Issues which trigger disciplinary procedures may relate to data protection issues and vice-versa. Make sure that there is a section in the guidance for both areas that highlights the risks and include this in your general training and particularly induction training.

So, as you contemplate the upcoming festivities, it may be worth a timely reminder to everyone that we have to consider what we’re saying just as much as what goes into an email.