Last week, a news article popped up: “UK to overhaul privacy rules”. Naturally, it piqued some interest. The UK has debated revoking the GDPR since before it had even finished implementing it; by the time the GDPR came into effect in 2018, the UK had already announced its intention to leave the EU.

However, we have now officially left the EU and completed the transition period. We are no longer obligated to comply with EU regulations. This doesn’t mean the GDPR has disappeared, but it does change some of the rules of the game.

 

We’ve Left The EU. Why Do We Still Have The GDPR?

As of January 2021, the Brexit transition period has ended. EU laws no longer apply to the UK. Therefore, the EU GDPR technically no longer applies to the UK.

The key word here is “technically”. According to the ICO, “The GDPR is retained in domestic law as the UK GDPR”. The UK GDPR writes the same privacy rules into UK law, with just a few minor changes. These laws remain regardless of our status within the EU.

The key difference now that we have left the EU is that we are no longer obligated to keep a privacy framework identical to the GDPR. The UK government now keeps the framework under review itself, and it has the power to scrap the whole thing.

What Will Change: International Data Transfers

While it is now technically possible, it’s unlikely the GDPR is going away any time soon. However, there may be changes that alter how schools and colleges handle their data.

Firstly, updates to current international adequacy agreements would affect bringing on new suppliers.

As part of choosing a data processor, schools and colleges must assess whether personal data leaves the UK. If a company holds data in a country where data protection laws are not as stringent as ours, this puts undue risk on that data. These data processors should implement additional protections for personal data, such as Standard Contractual Clauses.

Adequacy agreements negate the need for these additional protections. The UK has agreed with multiple countries that any data moving from the UK to those countries will be treated with the same security as if it were still in the UK. Since leaving the EU, the UK government has announced plans to strike new adequacy agreements with additional countries. As the UK begins discussions with various nations, and adequacy agreements shift and change, the suitability of different data processors will also vary. Schools and colleges will need to keep an eye on policy changes regarding international data transfers to make sure they have all the appropriate safeguards in place. Should the UK remove a country from the list of “adequate” countries, schools and colleges should either terminate any contracts with data processors there, or ensure there are additional measures in place to protect any transferred data.

What Will Change: The Bigger Picture

Secondly, the UK government have stated that they intend to “improve the UK’s data protection regime to make it even more ambitious and innovation friendly.” In layman’s terms, the UK government want to ease the path to data sharing where they believe it could fuel growth for the country. It’s not entirely clear what this will mean in terms of specific changes to the current privacy rules. However, the UK is unlikely to revoke the GDPR entirely.

The UK already has adequacy arrangements with over 30 countries, inherited via the EU’s GDPR framework. If the UK were to massively reduce its data protection regulations, it would no longer be in line with the GDPR, and it would risk losing its adequacy arrangement with the EU. Seeing as the UK government is keen to increase the number of adequacy agreements and data partnerships, it seems unlikely it will sacrifice free data flows with the entire European Union in order to relax data protection regulations.

Is the GDPR Here to Stay?

It certainly looks that way. The rumoured “overhaul” is currently more of a renovation. The GDPR will continue to provide a framework for data protection and to give individuals important data-related rights.

So, as we start a new year of education and innovation, let’s keep data protection at the core of it all. Responsible use of data builds trust between your organisation and your staff, students, clients and customers. It’s worth keeping that confidence. Without trust, it’s far harder to flourish.

It seems we are slowly marching towards “freedom day”, when the Prime Minister announces the removal of the restrictions that have covered almost every aspect of our day-to-day lives. However, Covid cases are on the rise again, with numbers as high as they were in autumn 2020. We are all hoping the link between rising case numbers and rising deaths has been broken, but we won’t know for sure until we start to resume normal life. The proof is in the pudding, or rather, it’s in the vaccine.

The vaccine is also the focus of today’s blog. Unless you’ve an (understandable) aversion to news channels, or you’ve taken up residence under a rock, you’ll definitely have heard of “Vaccine Passports”: the idea that individuals who are now fully vaccinated will be able to travel freely and attend large gatherings and events unhampered by coronavirus restrictions. The concept has been floating around since before vaccines were even available, and vaccine certification has already been used to run trial events in England.

There are concerns around all aspects of vaccine passes, from logistics to privacy and discrimination. However, for school and college staff who are too busy to go abroad or attend concerts, it’s not currently the greatest concern. Nonetheless, the controversy around the scheme has raised a question that is relevant to education institutions: “Can I ask if my employee is vaccinated?”

Can I Collect Data About Employee Vaccinations?

The first question here is “What are you trying to achieve?” To put it simply, why do you want to collect this data? When collecting personal data, you need to consider two things: is using this data necessary, and is it relevant to a specific purpose?

This is something you should consider when collecting any personal data, but it’s particularly important with data such as vaccination status. Medical data is special category data. It is afforded additional protections, and you need to provide strong grounds for using it.

So, why would you need to collect employee vaccination status? There are a few scenarios where collecting this data might be justifiable. For example, if employees work amongst clinically vulnerable individuals. Additionally, if employees are working in a health or social care setting, where they are more likely to become infected with Covid-19, it might be prudent to record their vaccination status.

Unless your organisation runs differently to the average school or college (for instance, you provide specialist provision or work with clinically vulnerable students), it’s unlikely you’ll have compelling reasons to collect vaccination data. If you are collecting this information simply for monitoring purposes, you’ll likely struggle to find adequate justification.

Justifying the Processing of Vaccination Data

You’ve concluded that vaccine data is necessary and relevant to the running of your organisation; now you need to select a legal basis for processing. If this data is integral to running the school or college, you should probably use public task as your legal basis. If you run an independent school, you would use legitimate interests.

As said before, this can only really be justified if you have a good reason to collect data about employees’ vaccinations. Additionally, you’ll need to identify an Article 9 condition for processing, as vaccination status is health data. You could consider using either Employment Law (Art. 9(2)(b)) or Public Health (Art. 9(2)(i)).

If you are using the public health condition in Article 9, you must take additional steps: either ensure that a health professional carries out the processing, or explain to employees that vaccination data will be treated as confidential and only shared in defined circumstances.

Can We Collect Vaccine Data if We Ask for Consent?

Your organisation could possibly request vaccination details on the basis of consent. Employees could consent to share their data. However, employees would need to provide explicit consent due to the nature of the data. Additionally, consent is a poor choice of legal basis for data processing in the workplace.

Freely given consent is possible in theory, but the theory does not allow for the inherent imbalance of power in the workplace. A key aspect of consent is that it places power in the hands of the data subject: consent should be free of sanctions, or perceived sanctions. It’s unlikely an employee will feel they can refuse consent without facing repercussions at a later date. Employers have power over wages, promotions and job security, and that power means any consent given by an employee is unlikely to be entirely freely given.

As such, consent is an inappropriate lawful basis of processing for most personal data in the workplace.

Should we be processing vaccine information?

The answer may well be no. The pandemic has meant the sharing of information has been vital. Test and trace only works if test results are shared. This doesn’t mean there’s a free-for-all on all things “Corona”. As a general rule, we should only be collecting information we need to collect.

What’s the key take-away?

The collection and use of vaccination data is a good illustration of how to approach any processing of personal data. It comes down to some simple questions:

  • Do we know exactly what data we want to process?
  • Is it necessary, or a legal obligation?
  • Do we have a legal basis to justify processing the data?
  • Can we provide appropriate protection for the data?
  • Can we provide suitable control for the individuals over their data?

If you can’t answer all these questions with a solid ‘Yes’ then the chances are you should not be processing this data.

For today’s post, we’re taking a quick dive into the murky depths of Subject Access Requests.

Imagine this scenario. One of your students, staff members, or anyone you might have information about is standing at the front desk, and they’re asking for all of their data. What do you do? What happens next? For many organisations, it’s a daunting thought.

In reality, answering a Subject Access Request can be reasonably simple. However, the process is easier to understand with a little background knowledge:

What is a Subject Access Request?

 

Every individual is entitled to know what information you are holding about them. Individuals have a range of rights, including the right to rectification and the right to be forgotten, but the right most commonly exercised is the right to access.

Individuals exercise this right by submitting a Subject Access Request. Through this process, individuals can ask you to provide them with a copy of the personal data you hold about them.

An individual has been able to submit a Subject Access Request (SAR) since before the GDPR. Although the process was slightly different, you could request your data under the 1998 Data Protection Act. The GDPR has simplified the process for individuals, so they can exercise their rights with more ease.

Receiving a Subject Access Request

Back to the situation at hand, you’re manning the front desk and you’re face-to-face with a SAR. What comes next?

The first thing to note is that a Subject Access Request can come to anyone within an organisation, and can come in any format. Your organisation may have a specified route for Subject Access Requests such as an email address or phone number, but individuals are not obligated to use these methods.

A Subject Access Request can be verbal or written and can be sent through mechanisms like social media. You can even receive a Subject Access Request via tweet. When you receive a request via one of these routes, it’s best to make a written record of it yourself; you can’t require the individual to put their request in writing.

An individual can send in a Subject Access Request as an instant message like this.

In this scenario, the first thing you’ll want to do is gather a little more information. You can ask the individual for their name, and whether they have any other information such as a reference number to help locate their records.

After that, it’s best to enquire if there is anything specific the individual is looking for. Sometimes an individual is looking for specific information, or data from a specific time-period. They’re under no obligation to narrow the scope of their request, but it can sometimes save them from sifting through hundreds of records and save time for your organisation.

 

A Note About Identification

It is important to confirm that the person making the request is who they claim to be. This process requires some common sense; identity verification must not be used to block access to information. In the case of requestors such as students or staff, you already hold information about their identity, and their presence at your organisation on a daily basis means that further checking is not usually required.

When requestors are less well known, verification of identity is required. This is easiest to do in person. The requestor can show identification documents such as:

  • Passport
  • Current driver’s licence
  • Utility Bill
  • Bank or Building Society Statement
  • Letters from the Benefits Agency
  • Letter from a professional

In the scenario used here, you might not be absolutely certain of the requestor’s identity. You could ask to see identification when they first make the request. If they didn’t have any, you could advise them to return with ID when possible, but that you’ll start processing their request immediately.

Documents like passports, driver’s licences and utility bills can be used to verify someone’s identity.

 

Moving down the chain

Once you’ve taken this information, the next step depends on how your organisation works. For larger organisations, you might have a designated data protection team. For smaller organisations, you may have a single point of contact, or you might be the data protection lead. In particularly small organisations, it’s not unusual for employees to have more than one hat to wear.

Regardless of size, your next step will involve recording the request, and setting the data discovery process in motion. Your organisation has a calendar month to respond to a Subject Access Request, and this begins when anyone in the organisation receives a request, not when the designated data protection lead receives it, so it’s important you get the ball rolling quickly.
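
The one-calendar-month deadline is easy to miscalculate, so here is a minimal sketch (an illustration only, not part of any official guidance) of how a SAR log might work out the response date, assuming Python and the third-party dateutil library. Check current ICO guidance for edge cases such as deadlines falling on weekends, or the clock pausing while you wait for ID or clarification.

```python
# Simplified sketch: one calendar month from the date the request is received,
# using the "corresponding date in the following month" rule of thumb.
from datetime import date
from dateutil.relativedelta import relativedelta  # pip install python-dateutil

def sar_deadline(received: date) -> date:
    """relativedelta clamps to the last day of the month when there is
    no corresponding date (e.g. 31 January -> 28/29 February)."""
    return received + relativedelta(months=1)

# Example: a request received on 31 January 2021 is due by 28 February 2021.
print(sar_deadline(date(2021, 1, 31)))  # 2021-02-28
```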

You should have somewhere to record any Subject Access Requests your organisation receives. It might be a cloud-based system like ours. You should record the request, along with the date it was received and any additional details. If someone makes a written request or sends a tweet, you should record that request verbatim so there can’t be any confusion.

Data Discovery

The request has been recorded and passed on to the correct team. If you’re in a large organisation, that might be the last you hear of the request. If your organisation is smaller, you are more likely to be involved in data discovery.

At this stage you’ll need to look through all your records for any personal data relating to the requestor. If the scope has been narrowed down, you may not need to locate every piece of personal data for that individual. When searching, make sure to use any identifiers your organisation might use: if customers have a reference number, or if individuals are referred to by initials, search for these as well as the requestor’s name. It’s also worth noting that something can be personal data even if none of these identifiers appear in it; a manual read through relevant emails will sometimes turn up one or two results that an earlier filtered search missed.
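
As a rough illustration of that idea, here is a hypothetical sketch of searching a set of exported text records using several identifiers at once. The function names and sample records are invented for the example; real discovery will span email, your MIS, shared drives and paper files.

```python
# Hypothetical sketch: search exported records for every identifier a
# requestor might appear under (full name, abbreviated name, initials,
# reference number).
import re

def build_patterns(full_name: str, reference: str | None = None) -> list[re.Pattern]:
    parts = full_name.split()
    initials = "".join(p[0] for p in parts)
    terms = [full_name, f"{parts[0][0]}. {parts[-1]}", initials]
    if reference:
        terms.append(reference)
    return [re.compile(re.escape(t), re.IGNORECASE) for t in terms]

def search_records(records: list[str], patterns: list[re.Pattern]) -> list[str]:
    return [r for r in records if any(p.search(r) for p in patterns)]

records = [
    "Meeting note: JS raised a concern about lockers",
    "Email to J. Smith re: timetable change",
    "Invoice REF-0042 issued to parent",
]
print(search_records(records, build_patterns("Jane Smith", "REF-0042")))
# All three example records match at least one identifier.
```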

This can be quite a complex process, and if you’re struggling to hit the deadline, it might be worth looking at extensions.

Redact and Return

The data has been found and collated. Now you need to provide it to the individual. In general, SARs are sent out in electronic format, but a good rule of thumb is to respond to a request in the format in which you received it. In this case, the request was verbal, and it wouldn’t make sense to try and provide the response verbally. Instead, the data protection team will use the contact details you collected at the start of the process and email out the response. Good for the bees, good for the trees, and easily accessible to the requestor.

However, before this data can be sent out, it must be redacted. An individual only has the right to access their own data, so anybody else’s personal data needs to be removed.

There are a few different methods to redact information, either in hard copy or in digital format. A quick scribble in black pen seems to suffice in the movies, but it’s not a reliable form of redaction. To ensure you don’t give away data you shouldn’t, have a look into electronic redaction, which covers the data and then deletes it from underneath so it cannot be recovered.

An individual is only entitled to their own personal data. You need to redact or remove other people’s information.
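
To illustrate the principle, here is a minimal, hypothetical sketch of redacting third-party names from plain text before release, so the underlying data is genuinely removed rather than just covered up. The names are invented for the example, and for PDFs or scanned documents you would use a dedicated redaction tool rather than a script like this.

```python
# Hypothetical sketch: remove (not just hide) third-party names before release.
import re

def redact(text: str, third_party_names: list[str]) -> str:
    for name in third_party_names:
        text = re.sub(re.escape(name), "[REDACTED]", text, flags=re.IGNORECASE)
    return text

email = "Jane Smith spoke to Mr Jones and Mrs Patel about the incident."
print(redact(email, ["Mr Jones", "Mrs Patel"]))
# Jane Smith spoke to [REDACTED] and [REDACTED] about the incident.
```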

The end of the line

A month has passed, and your student or staff member has received an email with the personal data they requested. They’re happy, you’re happy, and there have been no inadvertent data breaches.

Subject Access Requests might seem daunting to begin with, but they just require a little teamwork and some forward planning. With a step-by-step process, you can handle a SAR with ease.

Here’s a good news story to kick off the new month…

 

The GDPR has provided a whole new framework for data protection, a framework that is centred around an individual’s right to privacy rather than an organisation’s desire for data. Your rights are now stronger and clearer, and organisations must safeguard data and be transparent about how they use it.

Individuals have benefited from tighter data security, greater control over their information and clearer requests for consent. However, the GDPR has also brought a less obvious benefit: it’s good for the environment!

What does GDPR have to do with the environment?

At first, it’s quite hard to see the connection. How can a data protection law affect climate change? This becomes clearer when we remember that the majority of data these days is held electronically. We use electricity to run computers, servers and routers, as well as to manufacture computing equipment. Each email sent creates around 5g of CO2. This may not sound like a lot, but when the average office worker sends and receives 140 emails a day, it all adds up.
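
As a back-of-envelope sanity check using the rough figures above (5 g per email, 140 emails per worker per day, and an assumed 250 working days a year), here is the arithmetic:

```python
# Rough, illustrative arithmetic only; the inputs are the estimates quoted above.
CO2_PER_EMAIL_G = 5
EMAILS_PER_DAY = 140
WORKING_DAYS_PER_YEAR = 250  # assumption for a typical office worker

daily_g = CO2_PER_EMAIL_G * EMAILS_PER_DAY           # 700 g per day
yearly_kg = daily_g * WORKING_DAYS_PER_YEAR / 1000   # 175 kg per year
print(f"{daily_g} g per day, roughly {yearly_kg:.0f} kg of CO2 per worker per year")
```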

So, using the internet has an environmental impact, but that still doesn’t explain why the GDPR has been good for the environment.

Infographic: Environmental Benefits of the GDPR. A simple image of green hills with wind turbines, with a green text box underneath: “Data Protection and Environmental Protection may seem completely unrelated, but they are linked through electricity. We power our digital world through electricity, often generated using fossil fuels. So, every time we process digital data, we’re adding to our carbon footprint. The GDPR is helping us reduce CO2 production.”

 

We’re receiving fewer emails

Since the GDPR came into effect, organisations have reduced the number of marketing emails they send. The GDPR made consent requirements more stringent: organisations must provide an opt-in mechanism for marketing emails, rather than an opt-out mechanism. This means that only those interested in a product or service receive marketing emails. Not only is this better for individuals, but it also cuts down emissions. The reduction in the number of emails sent means that over 360 tonnes of CO2 are saved every day, roughly the same as a flight from London to New York.

Websites are Slimming Down

Additionally, many websites have cut the number of third-party cookies and ad trackers they use. A report from Jet Global found that since the implementation of the GDPR, UK news sites have 45% fewer third-party cookies on their sites. UK companies aren’t the only ones to slim down. When the GDPR came into effect, USA Today ran a cut-down version of its site for EU customers. This version, with all the tracking code removed, was one-tenth of the size of the original and took just 3 seconds to load, compared to 45 seconds. Slower, heavier sites use more energy, and most of this energy comes from fossil fuels and other non-renewables. Digital sustainability expert Chris Adams estimated that the cut-down version of USA Today saves about as much CO2 per day as a flight from Chicago to New York.

 

The GDPR is good for individuals, and is good for businesses. Turns out, it’s also good for the planet.

 

In schools, responsibility for children’s mental health and wellbeing is clearly documented. Legislation such as the Children Act 1989, and guidance such as “Keeping Children Safe in Education”, set out the responsibilities of staff and governors. Furthermore, it is made abundantly clear that data protection concerns should not prevent action from being taken to support the welfare of a child or young person.

However, these requirements only apply to those under 18. In the Keeping Children Safe in Education documentation, all individuals under the age of 18 are classed as children. Yet in higher and further education, most students are adults. This makes management of their welfare a more complicated puzzle.

 

The Statistics: Mental Health in Universities

In a 2020 survey of 1,800 university students, Randstad (a human resources consulting firm) found that nearly 40% of participants felt their studies were affecting their mental wellbeing. Of students considering leaving their course, 55% considered mental health decline a leading factor in their decision. There is plenty of evidence that attending university can create stress for students and exacerbate mental health problems.

There is also plenty of evidence that early intervention supports recovery from mental health problems, but this can only happen if those who can intervene know that it’s needed.

One obvious source of support would be parents, or another trusted adult. Many parents want to provide that support, but if the student won’t ask for help, does the university or college have the right to get in touch and flag the concern?

To disclose, or not to disclose?

Let’s think about this from a pure data protection standpoint. A data controller is in possession of special category personal data (concerns about mental wellbeing) relating to an adult. Without explicit consent or a statutory duty, can the data controller disclose that personal data to a third party?

The obvious answer to this question is no. It would be a clear breach of confidentiality.

The issue is that, from the perspective of the college or university, they can only go so far in providing welfare services. Surely, then, sharing concerns with parents or guardians would be justified in the best interests of the student?

This is where everything becomes complicated. At a point of crisis, where an individual’s life is at risk, the Vital Interests lawful basis could allow an emergency contact to be made, but only to alert the contact of the emergency. Universities can only disclose additional detail if the individual is incapable of giving consent.

If mental health intervention is most effective before an emergency arises, we can’t rely on Vital Interests to disclose health data at that earlier stage. We also have to consider that the student may have good reasons for not wanting their parents to be contacted, even in an emergency. It’s possible the issues relate directly to the parents or to their attitudes towards the student. In some cases, informing parents could put the student at risk of serious harm.

How can the institution know the family background of the thousands of students they deal with? Worse still, if parents were contacted when a concern was flagged, then this might put students off coming forward in the first place.

 

Creating a new infrastructure

Information cannot be disclosed without consent. For a person with declining mental health, getting that consent can be difficult. Depending on the particular problem being faced, that consent may not even be considered freely given.

As such, the best time to address this problem is during the process of application and enrolment. A structured opt-in scheme with very clear rules could be established at universities, so students can give their consent for concerns to be shared with a named contact. This scheme could be promoted in school during applications, and by universities during open days.

This wouldn’t be used if students miss a lecture or don’t hand work in on time. University is not school. While most students are on the younger side, they should be treated with the same respect given to any adult. However, for serious concerns about a student’s wellbeing, this opt-in mechanism would enable help to be sought before a crisis. Should a student not consent or withdraw their consent later, the university or college would follow the normal welfare process, giving students freedom of choice. For those who do consent, this scheme would act as an additional safety net, giving students and parents a level of reassurance.

 

A Widespread Solution

Opt-in systems are appearing at various universities, but it’s unclear whether these will be adopted more widely. UCAS, who manage most university applications, currently have nothing in place in their application process. Furthermore, the Office for Students say that “individual universities are responsible for developing their own mental health policies.”

As such, it’s unlikely there will be a standardised process across all colleges and universities. However, even if the Office for Students does not mandate an opt-in system, universities and colleges could implement one independently.

This is the most obvious method of balancing the issues of consent and concern. Yet, a recent Freedom of Information Request found that of 149 responses, only 32 higher education institutions had a system for students to opt-in to parent/guardian contact.

 

Finding the Perfect Balance

From a broader perspective, this debate raises questions about where duty of care lies at university. From a data protection perspective, it’s relatively simple. An individual’s right to consent or object to processing/sharing of data should be respected.

A consent-based system gives students the choice they are entitled to, while still allowing a student’s support network to be contacted if they are in crisis. 96% of students at the University of Bristol opted in to their welfare scheme, and the university used the scheme 36 times in the first year. By stepping back, and adding additional infrastructure designed with data privacy in mind, higher education institutions can keep both data protection and welfare as priorities.

 

We’ve reached a new checkpoint in Boris’s Covid roadmap. Yesterday, non-essential shops reopened, and many flocked to their local pub to enjoy a pint outside. For many, yesterday also marked their first day back in the office. While teaching staff have been back for a few weeks now, for others the full-time return to the office is only just happening. While returning to the office is cause for celebration, we should also take the opportunity to renew our data protection vigilance.

Many organisations have used lockdown as an opportunity to reshuffle the office. Speaking from experience, it can be a little jarring when you come back to work and find your desk in a whole new place! 

 

Separating Work and Home Life

 

It’s important to acknowledge that changes like this can leave people a little uncertain, and that working away from the office can lead to lapses in data protection practices. Now is a good time to check staff still have the little things in place, such as locking computers when they leave their desks. It’s also important to remember that not all information is suitable for sharing with colleagues, particularly if you’re in an office with people from different teams.

Now is also a good time to check your digital distancing. Lockdown compressed our work lives and personal lives into a single space. Now, much like social distancing, we need to leave enough space between those two lives that data can’t cross from one to the other. While working from home, you may have had to use your own devices, or store things in personal drawers in your house. Now we have access to the office again, take a few moments to assess your home workspace. Have you left any paperwork at home? Do you still have access to work emails and files on your home computer?

Another common occurrence in lockdown has been the use of personal mobiles for business related tasks. We’ve talked previously about some of the dangers of this, but for many, it has seemed like the only choice to keep organisations running smoothly. Now we can return to our usual methods of communication, it might be best to close down any work WhatsApp or Messenger groups, as they greatly increase the risk of a data breach. 

Moving Forwards Safely

 

 It’s wonderful to be back in the office, and it’s wonderful to see colleagues face to face, albeit from a safe distance. It might take a little while for things to feel normal again, but hopefully this is a permanent step towards ‘Business as Usual’.  As we move forward along the Covid roadmap, we can start getting excited for holidays and weddings, as well as indoor sport and museums. However, we should also keep a careful eye on our behaviour, so our excitement isn’t dampened by a data breach and its consequences. 

And on that note, may we move cautiously, but optimistically, to May 17th, the next step in our journey to normality.   

 

 

When you purchase a product or use a service, at some point you will probably receive a feedback form. It’s almost an inevitability.

It might be a form that arrives by email, or an irritating pop-up in an app. Recently, if you use a smart speaker you may get a notification telling you “Two months ago, you bought cat food, how many stars would you give this product?” It’s easy enough to answer the question, although depending on how irritating the interruption is, the validity of the feedback is questionable!

Whenever these pop-ups appear, you’re told “Your responses will remain anonymous”. It’s such a common appearance that most of us probably don’t even notice. With the smart speakers, there is no privacy information at all. We all assume our feedback is anonymous. Maybe it’s worth taking a step back and asking ourselves “What is anonymisation anyway?”

 

What is Anonymisation?

Anonymous data is defined in recital 26 of the GDPR as “Information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable”.

Anonymous data is, therefore, not subject to the provisions of the UK GDPR. However, anonymisation is not as simple as removing names and addresses, particularly with the new definition of personal data. The UK GDPR defines personal data as data relating to an identified or identifiable natural person.

 

Understanding Identifiability

To understand the breadth of identifiability, let’s look at a stock image.

A long-exposure stock image of commuters waiting for a train in London. The photo is too blurry to easily identify the individuals in the picture.

This image is from Adobe Stock, a website where you pay a licence fee to use images in commercial works. If I hadn’t told you where the image came from, you could find out quickly with a reverse image search. We’re not going to get into the debate here about invasion of privacy by the photographer, or whether publishing the image for sale truly puts it into the public domain.

If we go on the Stock image website, we’ll find the name of the photographer who took the picture. We could then contact the photographer and ask about this particular photo. The photographer might say that it was a candid photo, without any models, but that they took it at 6.40am on the 5th October 2020.

You could then canvass around the station at the same time of day the photographer took the picture. Given how many people use the tube for their daily commute, there’s a distinct possibility you’ll find some of the people in the photograph.

In three or four steps, you can identify the individuals in the photograph. The individuals are identifiable, so this picture could be defined as personal data. It’s easy to see from here why anonymisation is a harder task than it used to be.

 

True Anonymisation

So, are our responses to those rating questions anonymous? The answer to that question is “maybe.”

If the data is requested and collected in a way that provides the rating to the company with no other details, then we could say the feedback was anonymous.

However, let’s take an experience that many of us are familiar with. You download an app on your phone and happily set about completing puzzles, building civilisations or destroying aliens. After a while a request pops up asking for a review.

For Apple users, this is all provided by the App Store. Interestingly, an application provider may only request a rating three times a year. The application provider must therefore record how many times they’ve sent a pop-up notification. So, it’s clear they must store some personal data.

Seeing as the App Store handles ratings and reviews, you could consider Apple as a data processor, running the review process on behalf of the App developer. So, maybe they are processing personal data after all.

Let’s think about a simpler example. You run an event in a school and you ask for feedback afterwards. Let’s say you send out a link to a Google Form, and someone comments on the lack of wheelchair access, or on rapidly flashing lights shown without a warning. If you have one person who uses a wheelchair, or one person with photosensitive epilepsy, then the anonymity of the feedback is very much weakened.

 

Managing Anonymisation

The bigger question is “Do you actually need perfectly anonymous data?”

For the education sector, feedback is essential to improve teaching, educational resources and student wellbeing. Educational organisations often need to show their commitment to progress and equality. The publication of statistical data can support that.

The UK Data Service provides advice on anonymising both Quantitative Data (numbers and statistics) and Qualitative Data (opinions, statements and written responses).

Ultimately, if you take sensible anonymisation measures (or sensible alternatives such as pseudonymisation) and you protect the data you gather as though it were personal data, the risks can be cut substantially, and you can get on with driving improvements based on the results of your feedback.

 

 

Last week, a global hacking campaign targeted Microsoft Exchange servers, and compromised hundreds of UK companies. It was estimated that more than 500 email servers in the UK were hacked, alongside many more across the world. Attackers used newly discovered vulnerabilities in the software to gain access to data, or to install ransomware.

Ransomware can cripple an organisation, with hackers locking the organisation out of their own servers and removing access to data unless the organisation hands over a hefty fee. Attackers often delete or sell the data they held hostage, even if the victim pays the ransom. We’ve talked about the damaging impact ransomware can have on operations in previous posts, such as the Travelex incident in early 2020. The company spent several weeks unable to function, with all of their systems offline.  In short, a ransomware attack can bring an organisation to its knees.

A ‘Zero-Day’ Hack With Widespread Damage

The recent hack has been particularly damaging due to multiple factors:

Firstly, thousands of organisations use Microsoft Exchange. These range in size from large corporations like Metro and the Independent to individual schools with a handful of students. Smaller organisations may not have dedicated IT staff, so they are less likely to spot growing problems, and may miss a patch that would have removed a vulnerability before it could be exploited. When an attack compromises widely used software, small organisations often suffer the most disruption.

The second factor in this hack is the type of vulnerability that was exploited. According to Microsoft, hackers used techniques that had not been seen before. This means that attackers knew of vulnerabilities in the Microsoft Exchange software before the software developers did. This is referred to as a “zero-day” vulnerability: the developers have “zero days” to fix the problem that has just been exposed, and perhaps already exploited by hackers. Software vendors must work quickly to release a patch while the world waits and customers are at risk. If developers fail to release a patch before hackers exploit the security hole, the “zero-day” vulnerability becomes a “zero-day” attack.

Preventing Zero-Day Attacks:

While these attacks can lead to personal data breaches, zero-day attacks are a broader cyber-security issue. In organisations such as schools and colleges the two issues overlap; most of the data held on systems such as Microsoft Exchange will be personal data.

It might be worth having a specialist on call should you run into a problem; some insurance policies can provide access to this type of expertise. More complex preventative measures require a more detailed understanding of IT, but there are still some simple things you can put in place to reduce risk:

  1. Ensure you have Firewalls and Anti-Virus software in place, and you update the software regularly.
  2. Make sure to install any new patches or updates released for your software. These patches often fix vulnerabilities in the software.
  3. Keep an eye on the news. If software you use appears as part of a hack or cyber-attack, letting IT staff know as soon as possible gives them a head start in tackling any issues that arise.
  4. Ensure your organisation has a secure backup in place, and that you hold the backup separately to your main servers. Should hackers delete your records, you may be able to retrieve lost data from your backup.

Disaster Recovery and Workforce Education

These are just a few ideas for keeping your organisation safe from cyber-attacks. However, you can’t prevent every single attack. The nature of zero-day attacks means that you don’t know about a vulnerability until after an attack. Therefore, having a disaster recovery plan is useful, should you need to deal with such a situation.

A final point. In this post we’ve explored some of the aspects of a personal data breach caused by a cyber-attack, rather than human error. Chances are, the majority of data breaches you encounter will be caused by human error. The preventative measures discussed above are important for reducing the risk of a cyber-attack, but you should combine them with workforce education and a strong data protection ethos. A breach caused by an individual can have just as damaging an effect as one caused by code.

 


Last month, the UK Commissioner for Public Appointments posted an advertisement for a new Information Commissioner. Current Commissioner Elizabeth Denham previously announced that she was leaving her post in October, having overseen the UK’s transition to new data protection laws.

Whoever is hired will be stepping into quite a sizable pair of shoes. Data protection complaints doubled in 2018/19, from around 21,000 to over 40,000 complaints. There were slightly fewer complaints in 2019/20, but the number is still far higher than before the implementation of the GDPR. This is not necessarily a bad thing; the rise in numbers is partially due to more stringent rules, but it has also come from increased public awareness around data protection. Individuals are learning to be a bit savvier about their data, and they’re learning where to go when they feel their data is being misused.

Awareness is important in current times, as we are producing personal data almost all the time. All our time on electronics, our use of smart technology, and even the signing of a humble visitor book. It all creates data. In 2025, we’re due to hit a total of 175 zettabytes of data in the global datasphere. A single zettabyte is around a trillion gigabytes. Not all of that is personal data, but the numbers are still rather overwhelming to think about. Increasing amounts of data, and increased pressure on the Information Commissioner’s Office (ICO), must make the job of Information Commissioner rather daunting to apply for.   

 

What is an Information Commissioner?

The Information Commissioner is the head of the ICO. For the most part, they act as the public head of the body and lead the organisation through its strategic development. The Information Commissioner looks at the big picture, while those employed by the ICO manage the day-to-day duties of the office.

In the government’s advertisement for the role, they describe some of the duties of the ICO: 

  • Give advice to members of the public about their information rights; 
  • Give guidance to organisations about their obligations with respect to information rights; 
  • Help and advise businesses on how to comply with data protection laws;  
  • Gather and deal with concerns raised by members of the public; 
  • Support the responsible use of data;
  • Take action to improve the information rights practices of organisations; and 
  • Co-operate with international partners, including other data protection authorities. 

 

To a Data Protection Officer (DPO), this role description might sound familiar. Most Data Protection Officers work within a smaller remit, and unfortunately there’s rarely an opportunity for an international visit. However, many of the same principles apply. Data Protection Officers need a good memory and a keen eye for relevant legislation. In schools and colleges, they also need strong problem-solving skills. In an environment with a lot of personal data, and quite a few wildcards, a DPO needs to be prepared for a brand-new scenario every day. Taking a piece of legislation (written largely for businesses) and translating it into practical solutions for the education sector can take some talent. Maybe the public appointments office needs to have a look round schools for their next candidate.  

In the second instalment of our Emerging Tech series, we look at the development of commercial genetic testing, and the data protection implications of widespread genetic screening. 

 

“Customers who are genetically similar to you consume 60mg more caffeine a day than average.” 

“You are not likely to be a sprinter/power athlete” 

“Customers like you are more likely to be lactose intolerant” 

“You are less likely to be a deep sleeper” 

These are all reports you can get from commercial genetic testing, from companies such as 23 and Me, Ancestry.com, MyHeritage and DNAfit. We’ve talked about the rise of genetic testing before, but recent announcements from Richard Branson have brought the topic back into discussion.

Earlier this month Richard Branson announced he was investing in 23 and Me, and the company would be going public (meaning shares will be traded on the New York Stock Exchange). This push for growth and investment has reopened the proverbial can of worms, and people are once again considering the privacy implications of genetic testing. 

What is genetic testing?

Genetic testing takes a DNA sample, such as hair or saliva, and identifies variations in your genetic code. These variants can increase or decrease your risk of developing certain conditions. These tests can also identify variations in ‘junk DNA’ that have no impact on your life, but can be used to identify relatives and ancestors. 

Genetic screening first appeared in the 1950s. Researchers later developed more detailed DNA profiling in the 1980s, used for crime scene investigation. Technology has come on in leaps and bounds since then: once an expensive and specialist feat, genetic testing is now available as a reasonably affordable kit in many pharmacies or online. In Estonia, the government is offering genetic testing to citizens to screen for predisposition to certain conditions, helping individuals act early with personalised lifestyle plans or preventative medication.

There have been suggestions to utilise genetic screening in the Education sector as well. In 2006, two years before 23 and Me began offering their first testing kits, geneticists suggested schools as the perfect place to carry out widespread screening. Researchers have also investigated the possibility of genetically informed teaching, with teaching style tailored to an individual’s predisposition to certain learning styles. 

For those outside education, the biggest development has been Direct to Consumer (DTC) genetic testing. DTC testing began mostly as a tool for ancestry identification; now there are millions of consumers, and even companies offering tailor-made nutrition plans designed around your genetics.

I find myself writing this a lot, but it sounds like science fiction. Yet again, the science of today has caught up with the fiction of yesterday. However, if growing up surrounded by shelves of sci-fi has taught me anything, it’s that a cautious approach is often best. This is definitely true of genetic testing. There are many possible advantages, but there are also risks.

A Breach with Big Implications:

Data breaches are always a possibility when you entrust your information to someone else. However, genetic data is clearly a sensitive type of personal data, particularly if a customer has opted for genetic health screening. 

Companies will put swathes of protective measures in place, but in a world where a cyber-attack occurs approximately once every 39 seconds, there will be breaches. In fact, there already have been. In July last year, hackers targeted the genetic database GEDmatch, and later used the information to target users of MyHeritage. Even without cyber-attacks, breaches occur. When recovering from the recent hack, GEDmatch reset all user permissions. This opened up over a million genetic profiles to police forces, despite users having opted out of visibility to law enforcement.

If genetic testing is ever to be used in schools or offered nationwide, one key issue will be ensuring that data is held securely. If schools and colleges offered genetically informed teaching, they would have to hold that data too. Adequate security measures for such information can be difficult to manage, particularly if education budgets stay the same. Infrastructure would require radical change before genetic testing could ever be implemented safely.

Breaches are nothing new, but with such precious data, they can be worrying. 

Secondary Functions and Sources of Discrimination:

Under the Data Protection Act, data controllers must set out what they will use your personal data for. They cannot use that data for unrelated purposes without informing you. However, over recent years, there have been several cases where ambiguity over who can access genetic data has made the news.

Individuals can opt in to share their data with 23 and Me research teams. Many customers were comfortable with researchers using their data for medical advances. It was not until the public deal with GlaxoSmithKline that it became clear genetic data was being passed to pharmaceutical companies for profit.

This data was anonymised, so the outcry following the announcement was more about ethics than data protection. However, there have been multiple cases where companies have allowed law enforcement to access their databases, despite stating otherwise in their privacy policy. 

Your genetic data reveals a huge amount about you and your characteristics, so it’s important to know exactly who can see it. For example, variations of the MAOA gene have been linked to levels of aggression, as well as conditions such as ADHD. Identification of these types of variants could help employers find individuals more likely to succeed in their field. However, it could just as easily lead to discrimination in hiring. Researchers have also linked other conditions, such as bipolar disorder, to certain genetic variants. Should that information be available to employers, it might lead to workplace discrimination, for example bosses declining to promote individuals they think might later become “unstable.”

There has been speculation that biological data could be used for identifying terrorist subjects, tracking military personnel, or even rationing out treatment in overstretched health systems. This is all speculation. Even so, there are fears of discrimination based on the possibility of you developing a certain condition or trait. 

The Risk of Re-identification:

The speculation above works on the basis of genetic data being individually identifiable. Companies use anonymisation to reduce the risk of such discrimination, and genetic companies go to great lengths to separate genetic data from identifiers: for instance, they anonymise data used for research purposes, or store personal and contact details on a separate server from the genetic data. The view has always been that if you separate personal identifiers from the raw genetic data, the individuals remain anonymous.

Unfortunately, research has already shown that it is possible, in principle, to identify an individual’s genomic profile from a large dataset of pooled data. It’s an interesting thought. Companies are often quite willing to share anonymised data for additional purposes, because it is no longer personal data and isn’t protected with the same legal safeguards. But if a data subject can be re-identified, the data requires the same levels of security and legal protection as personal data. Dawn Barry, cofounder of genetic research company LunaDNA, said “we need to prepare for a future in which re-identification is possible”.

If this data could be re-identified, it raises questions over the definition of anonymity. It also reignites the discussion over who Genetic Testing companies should be sharing data with. 

Understandable Worries? Or Needless Fear?

Schools and colleges have always been a proving ground for new technologies. It’s worth remembering that fingerprint scanning had been used in UK schools for over ten years before the Protection of Freedoms Act caught up and required parental consent.

It would be easy to see how a “scientifically based, individualised learning experience” could be presented as an ideal way of helping all students achieve the best outcomes.

Interestingly, Direct to Consumer genetic testing has now been available for just over a decade, so there is still plenty of room for development. However, we’re still some way from it determining the day-to-day life of students in education.

Here’s a sobering thought though. Should the worst happen, and something compromises your data, you can change your passwords, you can change your bank details. You can even change your appearance and your name. You can’t change your DNA. We’ve got to keep that in mind as the world of biometrics continues to grow. 

Next time, we’ll look at remote learning and the technologies that are being developed for the virtual classroom. Find previous posts from this series here.