The EU-US Privacy Shield, a framework designed by the U.S. Department of Commerce and the European Commission, has been struck down by the European Court of Justice (ECJ).

The framework, approved by the EU in 2016, has been at the centre of several international discussions for the last few years. The program allowed companies to sign up and certify that their data protection practices were in line with EU law.

With the introduction of the GDPR, the EU-US Privacy Shield acted as strong evidence that a company was holding data in a compliant manner, allowing organisations to justify using American firms and sending data out of the EU to be processed.

However, European data protection law states that data should only be transferred out of the EU if there are appropriate safeguards in place. Due to differing laws on national security and law enforcement, the ECJ has now deemed that US domestic law does not provide appropriate safeguards for personal data, and that “Surveillance programs … [are] not limited to what is strictly necessary”.

Therefore, the agreement between the European Union and the USA has been struck down.

What does this mean?

In practical terms, this ruling could have a substantial impact on education settings. Most schools and colleges use various apps and games to facilitate learning, and many institutions will be using external software to manage HR and finance. Some of the apps and software used will be capturing personal data, hosted on servers within the US.

Previously, if a school or college wished to use a company hosting data outside of the EU, they could check whether the company was signed up to the Privacy Shield. The Shield gave organisations sufficient comfort that their data would be handled in a compliant manner. With that safeguard gone, it will be harder to prove that using a data processor from outside the EU is an appropriate action.

Moving Forward:

This is likely to affect a proportion, but not all, of your organisation’s suppliers. A list of companies signed up to the scheme can be found here; below are a few examples used in the education setting:

Mailchimp: The automated email company stores all data on servers in the US, including recipient and sender details.

Consilio: A firm supplying legal services and software, Consilio provides hosting, processing and analysis on data, and is hosted on servers in the US.

Egress: Egress provides secure email and file transfer for many organisations, including the NHS and many councils. However, Egress does have servers within the UK.

It’s worth noting that in some cases a supplier may have their main services based in the EU, while sub-processors or performance management functions are based in the US. Zoom is an example of this, but you can relax: Zoom relies on Standard Contractual Clauses (SCCs), not the Privacy Shield.

However, SCCs do not prevent the US Government from accessing data hosted within the country, and the individual who brought this case to the ECJ has now taken aim at SCCs, so watch this space.

A review of your suppliers is now in order and you may have to make difficult decisions.

With more than 3,000 cases of the new coronavirus confirmed, Italy has announced that it will be shutting all schools for 10 days, to slow the spread of the disease. With cases beginning to increase in the UK, the possibility of similar action being taken here is also increasing.

Most students, teachers and lecturers are currently working on business as usual, but behind the scenes, admin staff are frantically devising plans for remote learning. As the spread of the coronavirus cannot be easily predicted, institutions need to be prepared to continue the working day from home at a moment’s notice. The difficulties of such an endeavour are more complex than you might expect.

An Unexpected Break:

In the event of mass school closure, few primary and secondary schools have the infrastructure to support remote learning. This is less of a worry for Higher Education, as many universities already have lecture streaming and recording facilities in place. They often also use virtual learning sites such as Moodle. Lots of university students already use their personal devices to access work, so while closures wouldn’t be ideal, they’d be manageable.

For schools, closures are much more likely to be an issue. Setting up a secure way to share lessons and resources online takes time. It often also takes money. For schools and their increasingly tight budgets, bespoke software and quick fixes are far out of reach. Various free or cheap file sharing sites such as SharePoint, Google Drive and Dropbox can make sending resources to students possible, but returning work and grades can easily become a messy affair.

Even if personal data is transferred between staff and students in a secure manner, it’s very difficult to control where it goes next. Most services offer the ability to download files onto your device, and personal devices are under far less scrutiny than those owned by schools and universities. SARS-CoV-2 (the virus causing the disease) is not the only virus that threatens institutions. Should a teacher have a computer virus on their device, the data of their students could be compromised. Does this mean that schools need to provide malware protection for their employees? It’s another cost and another worry to add to the list.

Finding a Way for Face Time:

Assuming schools can find a secure solution to send work home, textbooks and tests aren’t a patch on a good teacher. File sharing can keep children learning long division at home, but what about ‘W plane transformations’ in A level Further Maths?

Many schools are trying to find a solution so that teachers can still run lessons for their pupils. However, with little preparation time, schools are turning to personal accounts on media like WhatsApp and Skype. Anyone who has had any involvement in education knows that this rings warning bells. Sharing personal accounts poses significant safeguarding risks. Regulating a private video call between staff and students is nigh on impossible. It would be the responsibility of the individual to record and report any issues, and it’s far too easy to say “I forgot to record the meeting”. Regardless of the safeguarding issues, should your call be recorded by the application you use, your personal data is being held by yet another data processor.

Increased Ratios. Increased Risk:

Should schools and universities remain open, Covid-19 could still cause problems. The Prime Minister has announced that relaxing rules on class sizes could be used to combat staff absences, should the virus take hold. While this could reduce issues with childcare and education, it’s likely to increase pressure on staff. If students are moving class, their data is moving too. Teachers will not only be working under the increased pressure of cramped classrooms, but will also be responsible for safeguarding more personal data than usual. These are the ideal conditions for a serious data breach. Just working in a different classroom poses risks. When all this is over, no one will be completely sure where all the student and staff records have ended up.

A Necessary Compromise:

Every school, college and university across the country is desperately trying to find a balance. A situation where safeguarding is still prioritised, data protection is ensured, and students can still receive the quality of education they deserve. There is no simple answer, and there will be much relief when the crisis has passed.

For now, we will be closely watching the situation as it unfolds. Preparation is key, and according to Boris Johnson, a lathering of soap and singing Happy Birthday twice over will save us all.

Last week, the University of East Anglia (UEA) paid out over £140,000 in compensation to students affected by a 2017 data breach. An email containing information on personal issues, health problems and circumstances such as bereavement was mistakenly sent to 300 UEA students. The email contained sensitive personal data of over 190 people. UEA reported that the email was mistakenly sent to a group email address that autofilled when the sender started typing. This very simple mistake had a severe impact on hundreds of already vulnerable students.

This is an all too common example of how a simple slip can have a major impact. As Outsourced DPO for many schools and education institutions, we’ve seen just how often these mistakes are made. A misdirected email is one of the most frequent breaches logged by our customers. These breaches are not only prevalent; mismanagement of communication can have a devastating impact on an organisation.

What’s more, there are additional risks in Higher and Further Education. Unlike in trusts, student information is often held centrally, rather than in separate faculties or sites. Universities such as the University of London, The Open University and The University of Manchester all have over 100,000 students on their enrolment lists. When mistakes happen — and they will happen — a lot of people could be affected.

As this data breach occurred in 2017, the General Data Protection Regulation was not yet in effect. Had the Information Commissioner’s Office (ICO) decided to fine the University, the amount would have been far lower than if the breach had occurred today. However, the ICO decided that no punishment was required. Yet, while the regulatory consequences were low, the University was not absolved of its responsibility to the affected students.

As compensation for the damage from the breach, UEA paid a large sum to all affected students. The breach also damaged UEA’s reputation. Prospective students and their parents may worry about attending a university that is known for leaking personal data. Due to the high levels of media coverage, this breach could affect UEA’s admission rates for quite some time.

How can this be prevented?

The key tool here is knowledge. Providing training on good data protection practice can help staff stay on top of breach risks. The more aware they are, the more mindful they will be when handling personal data. Training can also help staff put their own preventative measures into place, such as adding a ‘grace period’ to your email account. This puts a delay on your sent emails, allowing you to cancel them if you realise they are being sent to the wrong person, or hold the wrong information.
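The grace-period idea can be sketched in a few lines of Python. This is purely illustrative — `Outbox`, `send` and `dispatch_due` are hypothetical names, not a real email client API, and the 60-second delay is an assumption:

```python
from datetime import datetime, timedelta

# Illustrative "grace period" outbox: messages sit in a queue for a
# short delay before dispatch, giving the sender a window to cancel.
GRACE_PERIOD = timedelta(seconds=60)

class Outbox:
    def __init__(self):
        self._queue = []  # list of dicts: release time, message, cancelled flag

    def send(self, message, now=None):
        """Queue a message; it only leaves once the grace period has elapsed."""
        now = now or datetime.utcnow()
        self._queue.append({"release": now + GRACE_PERIOD,
                            "message": message,
                            "cancelled": False})

    def cancel(self, message):
        """Pull a message back before its grace period expires."""
        for item in self._queue:
            if item["message"] == message:
                item["cancelled"] = True

    def dispatch_due(self, now=None):
        """Return messages whose grace period has passed and were not cancelled."""
        now = now or datetime.utcnow()
        due = [i["message"] for i in self._queue
               if i["release"] <= now and not i["cancelled"]]
        self._queue = [i for i in self._queue if i["release"] > now]
        return due
```

In practice, most mainstream email clients offer this as a built-in “delay delivery” or “undo send” setting, which is the simpler route for staff.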

Other actions include adding the email recipient last, after writing the email and checking attachments. This gives staff the chance to double-check emails and stops incomplete messages from being sent. A motto we encourage among our customers is “check twice, send once.” Taking the time to review a piece of work saves time and stress in the long run.

What about when breaches can’t be prevented?

It’s important to note that breaches will happen. However hard your organisation works at prevention, mistakes happen to the best of us. Therefore, it is important to put a plan in place to mitigate the effects of a breach. Having a clear pathway for recording and managing breaches can ensure an issue doesn’t spiral into a reportable offence. Once again, knowledge is key. Make sure that members of your organisation know how to recognise a breach, and know exactly who to tell. When you discover a breach, the clock is ticking. Not only do you have just 72 hours to report major breaches to the ICO, but the faster you act on a breach, the more effectively you can contain the impact. To avoid compensation pay-outs, a quick and efficient management process is vital.

The GDPR may feel like a hassle to many, but for all the students who entrust their data to educational institutions, following the regulations is integral to their wellbeing. The University of East Anglia has reminded us just how important data protection is. Effective management, good training, and staff awareness are all incredibly important.

Last week’s post briefly touched on how technological advances are providing new data protection challenges. Earlier this month, the 2020 Consumer Electronics Show (CES) showed the world the new smart devices we can expect to see filtering through the markets soon.

There were many companion robots on show. For instance, Bellabot (a robot cat waiter that purrs if you stroke its ears), and Lovot (a cuddly robot dedicated to imitating the behaviour of a living being). However, it is the ‘smart home’ developments seen at CES that set off the data protection alarm bells.

Take PantryOn’s smart shelves. A diligent individual with a bag of receipts could potentially track your purchase history, but it would be a lot of work. With automated shopping lists that update when your pantry is looking empty, all your purchasing behaviour is neatly held in one place. This just adds to the vast array of personal data that is only one breach away from being public knowledge.

The details of our grocery shopping are probably not high-risk personal data, but the data processed by Yukai’s Bocco robots is more worrying. Marketed as both a children’s companion and smart device, the Bocco robot can send and receive messages, and pair with sensors to share when a family member arrives home, what the weather is doing, and whether the front door has been closed correctly. The Bot also gets ‘excited’ when a family member’s birthday is nearing.

Personal details, correspondence, dates of birth, location data and access to smart devices. Yukai’s Bocco Bot can create quite a detailed picture of your habits. If this data set is compromised, it might offer an uncomfortably close view of your family life to the rest of the world.

What this means for education:

It’s unlikely your budget will have space for personal robot companions, but smart technology is already in the classroom. Lecture recording products such as Echo 360 and Panopto are regularly found in universities. Schools are using video-based CPD software to help with their teacher training. The question of who owns the personal data these videos represent has already created issues where teaching staff have wanted to take materials from one school to another.

Artificial intelligence was previously a science-fiction fantasy. Now it is being put into classroom management software, suggesting the best seating plan based on recorded information about students’ strengths, weaknesses and behaviour.

These developments in technology increase the volume of personal data being processed. They also bring in the potential for automated decision-making. There are concerns about the systems being compromised or unavailable, but the largest issue is the increased complexity of subject access requests. Footage or audio recordings made by staff will hold personal data about students, and this data is time-consuming to extract and redact.

What it means for you:

We need to reach back briefly to the fundamentals of data protection. Personal data can be processed providing there is a lawful basis and that only data strictly necessary for the task is processed. Much new technology is designed to support teaching and learning by processing personal data. You may have to demonstrate that using that personal data is worth it for the educational benefit.

The key tool here is the Data Protection Impact Assessment (DPIA). The DPIA is designed to evaluate the risks to personal data and the benefits of a new initiative. It also looks at how the risks from an initiative can be mitigated. The process requires input from many sources and is one of the tasks that your data protection officer (DPO) is required to contribute to.

It is possible that the outcome of a DPIA will be that your proposed project does not go forward. If the project puts personal data at risk, and provides little benefit, it might be best to stick with your current method. However, that is an uncommon outcome. What you will have is the confidence that as the smart classroom develops, your data protection will develop with it.

We don’t usually comment on cybersecurity stories, but the breaking news of the issues at Travelex (as reported by the BBC) made me think about the potential loss of access to critical information in an educational setting.

From the information available, ransomware has been placed in the Travelex system, forcing the company to shut down its online currency exchange service and revert to pen and paper across all of its sites.

Ransomware attacks form a relatively small percentage of breaches reported to the ICO (39 in the second quarter of 2019/20), but the education sector accounts for almost 20% of those reports.

Security provider Palo Alto, in its Cyberpedia resource, lists the three most common attack methods for ransomware as visiting compromised websites, malicious email attachments and malicious email links. Many of you will have received one of these emails, which have become increasingly credible in recent months.

If you do get caught by an attack there are various companies who say that they can help you recover without paying the ransom. However, it’s not uncommon for people to pay what’s demanded. 

Ransomware in the Education Setting

In a school situation, a locked-up machine may restrict much more than access to your favourite games and websites. Many people hold large amounts of data on local drives, much of it including personal data and sometimes of a highly confidential nature. More recent attacks have also been shown to steal data as well as encrypt the files on the machine. Even if the ransom is paid, the data may have already been retrieved. Imagine all that information about staff or students being published online for anyone to see. 

From a broader data protection perspective, the first question to ask is: Have you employed data protection by design and default? We know that data on a PC is vulnerable to this type of attack. If it’s personal data, what steps can be taken to remove the risk? 

How to Protect Against Availability Breaches

Moving data off the local machine and onto some form of network storage is the obvious answer. Whether that’s on a local server or in the cloud, moving away from local storage provides an additional level of protection if the machine is compromised.

This also solves the problems created by the use of memory sticks, an area of reportable breaches where education leads the field. If data is available online, then even if you move from location to location, it’s there for you when you need it. 

It’s also worth considering whether some of the personal data should ever be leaving core systems. Most people download information because they want to manipulate it, but often this analysis can be done in the system itself.  

Everyone in an organisation needs to be aware of the risk of clicking on links and attachments if they’re not sure of the origin. A golden rule: if you’re not expecting an email, think carefully before acting on it. If in doubt, go to the supposed sender (through a different contact route) and see if it was intended.

It’s worth remembering that computers can be lost, stolen or simply break down. It’s not only ransomware that can make the data on them unavailable. Good data protection thinking means that even if you can’t use a particular device, critical data should always be available to the people who need it. 

The UK government has not had a fantastic start to the year. The New Year’s honours list, a list of individuals receiving awards on New Year’s Day, was mistakenly posted with personal contact details of over a thousand people. While the document was only available for around an hour, many notable—and often controversial—figures had their full addresses listed. The singer Elton John, baker Nadiya Hussain, and former director of public prosecutions Alison Saunders were all included in the breach. Starting the new decade in the Information Commissioner’s Office’s doghouse, the Government is already playing catch-up.

However, it won’t be the only one. Without strong data protection policies and practices, breaches are inevitable. So, while diets and fitness plans may have already bitten the dust, building a strong framework for GDPR compliance should be a New Year’s resolution that lasts.

Resolution 1: Perfect your data mapping

Data Mapping has been a curse for administrative staff across the EU. Yet, the benefits of keeping accurate records could not be clearer. Data mapping is a requirement within the GDPR, but it also comes in handy in the event of a Subject Access Request or breach. For instance, if a fire occurred in the archive room, a record of all documents held in the archive room will help with recovery.

Knowing exactly where all your data is held can reduce the strain when a problem occurs. As a New Year’s resolution, precise data mapping is a must-have.

Resolution 2: Learn to Recognise a Breach

The breaches seen most on the news are caused by cyber-attacks or ransomware. Incidents such as the Travelex ransomware debacle often make the headlines. However, breaches caused by human error are much more likely, and usually a lot harder to spot. Learning to recognise possible breaches quickly means you can manage and mitigate them before they cause a problem. Issues such as missing files or incorrect information are often ignored, and often left unreported by staff due to a fear of being reprimanded. Without a positive culture around data protection, it’s likely you’ll end up dealing with more serious consequences from breaches.

When moving your organisation forward this year, encouraging an atmosphere where staff feel able to speak up should be a priority.

Resolution 3: Plan. Plan. Plan.

When an organisation discovers a breach, a ticking timer starts. If a breach is serious and needs to be reported to the Information Commissioner’s Office (ICO), it must be reported within 72 hours of discovery. This includes weekends and bank holidays. When a breach occurs, it is vital to have a tried and tested plan in place.
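Because the 72-hour window runs on the calendar rather than working days, the deadline arithmetic is simple. A minimal Python sketch (the function names are ours, purely illustrative):

```python
from datetime import datetime, timedelta

# The GDPR's 72-hour clock runs from discovery and includes weekends
# and bank holidays, so plain calendar arithmetic is enough.
REPORTING_WINDOW = timedelta(hours=72)

def ico_deadline(discovered_at: datetime) -> datetime:
    """Latest moment a reportable breach must reach the ICO."""
    return discovered_at + REPORTING_WINDOW

def hours_remaining(discovered_at: datetime, now: datetime) -> float:
    """Hours left on the clock; negative means the deadline has passed."""
    return (ico_deadline(discovered_at) - now).total_seconds() / 3600
```

A breach discovered on a Friday evening must therefore be reported by Monday evening — the weekend does not stop the clock, which is exactly why a tried and tested plan matters.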

This year, make sure all your breaches are managed smoothly. Establish clear steps to report the breach internally, gather detailed information, and judge whether the breach is serious enough to be reported to the ICO. Ideas such as a specified email address for reporting breaches, and a designated team for managing them, can help your organisation have a stress-free 2020.

Resolution 4: Data Protection by Design

The final New Year’s resolution is about forward thinking. We must think about protecting personal data in the future, as well as right now. Advances in technology mean that organisations collect more and more personal data as we go about our days. For instance, Samsung’s Ballie bot debuted this week: a tennis-ball-sized robot that follows its owner around, capturing ‘special moments’ via camera and assisting with personal fitness and household chores. While most organisations won’t be using miniature robots any time soon, new processes and technology can be expected over the 2020s.

As we move into a new age, we must strive to perfect our GDPR compliance in the present, and design our advances with the protection of personal data at the forefront of our minds.

The time has come. Tinsel is up, chestnuts are roasting, and Santa is preparing his “Naughty or Nice list”. However, in this time of tradition, should we be thinking of the new data protection laws? Is St Nick in breach of the GDPR?

Well, he might be. Having a list of all the boys and girls is not as simple as it used to be. Many schools have ended up in a pickle when parents asked for a class list to help their children write Christmas cards. It seems a pointless worry; kids know all their classmates already, but in terms of the GDPR, a school should not provide that personal data without consent. Not only would it breach the GDPR, but confirming a child attends a certain school could put that child’s safety at risk. Either way, if you’re asked for a list of names in a class, the answer will likely be no.

So, what about Santa? We know Santa has a list of every child in the world! In fact, as he sees when you are sleeping, and he knows when you’re awake, Santa is collecting personal data all the time. How do we know he’s holding our data responsibly? When he gives our wish lists to his elves, is he breaching our personal data?

That really depends on whether Santa is in scope of the GDPR. If Santa is an individual, who provides presents out of the goodness of his heart, then he doesn’t need to worry about our new data protection laws. However, if he and his helpers are an organisation, compliance should be at the top of his to-do list. He really should ask permission from us all to judge our worthiness for more than a lump of coal.

Unless of course, Santa is a public body. If we signed Santa’s role into law, Santa could perform his task for the good of the people. Much like schools, Santa would need a Data Protection Officer, regardless of the number of elves in his employ. While he could try to justify his surveillance as a public task, he’d still need to record his data mapping. In fact, along with your wishlist, you could send a Subject Access Request up to the North Pole.

As for his list of all the girls and boys – if our “naughty or nice” rating is processed automatically, he should have explicit consent to do so! Even if Santa makes all the decisions himself, a list of the name and address of every child in the world sounds like a massive breach risk.

However, as with everything in the GDPR, pragmatism is key. Similar to education institutions, it’s important to find a compromise where compliance is met, and the organisation can still run. In terms of Christmas cards and class lists, you can encourage children to write their own lists. There’s nothing wrong with an individual looking around and writing down the names of the people in the room.

Where Santa is concerned, he appears to be an ageless and all-knowing creature with the ability to travel faster than light. We can probably trust him with our personal data. Although, if Santa is keen on staying compliant, keep an eye out for a privacy notice flying down the chimney soon.

A report published last week by career-focused social network LinkedIn identified the “Emerging Jobs” of 2020 in the UK. The report, which can be found here, looks at the roles that are experiencing significant growth.

At number one is “Artificial Intelligence Specialist”, confirming that this field is expanding out of the academic realm and into the mainstream. At number two, having experienced a 24-fold increase since 2015, is the role of Data Protection Officer.

Of course, one of the subjects exercising the minds of these new Data Protection Officers is the work of Artificial Intelligence Specialists. In the Education sector, in addition to being a subject of research and development, there is a great deal of interest in whether AI can lead a revolution in tailored teaching and learning.

Last year, alongside the GDPR, the EU produced a well-considered but concerning report called The Impact of Artificial Intelligence on Learning, Teaching, and Education. The report concluded there was great potential for AI in education, but that potential brought with it some significant risks.

We’re not talking about machines with human-like intelligence just yet. Instead, we’re seeing the application of machine learning, grounded in the massive amount of data generated on a continuous basis. By 2025 the amount of data generated globally is estimated to reach 175 zettabytes – enough that at standard internet speeds it would take nearly 2 billion years to download it all! (IDC, 2018).
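That “nearly 2 billion years” claim is easy to sanity-check. Assuming a “standard” connection of 25 Mbit/s (our assumption — the report doesn’t define the speed), a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope check of the "nearly 2 billion years" claim.
ZETTABYTE = 10**21                     # bytes
total_bits = 175 * ZETTABYTE * 8       # 175 ZB expressed in bits
speed_bps = 25 * 10**6                 # assumed 25 Mbit/s broadband line
seconds = total_bits / speed_bps
years = seconds / (365.25 * 24 * 3600)
print(f"~{years / 1e9:.1f} billion years")  # ~1.8 billion years
```

Even with a generous assumed connection speed, the order of magnitude holds up.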

There are some questions about this AI approach. Firstly, the principle of machine learning is that the algorithms are based on historical data. This means that the criteria for measuring success may be based on data with inherent bias. It’s also true that the past is not necessarily a good guide to the future. Such techniques don’t cope well with creativity and innovation.

But perhaps a greater concern is the fact that AI driven systems will find employment not just in teaching but in assessment. ‘Big Data’ driven machine learning can create highly detailed categories of outcomes and place people into them on the basis of their behaviour. The scope of data used in these algorithms would also be much larger than current techniques. There are suggestions that in addition to student responses, video footage could also be used to judge their level of engagement and behaviour.

What can clearly be seen at the end of this process, is the possibility of classroom technology that would manage both delivery of material and assessments of progress. It’s easy to see how this would be attractive across many settings, including the growing world of online learning.

How this Links to Data Protection

Coming back to the number two emerging role, the Data Protection Officer, these types of AI proposals create some major challenges. The Information Commissioner’s Office has an open consultation on the use of AI in processes that make decisions about individuals. One significant question often raised is how you explain to data subjects the decision-making process of your AI-driven system.

Although the processes are based on algorithms, the sheer number of factors taken into account mean it can be hard for anyone, other than the AI specialists, to understand how the decisions are made. As a data controller, if you’re using such a system, you need to be able to explain how it works in a straightforward enough way for your data subjects to understand.

There are suggestions that data about a student’s attendance, achievement, behaviour and progress could also be analysed to provide clues that there may be other issues in their lives affecting their ability to learn. This could include problems at home or other personal issues. While this could help identify the need for lifesaving support and prevention methods, there could be significant ramifications from a data protection standpoint.

As we look forward to a new decade, it’s essential we recognise the full implications of data-driven education technology.

The main vehicle for this is the data protection impact assessment (DPIA). Alongside the standard considerations of the way that data is transferred, evaluated and managed, the DPIA needs to take a broader view of the initiative.

It’s worth remembering that just because a system can perform a particular function, it doesn’t mean that you must use that capability. Decisions made about the path any student should follow can have life-changing consequences. The prospect of these decisions being made in part by algorithms that are difficult to explain should give us all some pause for thought.

As AI begins to appear in the mainstream, it is pleasing to see a rise in Data Protection Officers. So, we welcome the top two emerging roles and can see a long future of discussion and debate about pragmatic use of AI in education.

While the Christmas holidays are tantalisingly close, many schools are struggling with the norovirus outbreak that is sweeping across the country. It got us thinking about the way that winter can leave us feeling washed out, both physically and mentally, and how that could have an impact on more than just the mood at work.

Short winter days leave us sleepy, often hungry and lower in mood – and that’s not even considering those individuals who suffer from the more serious Seasonal Affective Disorder. The causes of these issues are poorly understood, but in general, our bodies expect to sleep when it’s dark and be active when it’s light. Our fixed working patterns don’t allow us to accommodate this type of change.

You might ask, how does this relate to data protection? We know that how people feel has an impact on the tasks they must perform. Tiredness tends to lead to errors in judgement which, when dealing with personal data, can lead to data breaches.

Add to this the many activities that everyone is trying to get done before the Christmas break and you have a recipe for mistakes. We talked about this in our white paper Taking Control of Data Breaches.

Given that it's probably fairly difficult to relocate your workplace somewhere closer to the equator (where day lengths are more even), what can be done to avoid the data protection winter blues?

  1. Recognition: We’re now close to the shortest day, but did you know that the third Monday of January is considered by some to be the most depressing day of the year? If this is true, then we need to accept that December and January may represent periods of higher risk. Recognising risk is critical to mitigating it.
  2. Resourcing: It’s unlikely that you’ll be able to put off tasks until the spring, so you may need to consider how to ensure they are completed without issue. Simple steps like having a second pair of eyes checking before a large group email is sent or moving to envelopes with windows to avoid mismatched labels can make a huge difference, but you may have to prioritise.
  3. Rest: Once the break arrives, if you can, take the chance to switch off for a while. We’ll be thinking about data protection after the turkey, but hopefully you won’t have to.
  4. Resolutions: With the new year, it’s time to remind people of their responsibilities. But rather than simply presenting to people, get them to consider what would ensure that all of your personal data is managed according to plan.
  5. Remember, YOU are important too! Self-care is important for all of us. However, according to a recent report by Education Support, 75% of all education staff have faced physical or mental health issues within the past two years due to their work. So taking some time out for yourself is vital to keeping your spirits up. Some things that help with self-care include listening to music, waking 15 minutes before you need to, and practising breathing techniques.

We can’t say whether any of these will guarantee you don’t make mistakes, but maybe you can have some fun finding out.

When you talk about data protection all day, every day, it’s easy to assume that everyone else does the same. Some of the terms and names used when referring to the new GDPR are not as clearly defined as they could be. So, this week we are looking at the role of a ‘data processor’ and answering some of the frequently asked questions about this role and its relationship to a ‘data controller’.

What’s a data processor?

Imagine for a moment that you’re running a school. You have all sorts of personal data about individuals including the pupils, their parents and your staff. You’re in charge of collecting, storing, checking, manipulating and outputting all that personal data. You’re also in charge of ensuring that the personal data is properly protected. Having these responsibilities earns you the title of ‘data controller’.

All the things that you can do with the personal data are bundled up and called ‘processing’. This means that whatever you do with personal data, even just storing it, is considered ‘processing’.

You can manage all this processing yourself or you can choose to ask another organisation to undertake some of this processing on your behalf. You provide the personal data and clear instructions of what you want the third party to do with the data. If the third party is processing your personal data based only on your instructions, then it earns the title of ‘data processor’. It can be helpful to think about data processors as subcontractors for jobs that you would otherwise do yourself.

Can I be a data controller and a data processor? 

The simple answer is yes. In fact, it’s very likely that most data processors will be data controllers at the same time. The data processor is likely to have personal data about its own staff and customers and it will decide how that data is processed. This makes it a data controller.

If you’re a data controller, it doesn’t follow that you’ll be a data processor. That only comes if you’re providing a service to third parties. It is worth noting that if you’re running a Multi-Academy Trust, where the trust processes data on behalf of its own academies, it is not considered a “data processor”.

Are my staff data processors? 

This is a question we’ve been asked quite a few times recently. Your staff are processing data based on your instructions. However, your own staff are not considered to be third parties in the legal sense, so any processing they do is part of the operation of the data controller.

If you use staff with whom you don’t have a direct contract of employment (or an equivalent agreement), for example agency staff who are paid by the agency, then the agency is acting as a data processor.

If a data processor has a data breach, what happens?

You’ve chosen to sub-contract some of your data processing to a third party, but it’s still your data that’s being processed and it’s your responsibility. If the data processor experiences a data breach, then the risk is to your data subjects.

So, if it does happen, the data processor tells you what occurred and gives you the support needed to manage the impacts of the breach on your data subjects. If you must report the breach to the ICO, the data processor must give you the information you need to make that report.

While you might have a commercial agreement in place with the data processor, as far as the ICO is concerned, you have a responsibility to ensure you choose a supplier with appropriate data protection measures in place. If those measures fail, then it’s your choice of supplier that’s in question.

If I run a system on my site, why can the supplier still be a data processor? 

If the supplier of a piece of software can access the software in operation such that personal data is accessible, then they are classed as a “data processor”. The most frequent way this happens is through the provision of support.

Whether an agent comes to your site or connects online, they can usually see the data in the system. That is more than enough for the support agent to discover personal data about your data subjects, which could then be disclosed inappropriately.

If you get a support contract with this type of assistance you need to ensure you have the right terms in your agreement.

What do I need to do to protect my data subjects? 

If you’ve decided to use a data processor then you need to:

  • Have a written agreement that sets out how the data processor will deal with your data and confirms that it will comply with the GDPR.
  • Satisfy yourself that the data processor can deliver the safeguards that it is claiming to have.
  • Include the existence of the data processor in your data map.

 

I hope this has given you some clarification on the difference between a data controller and a data processor, and the responsibilities each hold regarding the GDPR.

These are just a few of the questions about data processors and data controllers and their responsibilities under the GDPR. If you’d like us to take a look at another area of the regulation, let us know by getting in touch here.