Do Students Dream of Electric Teachers?
A point of view
Last week’s post briefly touched on how technological advances are providing new data protection challenges. Earlier this month, the 2020 Consumer Electronics Show (CES) showed the world the new smart devices we can expect to see filtering through the markets soon.
There were many companion robots on show. For instance, Bellabot (a robot cat waiter that purrs if you stroke its ears), and Lovot (a cuddly robot dedicated to imitating the behaviour of a living being). However, it is the ‘smart home’ developments seen at CES that set off the data protection alarm bells.
Take PantryOn’s smart shelves. A diligent individual with a bag of receipts could potentially track your purchase history, but it would be a lot of work. With automated shopping lists that update when your pantry is looking empty, all your purchasing behaviour is neatly held in one place. This just adds to the vast array of personal data that is only one breach away from being public knowledge.
The details of our grocery shopping are probably not high-risk personal data, but the data processed by Yukai’s Bocco robots is more worrying. Marketed as both a children’s companion and smart device, the Bocco robot can send and receive messages, and pair with sensors to share when a family member arrives home, what the weather is doing, and whether the front door has been closed correctly. The Bot also gets ‘excited’ when a family member’s birthday is nearing.
Personal details, correspondence, dates of birth, location data and access to smart devices. Yukai’s Bocco Bot can create quite a detailed picture of your habits. If this data set is compromised, it might offer an uncomfortably close view of your family life to the rest of the world.
What this means for education:
It’s unlikely your budget will have space for personal robot companions, but smart technology is already in the classroom. Lecture recording products such as Echo 360 and PanOpto are regularly found in universities. Schools are using video-based CPD software to help with their teacher training. The question of who owns the personal data these videos represent has already created issues where teaching staff have wanted to take materials from one school to another.
Artificial Intelligence was previously a science fiction fantasy. Now it is being put into classroom management software, suggesting the best seating plan based on recorded information about students’ strengths, weaknesses and behaviour.
These developments in technology increase the volume of personal data being processed. They also bring in the potential for automated decision making. There are concerns about the systems being compromised or unavailable, but the largest issue is the increased complexity of subject access requests. Footage or audio recordings made by staff will hold personal data about students, and this data is time consuming to extract and redact.
What it means for you:
We need to reach back briefly to the fundamentals of data protection. Personal data can be processed provided there is a lawful basis and only the data strictly necessary for the task is processed. Much new technology is designed to support teaching and learning by processing personal data. You may have to demonstrate that using that personal data is worth it for the educational benefit.
The key tool here is the Data Protection Impact Assessment (DPIA). The DPIA is designed to evaluate the risks to personal data and the benefits of a new initiative. It also looks at how the risks from an initiative can be mitigated. The process requires input from many sources and is one of the tasks that your data protection officer (DPO) is required to contribute to.
It is possible that the outcome of a DPIA will be that your proposed project does not go forward. If the project puts personal data at risk, and provides little benefit, it might be best to stick with your current method. However, that is an uncommon outcome. What you will have is the confidence that as the smart classroom develops, your data protection will develop with it.
Your Organisation and the Prisoner of Ransomware
A point of view
We don’t usually comment on cybersecurity stories but the breaking news of the issues at Travelex (as reported by the BBC) made me think about the potential loss of access to critical information in an educational setting.
From the information available, ransomware has been placed in the Travelex system, forcing the company to shut down its online currency exchange service and revert to paper and pen across all of its sites.
Ransomware attacks form a relatively small percentage of breaches reported to the ICO (39 in the second quarter of 2019/20), but the education sector accounts for almost 20% of those reports.
Security provider Palo Alto, in its Cyberpedia resource, lists the three most common attack methods for ransomware as visiting compromised websites, malicious email attachments and malicious email links. Many of you will have received one of these emails, which have become increasingly credible in recent months.
If you do get caught by an attack there are various companies who say that they can help you recover without paying the ransom. However, it’s not uncommon for people to pay what’s demanded.
Ransomware in the Education Setting
In a school situation, a locked-up machine may restrict much more than access to your favourite games and websites. Many people hold large amounts of data on local drives, much of it including personal data and sometimes of a highly confidential nature. More recent attacks have also been shown to steal data as well as encrypt the files on the machine. Even if the ransom is paid, the data may have already been retrieved. Imagine all that information about staff or students being published online for anyone to see.
From a broader data protection perspective, the first question to ask is: Have you employed data protection by design and default? We know that data on a PC is vulnerable to this type of attack. If it’s personal data, what steps can be taken to remove the risk?
How to Protect Against Availability Breaches
Moving data off the local machine and onto some form of network storage is the obvious answer. Whether that’s on a local server or in the cloud, moving away from local storage provides an additional level of protection if the machine is compromised.
This also solves the problems created by the use of memory sticks, an area of reportable breaches where education leads the field. If data is available online, then even if you move from location to location, it’s there for you when you need it.
It’s also worth considering whether some of the personal data should ever be leaving core systems. Most people download information because they want to manipulate it, but often this analysis can be done in the system itself.
Everyone in an organisation needs to be aware of the risk of clicking on links and attachments if they’re not sure of the origin. A golden rule is: if you’re not expecting an email, think carefully before acting on it. If in doubt, go back to the supposed sender (through a different contact route) and check whether it was intended.
It’s worth remembering that computers can be lost, stolen or simply break down. It’s not only ransomware that can make the data on them unavailable. Good data protection thinking means that even if you can’t use a particular device, critical data should always be available to the people who need it.
GDPR New Year’s Resolutions
A point of view
The UK government has not had a fantastic start to the year. The New Year’s honours list, a list of individuals receiving awards on New Year’s Day, was mistakenly posted with the personal contact details of over a thousand people. While the document was only available for around an hour, many notable—and often controversial—figures had their full addresses listed. The singer Elton John, baker Nadiya Hussain, and former director of public prosecutions Alison Saunders were all included in the breach. Starting the new decade in the Information Commissioner’s Office’s doghouse, the government is already playing catch-up.
However, it won’t be the only one. Without strong data protection policies and practices, breaches are inevitable. So, while diets and fitness plans may have already bitten the dust, building a strong framework for GDPR compliance should be a New Year’s resolution that lasts.
Resolution 1: Perfect your data mapping
Data Mapping has been a curse for administrative staff across the EU. Yet, the benefits of keeping accurate records could not be clearer. Data mapping is a requirement within the GDPR, but it also comes in handy in the event of a Subject Access Request or breach. For instance, if a fire occurred in the archive room, a record of all documents held in the archive room will help with recovery.
Knowing exactly where all your data is held can reduce the strain when a problem occurs. As a New Year’s resolution, precise data mapping is a must-have.
Resolution 2: Learn to Recognise a Breach
The breaches seen most on the news are caused by cyber-attacks or ransomware. Incidents such as the Travelex ransomware debacle often make the headlines. However, breaches caused by human error are much more likely, and usually a lot harder to spot. Learning to recognise possible breaches quickly means you can manage and mitigate them before they cause a problem. Issues such as missing files or incorrect information are often ignored, and often left unreported by staff due to a fear of being reprimanded. Without a positive culture around data protection, it’s likely you’ll end up dealing with more serious consequences from breaches.
When moving your organisation forward this year, encouraging an atmosphere where staff feel able to speak up should be a priority.
Resolution 3: Plan. Plan. Plan.
When an organisation discovers a breach, a ticking timer starts. If a breach is serious and needs to be reported to the Information Commissioner’s Office (ICO), it must be reported within 72 hours of discovery. This includes weekends and bank holidays. When a breach occurs, it is vital to have a tried and tested plan in place.
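As a quick illustration of why the deadline catches organisations out, the 72 hours are measured in elapsed clock time from discovery, not working days (the Friday-afternoon timestamp below is just an example):

```python
# The 72-hour window is clock time from discovery, so weekends
# and bank holidays are not excluded.
from datetime import datetime, timedelta

discovered = datetime(2020, 1, 10, 16, 30)  # a Friday afternoon (example)
deadline = discovered + timedelta(hours=72)
print(deadline.strftime("%A %d %B, %H:%M"))  # Monday 13 January, 16:30
```

A breach found late on a Friday must therefore be assessed, and if necessary reported, by the Monday afternoon — which is exactly why a tried and tested plan matters.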
This year, make sure all your breaches are managed smoothly. Establish clear steps to report the breach internally, gather detailed information, and judge whether the breach is serious enough to be reported to the ICO. Ideas such as a specified email address for reporting breaches, and a designated team for managing them, can help your organisation have a stress-free 2020.
Resolution 4: Data Protection by Design
The final New Year’s resolution is about forward thinking. We must think about protecting personal data in the future, as well as right now. Advances in technology mean that organisations collect more and more personal data as we go about our days. For instance, Samsung’s Ballie Bot debuted this week: a tennis-ball-sized robot that follows its owner around, captures ‘special moments’ via camera, and assists with personal fitness and household chores. While most organisations won’t be using miniature robots any time soon, new processes and technology can be expected over the 2020s.
As we move into a new age, we must strive to perfect our GDPR compliance in the present, and design our advances with the protection of personal data at the forefront of our minds.
Has GDPR Stolen Christmas?
A point of view
The time has come. Tinsel is up, chestnuts are roasting, and Santa is preparing his “Naughty or Nice list”. However, in this time of tradition, should we be thinking of the new data protection laws? Is St Nick in breach of the GDPR?
Well, he might be. Having a list of all the boys and girls is not as simple as it used to be. Many schools have ended up in a pickle when parents asked for a class list to help their children write Christmas cards. It seems a pointless worry; kids know all their classmates already, but in terms of the GDPR, a school should not provide that personal data without consent. Not only would it breach the GDPR, but confirming a child attends a certain school could put that child’s safety at risk. Either way, if you’re asked for a list of names in a class, the answer will likely be no.
So, what about Santa? We know Santa has a list of every child in the world! In fact, as he sees when you are sleeping, and he knows when you’re awake, Santa is collecting personal data all the time. How do we know he’s holding our data responsibly? When he gives our wish lists to his elves, is he breaching our personal data?
That really depends on whether Santa is in scope of the GDPR. If Santa is an individual, who provides presents out of the goodness of his heart, then he doesn’t need to worry about our new data protection laws. However, if he and his helpers are an organisation, compliance should be at the top of his to-do list. He really should ask permission from us all to judge our worthiness for more than a lump of coal.
Unless of course, Santa is a public body. If we signed Santa’s role into law, Santa could perform his task for the good of the people. Much like schools, Santa would need a Data Protection Officer, regardless of the number of elves in his employ. While he could try to justify his surveillance as a public task, he’d still need to record his data mapping. In fact, along with your wishlist, you could send a Subject Access Request up to the North Pole.
As for his list of all the girls and boys – if our “naughty or nice” rating is processed automatically, he should have explicit consent to do so! Even if Santa makes all the decisions himself, a list of the name and address of every child in the world sounds like a massive breach risk.
However, as with everything in the GDPR, pragmatism is key. Similar to education institutions, it’s important to find a compromise where compliance is met, and the organisation can still run. In terms of Christmas cards and class lists, you can encourage children to write their own lists. There’s nothing wrong with an individual looking around and writing down the names of the people in the room.
Where Santa is concerned, he appears to be an ageless and all-knowing creature with the ability to travel faster than light. We can probably trust him with our personal data. Although, if Santa is keen on staying compliant, keep an eye out for a privacy notice flying down the chimney soon.
AI in Education: A Brave New World?
A point of view
A report published last week by career-focused social network LinkedIn identified the “Emerging Jobs” of 2020 in the UK. The report, which can be found here, looks at the roles that are experiencing significant growth.
At number one is “Artificial Intelligence Specialist”, confirming that this field is expanding out of the academic realm and into the mainstream. At number two, having experienced a 24-fold increase since 2015, is the role of Data Protection Officer.
Of course, one of the subjects exercising the minds of these new Data Protection Officers is the work of Artificial Intelligence Specialists. In the Education sector, in addition to being a subject of research and development, there is a great deal of interest in whether AI can lead a revolution in tailored teaching and learning.
Last year, alongside the GDPR, the EU produced a well-considered but concerning report called The Impact of Artificial Intelligence on Learning, Teaching, and Education. The report concluded there was great potential for AI in education, but that potential brought with it some significant risks.
We’re not talking about machines with human-like intelligence just yet. Instead, we’re talking about the application of machine learning, grounded in the massive amount of data generated on a continuous basis. By 2025 the amount of data generated globally is estimated to reach 175 zettabytes – enough that at standard internet speeds it would take nearly 2 billion years to download it all! (IDC, 2018).
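The arithmetic behind that headline figure roughly checks out. As a back-of-the-envelope sketch (the 25 Mbps “standard” speed is our assumption; the IDC report does not specify one):

```python
# Back-of-the-envelope check of the "nearly 2 billion years" claim.
ZETTABYTE = 10 ** 21                  # bytes
total_bits = 175 * ZETTABYTE * 8      # 175 ZB expressed in bits
speed_bps = 25 * 10 ** 6              # assumed 25 Mbps download speed
years = total_bits / speed_bps / (365.25 * 24 * 3600)
print(f"{years / 1e9:.1f} billion years")  # about 1.8 billion years
```

At higher connection speeds the figure shrinks proportionally, but the point stands: the volume is far beyond anything a human could review.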
There are some questions about this AI approach. Firstly, the principle of machine learning is that the algorithms are based on historical data. This means that the criteria for measuring success may be based on data with inherent bias. It’s also true that the past is not necessarily a good guide to the future. Such techniques don’t cope well with creativity and innovation.
But perhaps a greater concern is the fact that AI-driven systems will find employment not just in teaching but in assessment. ‘Big Data’ driven machine learning can create highly detailed categories of outcomes and place people into them on the basis of their behaviour. The scope of data used in these algorithms would also be much larger than with current techniques. There are suggestions that, in addition to student responses, video footage could also be used to judge students’ level of engagement and behaviour.
What can clearly be seen at the end of this process is the possibility of classroom technology that would manage both delivery of material and assessment of progress. It’s easy to see how this would be attractive across many settings, including the growing world of online learning.
How this Links to Data Protection
Coming back to the number two emerging role, the Data Protection Officer, these types of AI proposals create some major challenges. The Information Commissioner’s Office has an open consultation on the use of AI in processes that make decisions about individuals. One significant question often raised is how you explain to data subjects the decision-making process of your AI-driven system.
Although the processes are based on algorithms, the sheer number of factors taken into account means it can be hard for anyone other than the AI specialists to understand how the decisions are made. As a data controller, if you’re using such a system, you need to be able to explain how it works in a straightforward enough way for your data subjects to understand.
There are suggestions that data about a student’s attendance, achievement, behaviour and progress could also be analysed to provide clues that there may be other issues in their lives affecting their ability to learn. This could include problems at home or other personal issues. While this could help identify the need for life-saving support and prevention methods, there could be significant ramifications from a data protection standpoint.
As we look forward to a new decade, it’s essential we recognise the full implications of data-driven education technology.
The main vehicle for this is the data protection impact assessment (DPIA). Alongside the standard considerations of the way that data is transferred, evaluated and managed, the DPIA needs to take a broader view of the initiative.
It’s worth remembering that just because a system can perform a particular function, it doesn’t mean that you must use that capability. Decisions made about the path any student should follow can have life-changing consequences. The prospect of these decisions being made in part by algorithms that are difficult to explain should give us all some pause for thought.
As AI begins to appear in the mainstream, it is pleasing to see a rise in Data Protection Officers. So, we welcome the top two emerging roles and can see a long future of discussion and debate about pragmatic use of AI in education.
How to avoid the winter blues…
A point of view
While the Christmas holidays are tantalisingly close, many schools are struggling with the norovirus outbreak that is sweeping across the country. It got us thinking about the way that winter can leave us feeling washed out, both physically and mentally, and how that could have an impact on more than just the mood at work.
Short winter days leave us sleepy, often hungry and lower in mood – and that’s not even considering those individuals who suffer from the more serious Seasonal Affective Disorder. The causes of these issues are poorly understood, but in general, our bodies expect to sleep when it’s dark and be active when it’s light. Our fixed working patterns don’t allow us to accommodate this type of change.
You might ask, how does this relate to data protection? We know that how people feel has an impact on the tasks they must perform. Tiredness tends to lead to errors in judgement which, when dealing with personal data, can lead to data breaches.
Add to this the many activities that everyone is trying to get done before the Christmas break and you have a recipe for mistakes. We talked about this in our white paper Taking Control of Data Breaches.
Given that it’s probably fairly difficult to relocate your workplace to somewhere closer to the equator (with more even day lengths), what can be done to avoid the data protection winter blues?
We can’t say whether any of these will guarantee you don’t make mistakes, but maybe you can have some fun finding out.
Data Controllers and Data Processors; Who’s who?
A point of view
When you talk about data protection all day, every day, it’s easy to assume that everyone else does the same. Some of the terms and names used when referring to the new GDPR are not as clearly defined as they could be. So, this week we are looking at the role of a ‘data processor’ and answering some of the frequently asked questions about this role and its relationship to a ‘data controller’.
What’s a data processor?
Imagine for a moment that you’re running a school. You have all sorts of personal data about individuals including the pupils, their parents and your staff. You’re in charge of collecting, storing, checking, manipulating and outputting all that personal data. You’re also in charge of ensuring that the personal data is properly protected. Having these responsibilities earns you the title of ‘data controller’.
All the things that you can do with the personal data are bundled up and called ‘processing’. This means that whatever you do with personal data, even just storing it, is considered ‘processing’.
You can manage all this processing yourself or you can choose to ask another organisation to undertake some of this processing on your behalf. You provide the personal data and clear instructions of what you want the third party to do with the data. If the third party is processing your personal data based only on your instructions, then it earns the title of ‘data processor’. It can be helpful to think about data processors as subcontractors for jobs that you would otherwise do yourself.
Can I be a data controller and a data processor?
The simple answer is yes. In fact, it’s very likely that most data processors will be data controllers at the same time. The data processor is likely to have personal data about its own staff and customers and it will decide how that data is processed. This makes it a data controller.
If you’re a data controller, it doesn’t follow that you’ll be a data processor. This only comes if you’re providing a service to third parties. It is worth noting that if you’re running a Multi-Academy Trust, where you provide processing of data for the academies you are not considered a “data processor”.
Are my staff data processors?
This is a question we’ve been asked quite a few times recently. Your staff are processing data based on your instructions. However, your own staff are not considered to be third parties in the legal sense, so any processing they do is part of the operation of the data controller.
If you use staff that you don’t have a direct contract of employment (or an equivalent agreement) with, for example agency staff who are paid by the agency, then the agency is acting as a data processor.
If a data processor has a data breach, what happens?
You’ve chosen to sub-contract some of your data processing to a third party, but it’s still your data that’s being processed and it’s your responsibility. If the data processor experiences a data breach, then the risk is to your data subjects.
So, if it does happen, the data processor tells you what occurred and gives you the support needed to manage the impacts of the breach on your data subjects. If you must report the breach to the ICO, the data processor must give you the information you need to make that report.
While you might have a commercial agreement in place with the data processor, as far as the ICO is concerned, you have a responsibility to ensure you choose a supplier with appropriate data protection measures in place. If those measures fail, then it’s your choice of supplier that’s in question.
If I run a system on my site, why can the supplier still be a data processor?
If the supplier of a piece of software can access the software in operation such that personal data is accessible, then they are classed as a “data processor”. The most frequent way this happens is through the provision of support.
Whether an agent comes to your site or connects online, they can usually see the data in the system. This is more than enough for the support agent to discover personal data about your data subjects, which may get disclosed inappropriately.
If you have a support contract with this type of assistance, you need to ensure you have the right terms in your agreement.
What do I need to do to protect my data subjects?
If you’ve decided to use a data processor then you need to:
I hope this has given you some clarification on the difference between a data controller and a data processor, and the responsibilities each hold regarding the GDPR.
These are just a few of the questions about data processors and data controllers and their responsibilities under the GDPR. If you’d like us to take a look at another area of the regulation, let us know by getting in touch here.
Careless Talk Causes Breaches
A point of view
…(and can be costly too!)
GDPR is not normally associated with parties, but recently I heard the end of a conversation about an office Christmas party and it set me thinking about the impact that a misplaced sentence can have. Friendships and working relationships can be badly damaged, in some cases irreparably.
If I choose to pass on my unvarnished opinion about a colleague during the Christmas bash, then I can find myself in a lot of trouble. If on the other hand, I whisper information that has come from the data controller then not only am I in hot water, but I’ve also given the extra present of a data breach.
Paragraph 4, Article 32 of the GDPR says:
“The controller and processor shall take steps to ensure that any natural person acting under the authority of the controller or the processor who has access to personal data does not process them except on instructions from the controller, unless he or she is required to do so by Union or Member State law.”
Put more simply, you must ensure that people are given clear guidance about what they can and can’t do with personal data and you must ensure they stick to those rules.
Bear in mind that it doesn’t matter how information is disclosed for it to be a breach. Whether you’ve been hacked, sent an email to the wrong person, lost a paper file or repeated information to someone who shouldn’t know it, a breach has occurred.
With verbal disclosure the situation is often made worse by the fact that our natural desire is to share more ‘interesting’ information, which is also usually more confidential and leads to greater upset.
We’ve seen examples where incidents have been dealt with from a disciplinary standpoint but have gone unrecognised as a data breach. Obviously, if you then need to report the breach to the ICO, you’ll have to explain why you missed the 72-hour deadline for reporting. It is difficult to claim a sound data protection regime when you have missed this high-profile target.
What steps should you take to avoid these issues?
Training
Easy reporting
A response procedure
Joined up processes
So, as you contemplate the upcoming festivities, it may be worth a timely reminder to everyone that we have to consider what we’re saying just as much as what goes into an email.
How the Department for Education ran into trouble with the ICO
A point of view…Being as clear as mud when it comes to Data Protection
A key principle of data protection is transparency. You must be upfront about what you plan to do with personal data.
A failure to be transparent has recently brought the Department for Education into the Information Commissioner’s Office’s sights. Information from the annual census returns was apparently shared with Immigration Officials without the data subjects being informed, which is the opposite of what you should be doing.
According to an article in the Guardian, the ICO’s position is that:
It’s not clear yet what the consequences for the DfE will be from these findings.
Just a few days before the news about the DfE broke, the Information Commissioner felt compelled to ensure that political parties understood their data protection responsibilities as we head towards the election in December.
In addition to telling the parties that they needed to follow the principles of data protection, she also specifically addressed the controversial issue from the Brexit Referendum and subsequent elections – advertising on social media. You can read the Commissioner’s full statement here.
These concerns are about transparency. How do you know what someone is doing with your personal data, and how might that usage affect your rights and freedoms? The GDPR is very clear about the information that must be provided when your personal data is used. It’s not always clear how well individuals understand the information presented to them, if they are given any at all.
The principle of transparency doesn’t just apply to political parties and government departments. It should be the cornerstone of the data protection policies and practices for every organisation.
So, what does Transparency mean for an organisation in the Education sector?
How can you demonstrate that you’re meeting these requirements?
Your data mapping, provided it follows the model set out by the ICO, will address the first three items, and your privacy notices should address the last.
Our experience is that many schools and colleges haven’t mapped their use of personal data to the level of detail that the regulation expects, and this could become a problem if a complaint is raised by a data subject.
You may know in detail how data is collected, stored, updated and shared, but the legislation requires that this is documented. Does your documentation fully cover the movement of personal data around the organisation?
Do your privacy notices strike the balance of informing individuals about how their data is used while being accessible and unambiguous?
While you try to separate the fake news from the real around the election, it may be a good time to ensure that you’re being properly open about the way you collect and use personal data.
If you are unsure how you process data, or would like some guidance on how to document this, please contact our GDPR experts on 0113 804 2035 or click here.
GDPR Problems Most People Are Facing
A point of view…Are you facing the same GDPR problems as most?
We asked a number of Data Protection Officers and GDPR Leads in Education about the most common GDPR problems they come across. Interestingly, most of the answers were the same, so we thought we would put together a list of GDPR problems and some ways to resolve them.
1. Not Enough Time
Yes, we all struggle with it. Often a DPO also holds one or more other roles within the school; School Business Managers and members of the Senior Leadership Team are often the first port of call when seeking a new Data Protection Officer. These roles are chosen because the DPO needs to be a senior manager with a good all-round understanding of the school. However, Business Managers have to co-ordinate much of the non-teaching activity, and members of the SLT have to combine teaching, line management and other development projects.
We can’t make extra time (if only!), but where we can help is to ensure that you focus on the highest-priority items. Whether it’s through our GDPR training, getting visibility through the Sentry System, or even getting us to take on some of the load, we can help you deal with GDPR more efficiently.
2. Lack of GDPR Knowledge
It is no surprise that many DPOs don’t understand the full complexity of the GDPR’s requirements, such as data mapping, retention schedules, when a breach should be reported to the ICO, and the intricacies of a data protection impact assessment. The Data Protection Officer is responsible for advising on the interpretation of the regulation, with all the difficulties that real life can throw at a situation.
Here at GDPR Sentry, we offer GDPR Training, from basic staff awareness training to complex Data Protection Officer training to ensure you have all the knowledge you need for your compliance journey.
3. 24/7 Availability
DPOs have holidays, and rightly so! When on annual leave, a Data Protection Officer is unlikely to want to be available to discuss breaches or the response to a subject access request, and most DPOs would prefer not to be available at weekends and in the evenings. However, if a breach occurs and it needs to be reported to the ICO, this must be done within 72 hours of the breach being discovered.
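As an illustration only (not any official ICO tool), the 72-hour window runs from the moment of discovery, not from the next working day; a minimal Python sketch makes the point that weekends don’t pause the clock:

```python
from datetime import datetime, timedelta

def reporting_deadline(discovered_at: datetime) -> datetime:
    """Latest time a breach can be reported to the ICO:
    72 hours after discovery (GDPR Article 33)."""
    return discovered_at + timedelta(hours=72)

# A breach discovered on a Friday afternoon must be reported
# by Monday afternoon, even though the office is closed all weekend.
discovered = datetime(2019, 12, 6, 16, 30)   # Friday 16:30
print(reporting_deadline(discovered))        # 2019-12-09 16:30:00 (Monday)
```

This is exactly why cover arrangements matter: a Friday-evening breach with the DPO away leaves almost no working time before the deadline.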
We recommend other members of your team are up to date with GDPR, so that in the event of absence, breaches and SARs can be dealt with efficiently and correctly. We can even provide out of hours or holiday cover to support the DPO.
4. Conflict of Interest
In the real world many decisions are driven by calculations of cost and benefit. The Data Protection Officer is expected to always put data protection first, even when this may create higher costs or cause issues for the organisation. For DPOs who are juggling more than one role this can create conflicts of interest where data protection is balanced against other priorities.
The DPO is meant to have no role in decision making about data processing at the same time as being a senior manager. Outside of very large organisations this is almost impossible to achieve. This is most evident during a data protection impact assessment.
We can support the risk assessment within an impact assessment, ensuring that every angle is considered and that you have an objective perspective on your initiative.
5. Getting a Second Opinion
If you are a Data Protection Officer and you come across something you are unsure of, where do you turn? Many DPOs we work with have admitted they feel there is no support: they are expected to be the ‘go to’ person for all things data protection. However, due to a combination of the points above, a newly appointed DPO can’t realistically know everything about such a complex regulation.
If you can relate to any of the above GDPR problems, you are not alone. GDPR Sentry are here to support Data Protection Officers with their role, offering a range of one-off or ongoing support services. To speak to our GDPR Experts, click here.