It has now been over a year since Chinese authorities reported the first case of Covid-19 to the World Health Organisation. This year has brought tragedy for the many people who have lost loved ones. It’s also brought difficulties for all, with restrictions on our daily life that haven’t been seen since the 1940s.

Mental wellbeing and Data Protection?

We’ve had months of lockdown, with daily emergency briefings reinforcing the gloomy outlook. We live under more tiers than a wedding cake, and have seen enough Government U-turns to make a driving instructor pale. Many of us are feeling long-term fatigue, even though our physical activity levels have dropped.

This is because we have all been facing the stress of COVID-19. When we face psychological stress, our bodies enter ‘fight or flight’ mode. Heart rate and blood pressure rise, while non-essential functions like digestion and the immune system are suppressed.

This response is ideal if you’re being chased by a bear. However, with a long-term stressor like COVID-19 you’re on high alert for long periods. This takes a toll on your energy levels and leaves you vulnerable to the normal coughs and colds that are still circulating.

This chronic stress takes its greatest toll on our mental wellbeing. Poor sleep, depression, anxiety, and a loss of concentration are common results. A study by the ONS found that 1 in 5 adults reported depression, compared to 1 in 10 before the pandemic. Symptoms for those with pre-existing conditions have worsened and many have reported feelings of stress and anxiety.

What does this have to do with data protection?

It might seem strange that an opinion piece on GDPR is delving into the pandemic’s impact on mental health. However, poor mental wellbeing might put a strain on your data protection measures.

Several studies have found that sleep loss and fatigue lead to increased risk-taking, slower reaction times, and impaired information processing. Fatigued people are simply more likely to make a mistake. When handling personal data, that mistake can become a data breach.

A higher risk of data breaches is concerning at any time, but it is even more worrying in current circumstances. Schools and colleges are providing education in a very difficult environment.

Just over a month ago, the UK government announced that secondary schools and colleges could set up mass asymptomatic testing of pupils and staff. Since the announcement, there has been a flurry of activity from the DfE, teachers’ unions and the MHRA (Medicines and Healthcare products Regulatory Agency). While education leaders have been calling for testing for a long time, there have been understandable levels of apprehension.

Unlike in the pilot schemes, schools and colleges are to provide their own staff to assist with testing, cleaning, and administration. Of course, in the latest U-turn, the government dropped repeated daily testing for students who had contact with a positive case.

The Secretary of State for Education confirmed that regular mass testing for staff, and eventually for returning pupils, will continue, and it is likely to be extended to all settings. Unlike Test and Trace, schools will be responsible for processing test data themselves. In a totally novel process, schools and colleges will be processing and storing hundreds of items of special category data. That data will be held internally on pre-existing systems not specifically designed for this purpose. Furthermore, schools will need to inform individuals who test positive. It is a veritable nightmare of new data protection risks.

Combine this with everything we’ve said about the increased risk of error, and maybe it makes more sense that a Data Protection company is talking about mental health in the pandemic.

Protecting yourself from Breaches

Unfortunately, there is no quick fix. Each day brings more vaccinations, and with them more hope for normality, but this has always been a marathon rather than a sprint. The best we can do is be aware of the increased risk and put additional safety nets in place. For instance, a quick guide that test administrators can refer to should they become unsure might reduce mistakes in administration, or in contacting individuals.

When writing procedures for test administration, building in checking stages can help as well. Making a double check part of the routine will help staff catch mistakes they might have made on the first run.

Even taking a few minutes to check in on your staff can make a huge difference. Just talking about a worry can reduce anxiety. It also gives you an opportunity to assess where additional support might be needed to prevent data protection problems.

We are all working hard to simply manage our daily lives while the landscape shifts every minute. This fatigue is understandable and normal. We just have to be aware of it, so we can try to minimise mistakes, and save ourselves from even more stress.

It has been a year of chaos. The Oxford English Dictionary usually nominates one word as ‘Word of the Year’. This year, there has been so much change that they couldn’t narrow it down to just one: “Covid-19”, “Lockdown”, “Anti-Maskers”, “Unmute”. Not to forget “Bushfire”, when millions of acres of Australian bushland burnt at the beginning of the year.

The point is, a lot has changed.

However, the more things change, the more they stay the same. Throughout all the madness, GDPR has not gone anywhere. As we reach another festive period, the same questions pop up again. Chiefly, “What about Christmas cards?”

Secondary school teachers, university and college staff, you probably get off lightly here! En masse Christmas card delivery is usually a feature of primary schools. Unfortunately for primary schools, it can be a little complex.

The Christmas Card Conundrum

Children in your class can send Christmas cards. They’re (tiny) individuals, sending cards to other (tiny) individuals. Nothing wrong with spreading a little holiday cheer! However, many teachers get asked for a class list to make the process easier for parents…

…And you can’t give it to them. Giving out a class list is disclosing personal data. The task isn’t essential to running a school, so you’d have to get consent. Co-ordinating consent forms for the parents of hundreds of children? Not fun, and not easy.

So you can’t hand out a list.

“But how can the little boys and girls spread their festive cheer?” you ask. Never fear, there is a solution.

There’s nothing stopping an individual from looking round the classroom and writing down the names of their friends. They’re an individual, not an organisation, and all they’re doing is writing down publicly disclosed information (it’s hard to keep your identity secret when you’re sat in the second row). If you’re working with young children, it might even be an opportunity for some yuletide handwriting practice!

So Christmas isn’t cancelled, and with a little bit of sideways thinking you can still have holiday cards flying around the classroom! We’re not health and safety experts, though, so the flying part is your call.

The EU-US Privacy Shield, a framework designed by the U.S. Department of Commerce and the European Commission, has been struck down by the European Court of Justice (ECJ).

The framework, approved by the EU in 2016, has been at the centre of several international discussions for the last few years. The program allowed companies to sign up and certify that their data protection practices were in line with EU law.

With the introduction of GDPR, the EU-US Privacy Shield acted as strong evidence that a company was holding data in a compliant manner, allowing organisations to justify using American firms and sending data out of the EU to be processed.

However, European data protection law states that data should only be transferred out of the EU if there are appropriate safeguards in place. Due to differing laws on national security and law enforcement, the ECJ has now deemed that US domestic law does not provide appropriate safeguards for personal data, and that “Surveillance programs … [are] not limited to what is strictly necessary”.

Therefore, the agreement between the European Union and the USA has been struck down.

What does this mean?

In practical terms, this ruling could have a substantial impact on education settings. Most schools and colleges use various apps and games to facilitate learning, and many institutions will be using external software to manage HR and finance. Some of these apps and services capture personal data and host it on servers within the US.

Previously, if a school or college wished to use a company hosting data outside the EU, they could check whether the company was signed up to the Privacy Shield. The Shield provided organisations sufficient comfort that their data would be handled in a compliant manner. With that safeguard gone, it will be harder to prove that using a data processor from outside the EU is an appropriate action.

Moving Forward:

This is likely to affect a proportion, but not all, of your organisation’s suppliers. A list of companies signed onto the scheme can be found here, but below are a few examples used in the education setting:

Mailchimp: The automated email company stores all data on servers in the US, including recipient and sender details.

Consilio: A firm supplying legal services and software, Consilio provides hosting, processing and analysis of data, and is hosted on servers in the US.

Egress: Egress provides secure email and file transfer for many organisations, including the NHS and many councils. However, Egress does have servers within the UK.

It’s worth noting that in some cases a supplier may have their main services based in the EU, but sub-processors or performance management functions are based in the US. Zoom is an example of this, but you can relax. They use Standard Contractual Clauses (SCCs) and not the Privacy Shield.

However, Standard Contractual Clauses do not prevent the access the US Government has to data hosted within the country, and the individual who brought this case to the ECJ has now taken aim at SCCs as well, so watch this space.

A review of your suppliers is now in order and you may have to make difficult decisions.

With more than 3,000 cases of the new coronavirus confirmed, Italy has announced that it will be shutting all schools for 10 days, to slow the spread of the disease. With cases beginning to increase in the UK, the possibility of similar action being taken here is also increasing.

Most students, teachers and lecturers are currently operating business as usual, but behind the scenes, admin staff are frantically devising plans for remote learning. As the spread of the coronavirus cannot be easily predicted, institutions need to be prepared to continue the working day from home at a moment’s notice. The difficulties of such an endeavour are greater than you might expect.

An Unexpected Break:

In the event of mass school closure, few primary and secondary schools have the infrastructure to support remote learning. This is less of a worry for Higher Education, as many universities already have lecture streaming and recording facilities in place. They often also use virtual learning sites such as Moodle. Lots of university students already use their personal devices to access work, so while closures wouldn’t be ideal, they’d be manageable.

For schools, closures are much more likely to be an issue. Setting up a secure way to share lessons and resources online takes time, and often money too. For schools and their increasingly tight budgets, bespoke software and quick fixes are far out of reach. Various free or cheap file sharing services such as SharePoint, Google Drive and Dropbox can make sending resources to students possible, but returning work and grades can easily become a messy affair.

Even if personal data is transferred between staff and students in a secure manner, it’s very difficult to control where it goes next. Most services offer the ability to download files onto your device, and personal devices are under far less scrutiny than those owned by schools and universities. SARS-CoV-2 (the virus causing the disease) is not the only virus that threatens institutions. Should a teacher have a computer virus on their device, the data of their students could be compromised. Does this mean that schools need to provide malware protection for their employees? It’s another cost and another worry to add to the list.

Finding a Way for Face Time:

Assuming schools can find a secure solution to send work home, textbooks and tests aren’t a patch on a good teacher. File sharing can keep children learning long division at home, but what about ‘W plane transformations’ in A level Further Maths?

Many schools are trying to find a solution that lets teachers still run lessons for their pupils. However, with little preparation time, schools are turning to personal accounts on platforms like WhatsApp and Skype. Anyone who has had any involvement in education knows that this rings warning bells. Sharing personal accounts poses significant safeguarding risks. Regulating a private video call between staff and students is nigh on impossible. It would be the responsibility of the individual to record and report any issues, and it’s far too easy to say “I forgot to record the meeting”. Regardless of the safeguarding issues, should your call be recorded by the application you use, your personal data is being held by yet another data processor.

Increased Ratios. Increased Risk:

Should schools and universities remain open, Covid-19 could still cause problems. The Prime Minister has announced that relaxing rules on class sizes could be used to combat staff absences, should the virus take hold. While this could reduce issues with childcare and education, it’s likely to increase pressure on staff. If students are moving class, their data is moving too. Teachers will not only be working under the increased pressure of cramped classrooms, but will also be responsible for safeguarding more personal data than usual, all while more stressed than usual. These are the ideal conditions for a serious data breach. Even working in a different classroom poses risks: when all this is over, no one will be completely sure where all the student and staff records have ended up.

A Necessary Compromise:

Every school, college and university across the country is desperately trying to find a balance. A situation where safeguarding is still prioritised, data protection is ensured, and students can still receive the quality of education they deserve. There is no simple answer, and there will be much relief when the crisis has passed.

For now, we will be closely watching the situation as it unfolds. Preparation is key, and according to Boris Johnson, a lathering of soap and singing Happy Birthday twice over will save us all.

Last week, the University of East Anglia (UEA) paid out over £140,000 in compensation to students affected by a 2017 data breach. An email containing information on personal issues, health problems and circumstances such as bereavement was mistakenly sent to 300 UEA students. The email contained sensitive personal data of over 190 people. UEA reported that the email was sent to a group email address that autofilled when the sender started typing. This very simple mistake had a severe impact on hundreds of already vulnerable students.

This is an all too common example of how a simple slip can have a major impact. As Outsourced DPO for many schools and education institutions, we’ve seen just how often these mistakes are made. A misdirected email is one of the most frequent breaches logged by our customers. And though these breaches are commonplace, mismanaged communications can still have a devastating impact on an organisation.

What’s more, there are additional risks in Higher and Further Education. Unlike in school trusts, student information is often held centrally, rather than in separate faculties or sites. Universities such as the University of London, The Open University and The University of Manchester all have over 100,000 students on their enrolment lists. When mistakes happen — and they will happen — a lot of people could be affected.

As this data breach occurred in 2017, the General Data Protection Regulation was not yet in effect. Had the Information Commissioner’s Office (ICO) decided to fine the University, the amount would have been far lower than if the breach had occurred today. In the event, the ICO decided that no punishment was required. Yet, while the regulatory consequences were low, the University was not absolved of its responsibility to the affected students.

As compensation for the damage from the breach, UEA paid a large sum to all affected students. The breach also damaged UEA’s reputation. Prospective students and their parents may worry about attending a university that is known for leaking personal data. Due to the high levels of media coverage, this breach could affect UEA’s admission rates for quite some time.

How can this be prevented?

The key tool here is knowledge. Providing training on good data protection practice can help staff stay on top of breach risks. The more aware they are, the more mindful they will be when handling personal data. Training can also help staff put their own preventative measures in place. For example, adding a ‘grace period’ to your email account puts a delay on your sent emails, allowing you to cancel them if you realise they are being sent to the wrong person, or contain the wrong information.

Other actions include adding the email recipient last, after writing the email and checking attachments. This gives staff the chance to double-check emails and stops incomplete messages from being sent. A motto we encourage among our customers is “check twice, send once.” Taking the time to review a piece of work saves time and stress in the long run.

What about when breaches can’t be prevented?

It’s important to note that breaches will happen. However hard your organisation works at prevention, mistakes happen to the best of us. Therefore, it is important to put a plan in place to mitigate the effects of a breach. Having a clear pathway for recording and managing breaches can ensure an issue doesn’t spiral into a reportable incident. Once again, knowledge is key. Make sure that members of your organisation know how to recognise a breach, and know exactly who to tell. When you discover a breach, the clock is ticking. Not only do you have just 72 hours to report major breaches to the ICO, but the faster you act on a breach, the more effectively you can contain the impact. To avoid compensation pay-outs, a quick and efficient management process is vital.

The GDPR may feel like a hassle to many, but for all the students who entrust their data to educational institutions, following the regulations is integral to their wellbeing. The University of East Anglia has reminded us just how important data protection is. Effective management, good training, and staff awareness are all incredibly important.

Last week’s post briefly touched on how technological advances are providing new data protection challenges. Earlier this month, the 2020 Consumer Electronics Show (CES) showed the world the new smart devices we can expect to see filtering through the markets soon.

There were many companion robots on show, such as Bellabot (a robot cat waiter that purrs if you stroke its ears) and Lovot (a cuddly robot dedicated to imitating the behaviour of a living being). However, it is the ‘smart home’ developments seen at CES that set off the data protection alarm bells.

Take PantryOn’s smart shelves. Previously, only a diligent individual with a bag of receipts could piece together your purchase history, and it would be a lot of work. With automated shopping lists that update when your pantry is looking empty, all your purchasing behaviour is neatly held in one place. This just adds to the vast array of personal data that is only one breach away from being public knowledge.

The details of our grocery shopping are probably not high-risk personal data, but the data processed by Yukai’s Bocco robots is more worrying. Marketed as both a children’s companion and smart device, the Bocco robot can send and receive messages, and pair with sensors to share when a family member arrives home, what the weather is doing, and whether the front door has been closed correctly. The Bot also gets ‘excited’ when a family member’s birthday is nearing.

Personal details, correspondence, dates of birth, location data and access to smart devices. Yukai’s Bocco Bot can create quite a detailed picture of your habits. If this data set is compromised, it might offer an uncomfortably close view of your family life to the rest of the world.

What this means for education:

It’s unlikely your budget will have space for personal robot companions, but smart technology is already in the classroom. Lecture recording products such as Echo360 and Panopto are regularly found in universities. Schools are using video-based CPD software to help with their teacher training. The question of who owns the personal data these videos represent has already created issues where teaching staff have wanted to take materials from one school to another.

Artificial Intelligence was previously a science fiction fantasy. Now it is being built into classroom management software, suggesting the best seating plan based on recorded information about students’ strengths, weaknesses and behaviour.

These developments in technology increase the volume of personal data being processed and bring in the potential for automated decision making. There are concerns about systems being compromised or unavailable, but the largest issue is the increased complexity of subject access requests. Footage or audio recordings made by staff will hold personal data about students, and this data is time-consuming to extract and redact.

What it means for you:

We need to reach back briefly to the fundamentals of data protection. Personal data can be processed provided there is a lawful basis and only the data strictly necessary for the task is processed. Much new technology is designed to support teaching and learning by processing personal data, so you may have to demonstrate that using that personal data is worth it for the educational benefit.

The key tool here is the Data Protection Impact Assessment (DPIA). The DPIA is designed to evaluate the risks to personal data and the benefits of a new initiative. It also looks at how the risks from an initiative can be mitigated. The process requires input from many sources and is one of the tasks that your data protection officer (DPO) is required to contribute to.

It is possible that the outcome of a DPIA will be that your proposed project does not go forward. If the project puts personal data at risk and provides little benefit, it might be best to stick with your current method. However, that is an uncommon outcome. More often, what you will gain is the confidence that as the smart classroom develops, your data protection will develop with it.

We don’t usually comment on cybersecurity stories, but the breaking news of the issues at Travelex (as reported by the BBC) made me think about the potential loss of access to critical information in an educational setting.

From the information available, ransomware has been placed in the Travelex system, forcing the company to shut down its online currency exchange service and revert to pen and paper across all of its sites.

Ransomware attacks form a relatively small percentage of breaches reported to the ICO (39 in the second quarter of 2019/20), but the education sector accounts for almost 20% of those reports.

Security provider Palo Alto, in its Cyberpedia resource, lists the three most common attack methods for ransomware as visiting compromised websites, malicious email attachments and malicious email links. Many of you will have received one of these emails, which have become increasingly credible in recent months.

If you do get caught by an attack there are various companies who say that they can help you recover without paying the ransom. However, it’s not uncommon for people to pay what’s demanded. 

Ransomware in the Education Setting

In a school situation, a locked-up machine may restrict much more than access to your favourite games and websites. Many people hold large amounts of data on local drives, much of it including personal data and sometimes of a highly confidential nature. More recent attacks have also been shown to steal data as well as encrypt the files on the machine. Even if the ransom is paid, the data may have already been retrieved. Imagine all that information about staff or students being published online for anyone to see. 

From a broader data protection perspective, the first question to ask is: Have you employed data protection by design and default? We know that data on a PC is vulnerable to this type of attack. If it’s personal data, what steps can be taken to remove the risk? 

How to Protect Against Availability Breaches

Moving data off the local machine and onto some form of network storage is the obvious answer. Whether that’s on a local server or in the cloud, moving away from local storage provides an additional level of protection if the machine is compromised.

This also solves the problems created by the use of memory sticks, an area of reportable breaches where education leads the field. If data is available online, then even if you move from location to location, it’s there for you when you need it. 

It’s also worth considering whether some of the personal data should ever be leaving core systems. Most people download information because they want to manipulate it, but often this analysis can be done in the system itself.  

Everyone in an organisation needs to be aware of the risk of clicking on links and attachments if they’re not sure of the origin. A golden rule: if you’re not expecting an email, think carefully before acting on it. If in doubt, go to the supposed sender (through a different contact route) and check whether it was intended.

It’s worth remembering that computers can be lost, stolen or simply break down. It’s not only ransomware that can make the data on them unavailable. Good data protection thinking means that even if you can’t use a particular device, critical data should always be available to the people who need it. 

The UK government has not had a fantastic start to the year. The New Year’s honours list, a list of individuals receiving awards on New Year’s Day, was mistakenly posted with the personal contact details of over a thousand people. While the document was only available for around an hour, many notable—and often controversial—figures had their full addresses listed. The singer Elton John, baker Nadiya Hussain, and former director of public prosecutions Alison Saunders were all included in the breach. Starting the new decade in the Information Commissioner’s Office’s doghouse, the Government is already playing catch-up.

However, they won’t be the only ones. Without strong data protection policies and practices, breaches are inevitable. So, while diets and fitness plans may have already bitten the dust, building a strong framework for GDPR compliance should be a New Year’s resolution that lasts.

Resolution 1: Perfect your data mapping

Data mapping has been a curse for administrative staff across the EU. Yet the benefits of keeping accurate records could not be clearer. Data mapping is a requirement of the GDPR, but it also comes in handy in the event of a Subject Access Request or a breach. For instance, if a fire occurred in the archive room, a record of all documents held there would help with recovery.

Knowing exactly where all your data is held can reduce the strain when a problem occurs. As a New Year’s resolution, precise data mapping is a must-have.

Resolution 2: Learn to Recognise a Breach

The breaches seen most on the news are caused by cyber-attacks or ransomware. Incidents such as the Travelex ransomware debacle often make the headlines. However, breaches caused by human error are much more likely, and usually a lot harder to spot. Learning to recognise possible breaches quickly means you can manage and mitigate them before they cause a problem. Issues such as missing files or incorrect information are often ignored, or left unreported by staff for fear of being reprimanded. Without a positive culture around data protection, it’s likely you’ll end up dealing with more serious consequences from breaches.

When moving your organisation forward this year, encouraging an atmosphere where staff feel able to speak up should be a priority.

Resolution 3: Plan. Plan. Plan.

When an organisation discovers a breach, a ticking timer starts. If a breach is serious and needs to be reported to the Information Commissioner’s Office (ICO), it must be reported within 72 hours of discovery. This includes weekends and bank holidays. When a breach occurs, it is vital to have a tried and tested plan in place.

This year, make sure all your breaches are managed smoothly. Establish clear steps to report the breach internally, gather detailed information, and judge whether the breach is serious enough to be reported to the ICO. Ideas such as a specified email address for reporting breaches and a designated team for managing them can help your organisation have a stress-free 2020.

Resolution 4: Data Protection by Design

The final New Year’s resolution is about forward thinking. We must think about protecting personal data in the future, as well as right now. Advances in technology mean that organisations collect more and more personal data as we go about our days. For instance, Samsung’s Ballie bot debuted this week: a tennis-ball-sized robot that follows its owner around, captures ‘special moments’ via camera, and assists with personal fitness and household chores. While most organisations won’t be using miniature robots any time soon, new processes and technology can be expected throughout the 2020s.

As we move into a new age, we must strive to perfect our GDPR compliance in the present, and design our advances with the protection of personal data at the forefront of our minds.

The time has come. Tinsel is up, chestnuts are roasting, and Santa is preparing his “Naughty or Nice list”. However, in this time of tradition, should we be thinking of the new data protection laws? Is St Nick in breach of the GDPR?

Well, he might be. Having a list of all the boys and girls is not as simple as it used to be. Many schools have ended up in a pickle when parents asked for a class list to help their children write Christmas cards. It seems a pointless worry, since kids know all their classmates already, but in terms of the GDPR, a school should not provide that personal data without consent. Not only would it breach the GDPR, but confirming that a child attends a certain school could put that child’s safety at risk. Either way, if you’re asked for a list of names in a class, the answer will likely be no.

So, what about Santa? We know Santa has a list of every child in the world! In fact, as he sees when you are sleeping, and he knows when you’re awake, Santa is collecting personal data all the time. How do we know he’s holding our data responsibly? When he gives our wish lists to his elves, is he breaching our personal data?

That really depends on whether Santa is in scope of the GDPR. If Santa is an individual, who provides presents out of the goodness of his heart, then he doesn’t need to worry about our new data protection laws. However, if he and his helpers are an organisation, compliance should be at the top of his to-do list. He really should ask permission from us all to judge our worthiness for more than a lump of coal.

Unless of course, Santa is a public body. If we signed Santa’s role into law, Santa could perform his task for the good of the people. Much like schools, Santa would need a Data Protection Officer, regardless of the number of elves in his employ. While he could try to justify his surveillance as a public task, he’d still need to record his data mapping. In fact, along with your wish list, you could send a Subject Access Request up to the North Pole.

As for his list of all the girls and boys – if our “naughty or nice” rating is decided automatically, he should have our explicit consent to do so! Even if Santa makes all the decisions himself, a list of the name and address of every child in the world sounds like a massive breach risk.


However, as with everything in the GDPR, pragmatism is key. As with educational institutions, it’s important to find a compromise where compliance is met and the organisation can still run. In terms of Christmas cards and class lists, you can encourage children to write their own lists. There’s nothing wrong with an individual looking around and writing down the names of the people in the room.

Where Santa is concerned, he appears to be an ageless and all-knowing creature with the ability to travel faster than light. We can probably trust him with our personal data. Although, if Santa is keen on staying compliant, keep an eye out for a privacy notice flying down the chimney soon.

A report published last week by the career-focused social network LinkedIn identified the “Emerging Jobs” of 2020 in the UK. The report, which can be found here, looks at the roles that are experiencing significant growth.

At number one is “Artificial Intelligence Specialist”, confirming that this field is expanding out of the academic realm and into the mainstream. At number two, having experienced a 24-fold increase since 2015, is the role of Data Protection Officer.

Of course, one of the subjects exercising the minds of these new Data Protection Officers is the work of Artificial Intelligence Specialists. In the Education sector, in addition to being a subject of research and development, there is a great deal of interest in whether AI can lead a revolution in tailored teaching and learning.

Last year, alongside the GDPR, the EU produced a well-considered but concerning report called The Impact of Artificial Intelligence on Learning, Teaching, and Education. The report concluded there was great potential for AI in education, but that potential brought with it some significant risks.

We’re not talking about machines with human-like intelligence just yet. Instead, this means the application of machine learning, grounded in the massive amount of data generated on a continuous basis. By 2025 the amount of data generated globally is estimated to reach 175 zettabytes – enough that, at standard internet speeds, it would take nearly 2 billion years to download it all! (IDC, 2018).
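As a rough sanity check of that headline figure, the arithmetic can be sketched in a few lines. The 25 Mbps “standard” connection speed is our own assumption for illustration, not a figure from the IDC report:

```python
# Rough sanity check: how long would 175 zettabytes take to download
# at an assumed "standard" connection speed of 25 megabits per second?

total_bits = 175 * 10**21 * 8   # 175 ZB expressed in bits
speed_bps = 25 * 10**6          # assumed 25 Mbps (our assumption)

seconds = total_bits / speed_bps
years = seconds / (60 * 60 * 24 * 365.25)

print(f"{years / 1e9:.2f} billion years")  # → 1.77 billion years
```

Which lands comfortably in the “nearly 2 billion years” ballpark the report quotes.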

There are some questions about this AI approach. Firstly, the principle of machine learning is that the algorithms are based on historical data. This means that the criteria for measuring success may be based on data with inherent bias. It’s also true that the past is not necessarily a good guide to the future. Such techniques don’t cope well with creativity and innovation.
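The bias point can be illustrated with a deliberately simple sketch. The data below is invented, and the “model” is just a per-group historical pass rate rather than any real assessment system, but it shows the mechanism: a predictor fitted to skewed historical outcomes reproduces that skew in every new decision:

```python
# Toy illustration (invented data): a naive "predictor" learns the
# historical pass rate for each group, so any bias baked into the old
# outcomes is carried straight into future predictions.

historical = [
    ("school_A", True), ("school_A", True), ("school_A", True), ("school_A", False),
    ("school_B", True), ("school_B", False), ("school_B", False), ("school_B", False),
]

def train(records):
    rates = {}
    for group, passed in records:
        total, passes = rates.get(group, (0, 0))
        rates[group] = (total + 1, passes + passed)
    # Predict "pass" only where the historical majority passed.
    return {g: passes / total >= 0.5 for g, (total, passes) in rates.items()}

model = train(historical)
print(model)  # {'school_A': True, 'school_B': False}
```

Every future pupil from school_B is predicted to fail, regardless of individual merit, purely because of the historical record – which is exactly the concern with success criteria derived from biased data.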

But perhaps a greater concern is the fact that AI driven systems will find employment not just in teaching but in assessment. ‘Big Data’ driven machine learning can create highly detailed categories of outcomes and place people into them on the basis of their behaviour. The scope of data used in these algorithms would also be much larger than current techniques. There are suggestions that in addition to student responses, video footage could also be used to judge their level of engagement and behaviour.

At the end of this process lies the possibility of classroom technology that manages both the delivery of material and the assessment of progress. It’s easy to see how this would be attractive across many settings, including the growing world of online learning.

How this Links to Data Protection

Coming back to the number-two emerging role, the Data Protection Officer, these types of AI proposals create some major challenges. The Information Commissioner’s Office has an open consultation on the use of AI in processes that make decisions about individuals. One significant question often raised is how you explain to data subjects the decision-making process of your AI-driven system.

Although the processes are based on algorithms, the sheer number of factors taken into account mean it can be hard for anyone, other than the AI specialists, to understand how the decisions are made. As a data controller, if you’re using such a system, you need to be able to explain how it works in a straightforward enough way for your data subjects to understand.

There are suggestions that data about a student’s attendance, achievement, behaviour and progress could also be analysed to provide clues that there may be other issues in their lives affecting their ability to learn. This could include problems at home or other personal issues. While this could help identify the need for lifesaving support and prevention methods, there could be significant ramifications from a data protection standpoint.

As we look forward to a new decade, it’s essential we recognise the full implications of data-driven education technology.

The main vehicle for this is the data protection impact assessment (DPIA). Alongside the standard considerations of the way that data is transferred, evaluated and managed, the DPIA needs to take a broader view of the initiative.

It’s worth remembering that just because a system can perform a particular function, it doesn’t mean that you must use that capability. Decisions made about the path any student should follow can have life-changing consequences. The prospect of these decisions being made, in part, by algorithms that are difficult to explain should give us all pause for thought.

As AI begins to appear in the mainstream, it is pleasing to see a rise in Data Protection Officers. So, we welcome the top two emerging roles and can see a long future of discussion and debate about pragmatic use of AI in education.