Last week’s post briefly touched on how technological advances are providing new data protection challenges. Earlier this month, the 2020 Consumer Electronics Show (CES) showed the world the new smart devices we can expect to see filtering through the markets soon.

There were many companion robots on show, such as BellaBot (a robot cat waiter that purrs if you stroke its ears) and Lovot (a cuddly robot dedicated to imitating the behaviour of a living being). However, it is the ‘smart home’ developments seen at CES that set off the data protection alarm bells.

Take PantryOn’s smart shelves. A diligent individual with a bag of receipts could potentially track your purchase history, but it would be a lot of work. With automated shopping lists that update when your pantry is looking empty, all your purchasing behaviour is neatly held in one place. This adds to the vast array of personal data that is only one breach away from becoming public knowledge.

The details of our grocery shopping are probably not high-risk personal data, but the data processed by Yukai’s Bocco robots is more worrying. Marketed as both a children’s companion and a smart device, the Bocco robot can send and receive messages, and pair with sensors to share when a family member arrives home, what the weather is doing, and whether the front door has been closed correctly. The bot also gets ‘excited’ when a family member’s birthday is nearing.

Personal details, correspondence, dates of birth, location data and access to smart devices: Yukai’s Bocco robot can create quite a detailed picture of your habits. If this data set were compromised, it might offer the rest of the world an uncomfortably close view of your family life.

What this means for education:

It’s unlikely your budget will have space for personal robot companions, but smart technology is already in the classroom. Lecture recording products such as Echo360 and Panopto are regularly found in universities, and schools are using video-based CPD software to help with their teacher training. The question of who owns the personal data these videos represent has already created issues where teaching staff have wanted to take materials from one school to another.

Artificial intelligence was previously a science fiction fantasy; now it is being built into classroom management software, suggesting the best seating plan based on recorded information about students’ strengths, weaknesses and behaviour.

These developments in technology increase the volume of personal data being processed, and they introduce the potential for automated decision-making. There are concerns about the systems being compromised or unavailable, but the largest issue is the increased complexity of subject access requests. Footage or audio recordings made by staff will hold personal data about students, and this data is time-consuming to extract and redact.

What it means for you:

We need to reach back briefly to the fundamentals of data protection. Personal data can be processed provided there is a lawful basis and only the data strictly necessary for the task is processed. Much new technology is designed to support teaching and learning by processing personal data, so you may have to demonstrate that using that personal data is justified by the educational benefit.

The key tool here is the Data Protection Impact Assessment (DPIA). The DPIA is designed to evaluate the risks to personal data and the benefits of a new initiative. It also looks at how the risks from an initiative can be mitigated. The process requires input from many sources and is one of the tasks that your data protection officer (DPO) is required to contribute to.

It is possible that the outcome of a DPIA will be that your proposed project does not go forward: if the project puts personal data at risk and provides little benefit, it might be best to stick with your current method. However, that is an uncommon outcome. More often, what you will gain is the confidence that as the smart classroom develops, your data protection will develop with it.