A report published last week by the career-focused social network LinkedIn identified the “Emerging Jobs” of 2020 in the UK. The report, which can be found here, looks at the roles that are experiencing significant growth.
At number one is “Artificial Intelligence Specialist”, confirming that this field is expanding out of the academic realm and into the mainstream. At number two, having experienced a 24-fold increase since 2015, is the role of Data Protection Officer.
Of course, one of the subjects exercising the minds of these new Data Protection Officers is the work of Artificial Intelligence Specialists. In the education sector, AI is not only a subject of research and development; there is also a great deal of interest in whether it can lead a revolution in tailored teaching and learning.
Last year, alongside the GDPR, the EU produced a well-considered but concerning report called The Impact of Artificial Intelligence on Learning, Teaching, and Education. The report concluded there was great potential for AI in education, but that potential brought with it some significant risks.
We’re not talking about machines with human-like intelligence just yet. Instead, this is the application of machine learning, grounded in the massive amount of data generated on a continuous basis. By 2025, the amount of data generated globally is estimated to reach 175 zettabytes – enough that at standard internet speeds it would take nearly two billion years to download it all! (IDC, 2018).
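As a back-of-the-envelope check of that download-time claim, here is a quick calculation. The 25 Mbps “standard” connection speed is an assumption on our part, not a figure from the IDC report:

```python
# Rough sanity check of the "nearly 2 billion years" figure.
# Assumption (not from the source): a "standard" connection of 25 Mbps.

ZETTABYTE_BYTES = 10**21

data_bits = 175 * ZETTABYTE_BYTES * 8   # 175 zettabytes expressed in bits
speed_bps = 25 * 10**6                  # 25 megabits per second

seconds = data_bits / speed_bps
years = seconds / (60 * 60 * 24 * 365.25)

print(f"{years / 10**9:.2f} billion years")  # roughly 1.8 billion years
```

At faster speeds the figure shrinks proportionally, but the order of magnitude makes the point: this is far more data than any one party could ever inspect.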
There are some questions about this AI approach. First, machine learning algorithms are trained on historical data, which means the criteria for measuring success may inherit any bias present in that data. It’s also true that the past is not necessarily a good guide to the future: such techniques don’t cope well with creativity and innovation.
But perhaps a greater concern is the fact that AI-driven systems will find employment not just in teaching but in assessment. ‘Big Data’-driven machine learning can create highly detailed categories of outcomes and place people into them on the basis of their behaviour. The scope of data used in these algorithms would also be much larger than in current techniques: there are suggestions that, in addition to student responses, video footage could be used to judge students’ level of engagement and behaviour.
At the end of this process is the clear possibility of classroom technology that would manage both the delivery of material and the assessment of progress. It’s easy to see how this would be attractive across many settings, including the growing world of online learning.
How this Links to Data Protection
Coming back to the number-two emerging role, the Data Protection Officer, these types of AI proposals create some major challenges. The Information Commissioner’s Office has an open consultation on the use of AI in processes that make decisions about individuals. One significant question often raised is how you explain to data subjects the decision-making process of your AI-driven system.
Although the processes are based on algorithms, the sheer number of factors taken into account means it can be hard for anyone other than the AI specialists to understand how the decisions are made. As a data controller, if you’re using such a system, you need to be able to explain how it works in a way straightforward enough for your data subjects to understand.
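To illustrate what an explainable decision looks like, here is a minimal sketch of a simple linear scoring model whose output can be broken down factor by factor for a data subject. The feature names and weights are entirely invented for illustration; real systems with thousands of interacting factors are precisely where this kind of breakdown stops being possible:

```python
# Hypothetical sketch: a linear score that can be explained per factor.
# Feature names and weights are invented, not from any real system.

weights = {
    "quiz_average": 0.5,
    "attendance_rate": 0.3,
    "assignments_submitted": 0.2,
}

def score_with_explanation(student: dict) -> tuple[float, dict]:
    """Return an overall score plus each factor's contribution to it."""
    contributions = {
        feature: weights[feature] * student[feature]
        for feature in weights
    }
    return sum(contributions.values()), contributions

score, why = score_with_explanation(
    {"quiz_average": 0.8, "attendance_rate": 0.9, "assignments_submitted": 1.0}
)
# Each entry in `why` can be shown to the data subject as a reason.
```

With a model like this, the controller can say exactly how much each factor moved the outcome; with a deep black-box model, no such honest decomposition exists, which is the heart of the explainability challenge.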
There are suggestions that data about a student’s attendance, achievement, behaviour and progress could also be analysed to provide clues that there may be other issues in their lives affecting their ability to learn. This could include problems at home or other personal issues. While this could help identify the need for potentially life-saving support and prevention, there could be significant ramifications from a data protection standpoint.
As we look forward to a new decade, it’s essential we recognise the full implications of data-driven education technology.
The main vehicle for this is the data protection impact assessment (DPIA). Alongside the standard considerations of the way that data is transferred, evaluated and managed, the DPIA needs to take a broader view of the initiative.
It’s worth remembering that just because a system can perform a particular function, it doesn’t mean you must use that capability. Decisions made about the path any student should follow can have life-changing consequences. The prospect of these decisions being made in part by algorithms that are difficult to explain should give us all some pause for thought.
As AI begins to appear in the mainstream, it is pleasing to see a rise in Data Protection Officers. So, we welcome the top two emerging roles and can see a long future of discussion and debate about pragmatic use of AI in education.