Algorithmic and AI risks in education

Algorithms and AI are used on an increasing scale in the education sector, in primary and secondary education as well as in higher education and vocational training. Educational institutions are, for example, deploying adaptive learning systems (AL systems) with the aim of automating the delivery of customised education. We are also seeing increased deployment of learning analytics (LA systems), which educational institutions use to improve student progression or the quality of education.

Challenges and risks 

Responsible deployment of algorithms and AI in education is crucial, as is awareness of their limitations. After all, education is all about the development of children and young adults. Here are a few examples of challenges and risks identified in the fall 2023 edition of the Dutch Data Protection Authority’s AI & Algorithmic Risks Report Netherlands (ARR):

  • Adaptive learning systems struggle to identify the learning needs of pupils and students who master the learning material less well than their peers. Adaptive learning may therefore work better for some groups of pupils and students than for others, which constitutes a risk. A human teacher knows more about a child, such as their home situation or attention span, than an adaptive learning system can. It is important, therefore, that teachers and AL systems continue to complement each other.
  • During their training, teachers learn to assess the quality and suitability of teaching materials. They become very skilled at this when it comes to ‘traditional’ teaching materials such as textbooks and workbooks. Assessing the usefulness of algorithmic teaching materials, such as adaptive learning systems, requires a very different kind of knowledge. The question is whether teachers currently have that knowledge.
  • Educational institutions often see adaptive learning systems merely as a means to free up time that teachers can then devote to other tasks, but that view is overly simplistic. Teachers must, for example, be able to intervene when a learning system makes the wrong choices, which means they still have to stay focused on the group no matter what, albeit now through the learning system. They also need some understanding of how the adaptive learning system works in order to intervene. Using an adaptive learning system will therefore not free up all the time the teacher used to spend on exercises: the teacher’s role changes, and the new role still takes time.
  • The deployment of learning analytics to improve individual learning outcomes requires the profiling of students and, therefore, involves the processing of their personal data. This always constitutes an intrusion into their privacy. Deploying LA systems without a clear purpose and without clearly defined frameworks hugely increases the chance of unlawful and irresponsible use.
  • Negligent deployment of LA systems increases the chance of students being treated unfairly. When educational institutions use data and profiles without knowing what the data does and does not mean and what conclusions can be drawn from it, unfair evaluations may result.
  • In secondary education, students are using generative AI for their homework assignments. Institutions in vocational training and higher education still barely have any policy in place on what their students are and are not allowed to use generative AI for. This also creates risks of plagiarism and misinformation. See also: Generative AI risks.

What is needed?

The Dutch Data Protection Authority advises educational institutions to, among other things, include the deployment of AI in their IT strategy and to ensure adequate support from internal and external experts. Another focus point is increasing teachers’ knowledge of AI.

To find out more about the algorithmic and AI risks in education identified by the Dutch Data Protection Authority (AP), read Chapter 4 of the fall 2023 edition of the ARR. 

Want to know more?

Want the full picture on this subject? Take a look at Chapter 3 of the fall 2023 edition of the AI & Algorithmic Risks Report Netherlands (ARR). The Dutch Data Protection Authority (AP) publishes the ARR twice a year and addresses several key risks on its website.