EU removes ‘emotional robots’ from the classroom

(Picture credit: Sergio Martínez (2008), ‘Computer Vision’. Flickr: https://flic.kr/p/5Ga9NK, CC BY 2.0)

The new European Union AI Act bans ‘unacceptably risky’ AI applications, including the emotion-aware systems that educational researchers believe can significantly improve learning. Is the EU right to outlaw emotional AI for learning alongside such nefarious uses as social scoring and behavioural manipulation? And with emotional AI diffusing into mainstream consumer products, including virtual reality headsets, might the ban be a backward step for European learners?

The EU AI Act, heralded as the world’s first comprehensive legal framework for AI, is expected to come into force in April 2024 (European Parliament, 2024). The act identifies certain AI applications as presenting an ‘unacceptable’ level of risk to citizens’ rights (European Commission, 2024). These are: biometric categorisation systems based on sensitive characteristics; untargeted scraping of facial images from the internet or from CCTV footage to create facial recognition databases; social scoring; predictive policing; and AI systems that manipulate human behaviour or exploit individuals’ vulnerabilities.

The bans on these applications will apply six months after the act enters into force: on that timetable, from October 2024.

The list of outlawed applications also includes the use, in educational institutions, of AI systems that can recognise human emotions. State-of-the-art ‘emotional AI’ systems use machine learning algorithms to detect, recognise and even respond to human affective states. Imagine an educational computer game that can recognise when a learner is confused or frustrated and adapt the learning experience to the learner’s emotional state to secure the best outcomes.
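
What might that look like in practice? Below is a minimal sketch of such an emotion-adaptive lesson loop, written in Python. Everything in it is hypothetical: the emotion labels, the classify_emotion stub (standing in for a trained facial-expression model) and the adaptation rules are illustrative assumptions, not any real product’s design.

    import random
    from dataclasses import dataclass

    @dataclass
    class LessonState:
        difficulty: int = 3         # 1 (easiest) to 5 (hardest)
        hints_enabled: bool = False

    def classify_emotion(frame) -> str:
        # Stand-in for a trained facial-expression model that would
        # run over webcam frames in a real system.
        return random.choice(["engaged", "neutral", "confused", "frustrated"])

    def adapt(state: LessonState, emotion: str) -> LessonState:
        # Ease off when the learner struggles; stretch them when engaged.
        if emotion in ("confused", "frustrated"):
            state.difficulty = max(1, state.difficulty - 1)
            state.hints_enabled = True
        elif emotion == "engaged":
            state.difficulty = min(5, state.difficulty + 1)
            state.hints_enabled = False
        return state

    state = LessonState()
    for frame in range(5):          # one adaptation step per observed frame
        state = adapt(state, classify_emotion(frame))
        print(state)

The point of the sketch is the feedback loop: sense an affective state, then adjust the difficulty and support the system offers accordingly.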

Researchers believe that emotional AI systems can create personalised learning experiences and support the development of social and emotional learning. Enthusiasts say such systems might one day provide every student with an empathic, emotionally intelligent learning companion.

Soft systems

In lab tests, researchers have demonstrated that emotional AI applications offer many benefits in education, including improvements in learners’ attention, engagement, confidence, motivation, comprehension, information processing and recall (Leutner, 2014; Mayer, 2014; Sandanayake and Madurapperuma, 2013). Critics counter that such benefits are dubious because they rest mostly on small-scale studies that have not been replicated outside the lab. Others question whether these systems can reliably recognise and categorise emotions at all (McStay, 2019). The empirical basis of the benefits claimed for emotional AI may be unsteady.

Emotional AI is not new. In fact, university researchers have been building emotional AI systems, and deploying them in the classroom, since the 1990s (Picard, 1995). What is new is the accuracy and power that machine learning and deep learning methods bring. Using unsupervised representation learning (a machine learning technique for finding patterns in large volumes of data without labels), it is now possible to build systems that detect and classify emotions from raw, unlabelled data drawn from multiple sources: facial expressions, voice, text and even physiological signals such as skin conductance and the electrical activity of the heart (Schuller et al., 2021). Human emotions are now machine readable.
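
The paragraph above describes a two-stage pipeline: learn representations from unlabelled data first, then attach a small classifier that needs comparatively few labelled examples. Here is a minimal sketch of that idea in Python using PyTorch; the dimensions, architecture and emotion labels are illustrative assumptions, not taken from Schuller et al. (2021).

    import torch
    import torch.nn as nn

    # Stage 1: an autoencoder learns a compact representation of raw,
    # unlabelled multimodal features (e.g. face, voice, skin conductance).
    raw_dim, latent_dim = 128, 16
    encoder = nn.Sequential(nn.Linear(raw_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))
    decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, raw_dim))

    unlabelled = torch.randn(1024, raw_dim)   # stand-in for real recordings
    opt = torch.optim.Adam([*encoder.parameters(), *decoder.parameters()], lr=1e-3)
    for _ in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(decoder(encoder(unlabelled)), unlabelled)
        loss.backward()
        opt.step()

    # Stage 2: a lightweight classifier maps the learned representations
    # to emotion labels, needing far fewer labelled examples than
    # training the whole system end to end would.
    emotions = ["neutral", "engaged", "confused", "frustrated"]
    clf = nn.Linear(latent_dim, len(emotions))
    labelled_x = torch.randn(64, raw_dim)
    labelled_y = torch.randint(0, len(emotions), (64,))
    opt2 = torch.optim.Adam(clf.parameters(), lr=1e-3)
    for _ in range(200):
        opt2.zero_grad()
        logits = clf(encoder(labelled_x).detach())   # keep the encoder frozen
        nn.functional.cross_entropy(logits, labelled_y).backward()
        opt2.step()

The design choice worth noting is the split itself: the expensive learning happens on plentiful unlabelled signals, and only the final mapping to emotion categories needs human-labelled data.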

Rights and wrongs

Sounds creepy, right? Indeed, critics of emotional AI argue that these systems are problematic in terms of social values, ethics and the law. Use of emotional AI in educational settings may contravene the United Nations Convention on the Rights of the Child (OHCHR, 1990), including the child’s right to freedom of thought (Article 14) and privacy (Article 16). If human emotions are machine readable, the fear is that children’s emotional data will be exploited by commercial interests to develop profit-making systems for use in other contexts and for less wholesome purposes. Is this in the public good?

The passing of the EU AI Act ends the argument over emotional AI in education by banning it across the 27 member states and in countries that align their rules with the bloc’s. Or does it?

Pass or fail

For decades now, development of emotional AI systems in education has been in the hands of academic researchers and educators who, we must assume, are concerned with enabling all learners to reach their fullest potential (as they are bound to do under Article 29 of the UN Convention). The EU AI Act will put an end to responsible development of emotional AI within European educational institutions, and it may give commercial interests an incentive to market emotional-AI-powered apps directly to learners, outside the classroom. Emotion-sensitive systems are now converging with multimodal machine learning models and with consumer hardware such as smart glasses, extended reality headsets and other mobile and wearable AI-enabled devices costing as little as 180 euros. The potential of these devices to create personalised, context-aware, adaptive, multi-sensory learning experiences is vast, and so too is the risk of harm from irresponsible and exploitative uses.

The EU AI Act may end up denying learners the benefits of responsible emotional AI systems while failing to protect them from the risks of commercialisation of this technology.

Time will tell if the act is a pass or fail for education in Europe.

References

European Commission (2024) ‘AI Act’. Available at: https://digital-strategy.ec.europa.eu/en/policies/regulatory-framework-ai (Accessed 24 March 2024).

European Parliament (2024) ‘Artificial Intelligence Act: MEPs adopt landmark law’. Available at: https://www.europarl.europa.eu/news/en/press-room/20240308IPR19015/artificial-intelligence-act-meps-adopt-landmark-law (Accessed 21 March 2024).

Leutner, D. (2014) ‘Motivation and emotion as mediators in multimedia learning’, Learning and Instruction, 29, pp. 174–75. Available at: https://doi.org/10.1016/j.learninstruc.2013.05.004 (Accessed 3 January 2024).

Mayer, R.E. (2014) ‘Incorporating motivation into multimedia learning’, Learning and Instruction, 29, pp. 171–73. Available at: https://doi.org/10.1016/j.learninstruc.2013.04.003 (Accessed 3 January 2024).

McStay, A. (2019) ‘Emotional AI and EdTech: serving the public good?’, Learning, Media and Technology, 45(3), pp. 270–84. Available at: https://doi.org/10.1080/17439884.2020.1686016 (Accessed 21 March 2024).

Office of the United Nations High Commissioner for Human Rights (OHCHR) (1990) Convention on the Rights of the Child. Available at: https://www.ohchr.org/Documents/ProfessionalInterest/crc.pdf (Accessed 24 March 2024).

Picard, R. (1995) ‘Affective computing’, MIT Media Laboratory Perceptual Computing Section Technical Report No. 321. Available at: https://hd.media.mit.edu/tech-reports/TR-321.pdf (Accessed 20 January 2024).

Sandanayake, T.C. and Madurapperuma, A.P. (2013) ‘Affective e-learning model for recognising learner emotions in online learning environment’, 2013 International Conference on Advances in ICT for Emerging Regions (ICTer), pp. 266–71. Available at: https://doi.org/10.1109/ICTer.2013.6761189 (Accessed 20 January 2024).

Schuller, B.W., Picard, R., André, E., Gratch, J. and Tao, J. (2021) ‘Intelligent signal processing for affective computing (from the guest editors)’, IEEE Signal Processing Magazine, 38(6), pp. 9–11. Available at: https://doi.org/10.1109/MSP.2021.3096415 (Accessed 7 February 2024).