Not-so-critical AI literacy

Much of the ‘critical’ academic discourse on ethics and accessibility in relation to new technology does not eat its own dog food. It rarely takes the time to carefully define, categorise and evaluate particular technologies, and it mostly fails to propose credible actions or mitigations.

Please mind the gap between higher education and the technology ecosystem

Higher ed has considerable power to positively influence the tech ecosystem, not by moaning about capitalism or waiting to see what happens, but by quickly identifying and aggressively advancing innovative and progressive use cases. So when do we start?

Reimagining learning technology futures with speculative design

The purpose of these approaches is not to predict a single future, but to imagine, experience and feel multiple possible futures, and to discuss and debate what we want our futures to be like, and what we do not.

EU removes ‘emotional robots’ from the classroom

The new EU AI Act bans ‘unacceptably risky’ AI applications, including the emotion-aware systems that some educational researchers believe can significantly improve classroom learning. Is the EU right to outlaw emotional AI for learning alongside such nefarious uses as social scoring and behavioural manipulation? And with emotional AI diffusing into mainstream consumer products, including virtual reality headsets, might the ban be a backward step for European learners?

Week notes: 19 June 2023

This week I am thinking about the innovation that isn’t. The innovation that doesn’t happen. I don’t mean innovation that fails. I mean innovation that simply doesn’t occur. The innovation that isn’t. Might machine and social learning reverse the stagnation of ideas?