Accessibility Tools
Our current university system is not systematically meeting the needs of the neurodivergent community, write Monique Mackenzie and Elliott Spaeth, who highlight artificial intelligence’s potential for education that serves all learners.
As much as 20% of the population is estimated to be neurodivergent – a community that encompasses conditions such as autism, attention deficit hyperactivity disorder (ADHD), dyslexia, dyscalculia and dyspraxia.
Despite this prevalence, neurodivergent students do not experience university as successfully as their neurotypical peers: they tend to access programmes that underutilise their abilities and learning potential. Moreover, they face a barrage of challenges during their studies that make them more likely to drop out, despite being academically capable and sometimes exceptional. Even after university, the proportion of autistic graduates who transition into graduate-level employment or continue to study is lower than for any other ‘disabled’ group.
Navigating the punchy combination of packed lecture halls, noisy workshops, sudden changes and the social masking often required of our neurodivergent community to interact successfully with others can be a mammoth task. Furthermore, success at meeting these challenges is not simply a matter of better self-awareness or time-management skills; most neurodivergent students are acutely aware of how various activities affect their energy budgets and, where they can, plan their time accordingly.
We all benefit when everyone can contribute to solving our wicked global challenges, regardless of their location on the cognitive continuum. However, our current university system is not systematically meeting the needs of our neurodivergent community. Providing tailored education in line with diverse needs can change this. In particular, artificial intelligence (AI) may help universities to foster more inclusive education for all learners.
AI presents a wonderful opportunity to provide all students, however marginalised, with a personalised education. Indeed, the EU’s recent AI Act notably makes reference to AI for access to university, personalised learning and assessment, although it deems these uses ‘high-risk’. While careful planning and critical assessment must precede the rollout of any AI tool, we would be doing our students a disservice if we did not make progress here.
For instance, clever use of a virtual learning environment (to support in-person provision) can provide additional content to students with evidenced knowledge gaps (identified through poor performance in formative assessments) before they progress to summative assessment. This digital environment helps close knowledge gaps born of an inadequate, or unrelated, preparatory education, especially for (neurodivergent) students who find interacting with instructors difficult and/or must tolerate noisy, brightly lit environments in order to attend classes.
This personalisation can also be underpinned by a no-detriment approach: every student benefits from recommendations that target knowledge gaps or point to extension material, while the full range of materials remains available to all.
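As a rough illustration only, the logic behind such a no-detriment recommendation could be as simple as the sketch below; the pass threshold, topic names and catalogue structure are hypothetical assumptions rather than features of any particular virtual learning environment.

```python
# Minimal sketch of a no-detriment recommendation rule (illustrative only).
# The threshold, topics and catalogue layout are hypothetical assumptions.

FORMATIVE_PASS_THRESHOLD = 0.7  # assumed cut-off for flagging a knowledge gap

def recommend_materials(formative_scores, catalogue):
    """Suggest remedial or extension material per topic.

    Every student keeps access to the full catalogue; only the
    recommendations differ (the no-detriment principle).
    """
    recommendations = {}
    for topic, score in formative_scores.items():
        if score < FORMATIVE_PASS_THRESHOLD:
            recommendations[topic] = catalogue[topic]["remedial"]
        else:
            recommendations[topic] = catalogue[topic]["extension"]
    return recommendations

# Example: a student strong on statistics but with a gap in algebra
scores = {"algebra": 0.55, "statistics": 0.85}
catalogue = {
    "algebra": {"remedial": ["Refresher: solving linear equations"],
                "extension": ["Challenge: matrix decompositions"]},
    "statistics": {"remedial": ["Refresher: descriptive statistics"],
                   "extension": ["Challenge: Bayesian inference"]},
}
print(recommend_materials(scores, catalogue))
```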
Outside of a personalised education, AI offers other advantages. For example, the use of AI-related tools in a student’s education helps to foster a ‘lived experience’ as preparation for the modern world of work; the World Economic Forum estimates that by 2025, AI is set to deliver 58 million extra jobs, almost all of which will require AI-based experience.
AI-based virtual assistants can also benefit students who need to study outside of working hours and/or would not seek help in person. This round-the-clock assistance can help students navigate university life, from coursework to administrative tasks.
Large language models (LLMs) can assist students’ learning by organising learning materials in new ways and allowing questions to be asked of the (pre-specified) materials. That said, they should be used with a great deal of caution; LLMs are trained to produce convincing-looking text but are indifferent to the truth of their outputs. However, whether we like it or not, LLMs are broadly used: in the UK, almost all (92%) students use AI in some form, for example to explain concepts and summarise articles, and almost 1 in 5 report having included AI-generated text in their work.
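To make concrete what asking questions of pre-specified materials might involve, the sketch below grounds a question in a fixed set of course notes before any model is consulted; the keyword-overlap scoring is deliberately naive, and the names and note text are hypothetical rather than taken from any real system.

```python
# Toy sketch of grounding a question in pre-specified course materials.
# Real systems would use embeddings and an LLM; here, naive keyword overlap
# picks the passage a question should be answered from.

def overlap_score(question, passage):
    """Count words shared between the question and a passage."""
    return len(set(question.lower().split()) & set(passage.lower().split()))

def most_relevant_passage(question, passages):
    """Return the passage from the fixed material set that best matches."""
    return max(passages, key=lambda p: overlap_score(question, p))

course_notes = [
    "Formative assessment gives feedback during learning without counting towards grades.",
    "Summative assessment measures achievement at the end of a unit of study.",
]
print(most_relevant_passage("What is formative assessment for?", course_notes))
```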
AI-assisted automated translation (in near real time) can also greatly improve understanding and participation for learners from different language backgrounds, who are usually underserved by traditional curricula.
AI-based tools are coming onto the market thick and fast, and some are even funded for students with educational needs, at least in the UK. With promises of audio summaries of papers to assist students with visual impairments or dyslexia, note-taking software that generates transcripts and summaries, tools to create presentations and even AI-based ‘coaching’ for effective delivery of those presentations, there is a lot to explore and critically assess. Only by using these tools can we gain a grounded understanding of when they are useful and, crucially, when they are not.
Alongside these opportunities, there has been substantial media attention on the risks of AI and the governance needed for its responsible use; the EU’s AI Act also speaks in part to this. However, in our experience, universities are routinely excellent at ethics and governance. So, our main concern is that regulation will slow or even completely stall the development of personalisation in favour of ‘mass education’, which arguably serves no one well – particularly those who are already underrepresented.
And more personally from the authors – one a parent of a neurodivergent teenager in receipt of an online secondary school education with personalisation, the other a neurodivergent leader in inclusive practice in higher education – we wholeheartedly hope our university landscape considers this dedicated, often talented – and yet marginalised – community and explores where ‘AI in the loop’ can help deliver a more inclusive education.