Assistive Technology, AI, and the Future of SEND: Opportunity or Oversight?
Artificial Intelligence (AI) and Assistive Technology (AT) are often described as the next frontier in education. From text-to-speech tools to adaptive platforms, the promise is clear: technology can remove barriers, personalise learning, and support independence. For learners with Special Educational Needs and Disabilities (SEND), this potential is especially powerful.
But as the pace of innovation accelerates, so too must our critical questions. Are we embedding these tools ethically, inclusively, and equitably? Or are we at risk of deepening the very gaps we aim to close?
The Longstanding Power of Assistive Tech
Assistive technology has long been a lifeline for students with SEND. Simple adjustments — such as screen readers, captioning, or communication devices — can not only transform a child’s access to learning but change their life. When thoughtfully embedded, AT promotes autonomy, reduces stigma, and supports students to engage on their own terms, both at school and beyond.
Yet too often, I see these tools remain underused or inconsistently applied. Many families and practitioners know of transformative technologies but face barriers of training, cost, or school leadership buy-in.
As one student explained: “I have a disability but I don’t tell most teachers because I don’t think they’d know how to help me.” This highlights the gap between available tools and the confidence of staff to use them.
AI Arrives in the Classroom
AI takes this further. Adaptive learning platforms promise to identify gaps in real time, providing tailored resources for individual learners. Predictive analytics can flag early warning signs of disengagement or wellbeing concerns. In theory, this represents the holy grail of personalised education.
But there are significant risks if we do not proceed carefully. Algorithms are only as good as the data that trains them. If SEND learners are underrepresented in the datasets driving these tools, AI may misinterpret their needs or reinforce stereotypes.
As one student put it: “People like me aren’t really seen as smart. Even when I do well, teachers act surprised.” This voice is a powerful reminder that bias exists not only in human systems, but also in the assumptions we build into technology.
Policy Context: SEND and Accountability
This debate is not happening in a vacuum. The Education Select Committee’s SEND review has made clear the systemic failings in provision, from patchy access to services to inequitable outcomes. At the same time, Ofsted’s evolving framework has placed inclusion at the heart of inspection. Leaders are now expected to demonstrate how their schools ensure equitable access, participation, and belonging for learners with SEND.
Against this backdrop, the arrival of AI and the expansion of AT represent both opportunity and risk. Leaders will be held accountable not only for whether these tools exist in their schools, but for whether they are truly reducing the SEND gap.
Ethical and Practical Considerations
So what does responsible use of AT and AI look like in practice?
- Equitable Access
Schools must guard against creating a two-tier system where only well-funded institutions can afford cutting-edge tools.
- Teacher Training
No technology succeeds in isolation. Teachers need high-quality CPD to integrate AT and AI into pedagogy.
- Student Voice and Co-Design
SEND learners must be part of the design and evaluation process. Too often, technology is ‘done to’ them, rather than developed with their input.
- Ethical Safeguards
Data privacy, bias in algorithms, and the risk of over-surveillance are critical concerns. Schools must ensure that tools support students rather than reduce them to data points.
A Leadership Challenge
For school and trust leaders, the task is both strategic and moral. AI and assistive technology should not be viewed as quick fixes, but as part of a broader culture of intentional inclusion. Leaders must be prepared to interrogate the ethics, invest in training, and engage with students and families to ensure these tools serve their intended purpose.
Conclusion
Assistive technology and AI hold extraordinary potential for SEND learners. Done well, they can empower students, close gaps, and support schools in meeting the high bar set by Ofsted’s focus on inclusion. Done poorly, they risk widening inequities and reinforcing barriers.
The question for leaders is not whether to adopt these tools, but how to do so with ethics, equity, and intentional inclusion at the centre. The future of SEND education will be shaped not by the technology itself, but by the choices we make about how we use it.