
Let’s Get Critical – Teaching Students and Educators to Challenge AI

This is a recording of the EdTech Innovate session, "Let’s Get Critical – Teaching Students and Educators to Challenge AI", which took place at the Schools & Academies Show London on 1st May 2024. 

In this session, our speakers discuss the responsible and ethical deployment of AI in education, focusing on striking a balance between harnessing technology's power and preparing students for the future. They highlight the crucial role teachers play in helping students effectively utilise AI, as well as the need for teacher training and age-appropriate exposure. The speakers also emphasise the limitations of current policy and infrastructure, along with the need for good role models and teaching methods to engage students with AI.

The discussion was focused on the following points:

  • With the influx of AI in education, how can we empower staff and pupils to critically question and engage with the information presented by AI?
  • Guiding educators on how to integrate AI literacy into their curriculum
  • Recognising the importance of teaching staff and students to critically engage with AI technologies
  • Teaching students to approach AI with a critical and informed mindset

Panellists include:

  • Shahneila Saeed, Head of Education, ukie.org.uk
  • Chris Goodall, Head of Digital Education, Bourne Education Trust
  • Sir Anthony Seldon, Head Teacher, Epsom College

You can watch the panel discussion below for free.

 


 

Transcript

Shahneila Saeed  00:52
Thank you. I would like us to have a wonderful discussion, so I'd like to introduce our two speakers: first, Chris Goodall, Head of Digital Education for Bourne Education Trust, and Sir Anthony Seldon, Head of Epsom College and co-founder of the AI in Education initiative. Thank you very much for joining me today on this stage. Artificial intelligence has already reared its head as a point for discussion in conversations earlier this morning, and I'm sure it will be mentioned again in some of the later panel discussions. So I think it's very apt that you're both here to talk to us about your work and for us to be able to facilitate this discussion on AI. If it's okay with you both, I'm going to dive straight in with some direct questions. My first question, Chris, I'd like to ask you: in your opinion, what are the most significant changes you've observed in schools since the integration of AI technologies, especially within your trust?

Chris Goodall  02:44
I guess the first thing I've noticed is how wide the divide is across the whole country, and in fact across the world. We have got schools that are absolutely flying ahead with this, and some would argue too fast, introducing things that perhaps they're not thinking about clearly, all the way to the other end, which is people who are banning it, trying to block it and keep it out of schools. And that divide is only going to get bigger and bigger. Part of the reason for that is obviously there's not a lot of training out there, so it tends to happen at random. If you're lucky enough to have someone in school who is quite passionate about tech, who also has some leadership clout, and who can talk the language of humans rather than tech, if you've got those three things, you tend to have someone who will drive things forward. But that very much happens at random; not all schools are lucky enough to have that. When ChatGPT came around, we trained all our staff. I picked it up in December 2022, I think. I thought it would be really useful for me as a teacher, so I used it in my own teaching, and in the first week back after Christmas I thought, this is going to be really helpful for teachers. So we trained our staff and our TAs really early in that school. I'm not at that school now; I've trained staff across the trust. But what we're seeing is not embedded use, because I don't believe we've had long enough to embed it at this stage. After a year and a bit of teachers and staff using it, AI actually becomes invisible. People often ask, "Oh, can we come and see what you're doing?" You're not going to see anything. It's actually helping with the mundane, the bits that teachers are really struggling with: making lessons more engaging, differentiating for students, workload; it can help with all of that. So it's really invisible, embedded use, as opposed to being something much more obvious in the school.

Shahneila Saeed  04:35
That's fascinating, because I think sometimes you expect to see something quite glitzy and buzzy, but what you've mentioned there about it actually becoming quite invisible is very interesting. And maybe that's where this is going. There are a lot of points there which, Sir Anthony, I'd like to pick up with you next. What skills do you think are essential for students to have in an AI-augmented future? And in your opinion, how can educators help students acquire these skills?

Sir Anthony Seldon  05:02
Can I just ask a question: who here, sitting down or standing, thinks that AI is a really big deal in education? And does anyone here think that it is simply the biggest thing in education right now? Hands up. Okay, what's bigger?

Audience Member 1  05:14
Teaching. 

Sir Anthony Seldon  05:20
Well, okay. So teaching. So we all think it's a very big thing. Who here is part of a school or a trust where you're confident that you are on top of it, and you feel that you know what you need to know about it? Who here feels confident about their school or trust? Okay. So I think, depending on how we measure it, teaching is obviously bigger, and teachers and students are obviously bigger than AI, but it is the biggest new thing. I mean, teachers have been around for a long time. This is a massive thing that will transform every aspect of education. So what are the skills that we need to have? And by the way, it's slightly frightening if you do think, as you do, that your schools and multi-academy trusts are not sufficiently on top of it, because if you're not, it's going to be harder and harder to catch up. So this is why we founded something called AI in Education, which is a charity. It's a non-profit body of the teaching profession: by teachers, for teachers, for students and for others. We are there because pre-AI technology has eaten schools and teachers and kids for lunch, and we believe we have to get ahead of AI and shape it, so it's in the interests of young people, of learners everywhere, in the interests of schools, and especially the most disadvantaged. And what are the skills that we need? Human skills, for goodness' sake. Human skills are not taught sufficiently in schools at the moment. We value people purely, or almost purely, by their ability to do well in exams, and it fails a third of young people, as you know, a third of those who are already most disadvantaged. We need to prioritise human skills, and we need to find out what each and every young person can do, what they are passionate about, what they love, what they care about. And we need to put children at the very heart, because at the moment they are being measured against a totally 20th-century system of examinations, and it's doing their heads in. It's making them mentally unwell. And you know what? These are precisely the skills that the algorithms will always be better at, and we are not teaching the things that the algorithms, the AI, will never be better at, which is being human beings. We've forgotten that; we don't do enough about it. So, human skills, in answer to the question. Thank you.


Shahneila Saeed  08:22
Thank you. Thank you. Absolutely; I don't think anyone here will disagree with you. And actually, we have another panel this afternoon which will be diving deeper into that topic around assessment, which ties into what you alluded to there about the changes we've seen in assessment. I really love what you said about getting ahead of the curve with AI. But teachers are an important part of that equation, Chris, and I think teacher training is inevitable. What kind of training do you think teachers need to effectively guide students in the use of AI technologies? And how do you propose this training should be implemented, especially across your schools?

Chris Goodall  08:22
I think this is one that people really struggle with, and there's a bit of an industry emerging of training courses and paid consultants. And actually, for me, and again, I'm not an expert, I'm a guy who's implemented it in schools and across a trust, so I'm learning as much as anybody else. But what we've tended to focus on is a core large language model. Interestingly enough, because everything moves quite quickly, when we first started we were training on ChatGPT, the free version, but that is now the least powerful version of AI you can get, so we've had to adjust our training process; still, the focus is a core large language model. What we're also seeing, again as the industry develops, is that there are loads of what we call wrapper apps out there, being sold into schools, of all sorts of natures. Some of them are useful, some of them not so useful. But in terms of critically evaluating these things, through playing with a large language model I'm in a much better position to evaluate those apps, because I know that some of them are just a pretty skin over the top; I know that from knowing what I can do with a large language model myself. So we focus our training very much on large language models, not the wrapper apps. People will use whatever they're using, but we want to help them use it more critically. In terms of the training sessions, I do a one-off AI awareness presentation to the schools, which gives them a very quick blast of use cases, some ethics, some data protection issues, and generally a picture of what's going on out there at the moment. I then follow that up, not me actually, I get the schools to come up with a problem. It could be a problem the school is actually facing, not an AI problem, such as wanting to adapt to a more inquiry-based curriculum, or wanting to develop writing across the curriculum. We show them, right at the start, a very quick process for how they can use an LLM to solve that school problem, and then the teachers go off and continue to solve that problem in the training session. They learn by doing, and that's the most important thing: giving teachers time to actually do and learn how it works.

Sir Anthony Seldon  08:22
So look at the AI in Education website, and you get Chris free. What I didn't like about just talking is that it didn't get much interaction, any interaction at all. But Chris free cannot be a bad thing. We are there, AI in Education. It's of the profession, for the profession, by the profession, in the interests of all learners, especially the most vulnerable. It came out of a book I wrote eight years ago called The Fourth Education Revolution, looking at how AI was going to totally transform education, as it now is doing. We want to help. We want you to be part of it. Join us. Let's bend the technologies as they come on stream in the interests of real education and helping young people learn. Was that the question?

Shahneila Saeed  12:18
I hadn't asked you a question yet, but I love that. Actually, I'm interested, because I know you have close ties with both industry and government; you work very closely there. With Chris mentioning teachers and time, I think that's an age-old equation that has never quite been resolved; maybe AI can solve it. But when we think about things like that, policy is so important. So how do you see policy evolving to keep up with the deployment of AI in schools? And are there any ethical considerations that we should be thinking about?

Sir Anthony Seldon  12:52
There's so much to say, so let me just make two points. One is that we cannot look to government. All of us, the hundred-plus people in this space and everywhere, we cannot look to government. I mean, there's now a minister, Baroness Barran; she gets it. There are some great officials in the DfE; they get it. It's taken years of me and other people pushing them and saying, "this massive thing is coming down the track at you; get ahead of it, don't wait." We cannot wait for government. We cannot wait for parliament to come up with safety regulation. We certainly can't wait for the tech companies, the bleeding-heart tech companies, who only want our money and do not care how they ravage the mental health of our young people and degrade and devalue the teaching profession and all that is wholesome and best about it. We have to get ahead of it. So I don't care about policy; I care about us getting ahead. This is about making certain we lead in the most important new area. The other point, to answer the question about ethical considerations: who here has come across Jonathan Haidt and his attempt to ban these devices? Who's come across this? So Jonathan Haidt, I mean, he's making many incredible points about the damage done to young people, particularly at puberty, from having these machines too early, the way it's damaged their development of social interactions at key ages, the way it's depleted family conversation, and the way it's taken them away from play. And my goodness, that's so important, because play gives them danger. Young people need challenge and risk and a sense that they can negotiate themselves through it. I think he blames too much on the technology. But who thinks that we can ban these things, and who thinks that we can really work them in our favour? In other words, who thinks this is like cigarettes or illegal drugs, which are never going to do anything other than screw people up and damage their lungs? And who thinks it's like alcohol, which we can use, but we've got to use carefully? Who here thinks that these things are like cigarettes and illegal drugs, ban them altogether? Good, that's a real show of hands. And who thinks that we have to learn how to bend them in the interest of learning? So, Jonathan Haidt: very big with the right in America, and not just the right, very big with a lot of groups. So there we are. Stop.

Shahneila Saeed  13:09
No, thank you. And I think it's about bending these to our will and teaching students how to embrace that. It's all about striking that balance, potentially.

Sir Anthony Seldon  16:11
And Chris is free. Who here would like to have a bit of free Chris? Anyone? I counted three hands.

Chris Goodall  16:23
We've got to clarify the definition of "who wants a bit of free Chris?".

Sir Anthony Seldon  16:28
He's amazing!

Shahneila Saeed  16:29
Inquiries at the end of the discussion, at the front of the stage. No, thank you. There's a lot there when we start to talk about harnessing the power of the technology; that's something that's not going to happen overnight, it is going to take time. So I guess the next question is quite apt. This is for both of you, but Chris, coming to you first: how do you envision the role of AI in education evolving over the next, say, 10 years or so? And what steps can educators take now to prepare for these changes?

Chris Goodall  17:02
Well, I think, I mean, that's a massive question. I don't think anybody knows the answer to it, so anybody that says they know how this is going to play out, don't listen to them. But I think what's clear is this isn't actually AI in education, it's AI in society. This is affecting every inch of our society. When Covid hit, which did affect every aspect of society, we didn't just put on one assembly or produce a six-week course in Year 7 around it, or put it in our PSHE curriculum; we were talking about it constantly. And nobody knows the future. I view it myself like being on a staircase: I can't see the steps above me, I can only see the steps below me, and I'm trying to help as many people in our schools as possible come up to the step that I'm at, which is the one I can see, and then I look up and wait for the next one to appear, so that we're not in a position where suddenly those steps are up in the clouds and we've left people at the bottom. So I view it a bit like that. We basically need to get everybody using it, so that we aren't left behind, and I think that's really important. Longer term, if you think about where we may be in the future, I'm actually quite positive about AI. I think it can bring a lot of personalisation, but I wonder if it's going to go a bit like television. You had the BBC, which used to be centralised control: educate, inform and engage, done by a centralised organisation, where you had to watch at a certain time in a certain place and you were told what to watch. We've now got Netflix, we've got BBC iPlayer, and people choose what they want to watch, when and where they watch it, and how they want to watch it. I wonder if education is going to go the same way, and we'll be left with the equivalent, maybe Edflix, possibly: learning on demand, basically.

Shahneila Saeed  18:45
Learning on demand. Interesting future. Sir Anthony, same question to you.

Sir Anthony Seldon  18:50
So we don't know, as Chris says, but we know it's going to be very big, and we can't get hexed by the fact that it's going to be so big and so extensive. How long before we're looking at screens on our glasses, talking, and new material is coming up on our glasses? Who thinks that's never going to happen? Hands up. Who thinks it will happen within five years? In five years? Ten years? Everything is going to change. Why does anyone need to go to school when they can do their maths, they can do their English, they can do their lessons, and we have to rethink what schools are about. What do people need? They need human intelligence, social relationships. We go crazy if we don't have them; it's why people are put in solitary confinement, and we put our children in solitary confinement through the pre-AI addictive nature of this stuff. You wait until the AI technology starts messing with their brains, and they'll never be able to get off it, unless we get ahead. So we can't think, "Oh my God, this is so big, I can't get involved with it, I don't want to know." We have to get involved and we have to shape it, because it's going to be so vast. As Stephen Hawking said shortly before he died, it will either be the best or the worst thing that's ever hit humanity, which leaves a bit of space in between. It's us. It's only us: not government, not Labour, assuming Labour win, which they will, not parliament, not the tech companies. It's us. It's us. And you know, there are lots of great things out there. AI in Education, by the way, has the chairs of all the exam boards on it, a lot of great scientists, psychologists, thinkers, all with a common end: to try and make this work in the interests of all young people and to help us through the damage that's already been done.

Shahneila Saeed  21:00
We absolutely have to make this work. I have more questions, but I'm wondering, actually, whether there might be some questions from the audience, given the nature of this topic. Are there any questions that anyone would like to ask? We have one up at the front. Is there a roaming mic that we can use?

Sir Anthony Seldon  21:28
Someone wants to know, "Is Chris really free?" Yes, yes, he is! On the AI in Education website.

Audience Member 2  21:42
That's absolutely fascinating. I've just got a question around policy and what we can really offer in schools currently. For me, there's a disjunct between the construct that we have to work within and the infrastructure around Progress 8, the EBacc, Ofsted, all of those limiting factors, in terms of what we can truly do to innovate and cultivate a climate where we are truly enabling our children to be at the forefront and to be really ready for this. Because I would completely agree, I think we've got to focus on those human skills, but equally, leading a multi-academy trust within an infrastructure that is policy-driven means that we can only do so much, and my frustration is that if that doesn't lift, we can't really get to the heart of these issues. So I would suggest that policy is where we are considerably limited currently. That's just my opinion.

Sir Anthony Seldon  22:44
So that's not policy facilitating, it's policy limiting, and it's based on a lack of respect, a lack of trust for schools, which means we have to have these regimes which are very narrow. I mean, the word education, as you all know, means drawing out. You don't find education secretaries who have any understanding of that; to them it is all about exams and accountability measures. It never starts with these incredible young people, with their multiple talents, who are so often dented and crushed. So when will that change? When will that moment of sanity come? I don't know, 10 to 15 years. Until then, we have to do our best. We can only work in the frameworks that we have; otherwise, ultimately, we're out of jobs, and if we go because we are kicking too hard against the system, that's going to let everybody else down. So we work within it to do what we can. I totally accept that.

Shahneila Saeed  23:47
Thank you. And we've got another question here. Well, there are several questions, but we'll come to this side first.

Audience Member 3  23:52
This question is for Chris. In terms of your upskilling in AI, did you do that yourself? Did you come from a digital or techie sort of background?

Chris Goodall  24:04
Yeah, no, I'm not a techie. I'm a teacher; I've been teaching for 20 years on and off, but I've always had an interest in technology. So in the schools that I've been at, I've seen how technology can make things more efficient and, really importantly, in my teaching, make things engaging. I'm not a fan of using technology for technology's sake; it always has to solve a problem for me. I like the technology to be really invisible but do the things that I know work as a teacher, and looking at evidence-based practice, we know some of the things that work, so it should support that. But I guess where I've sold myself, and I've been lucky enough to sell myself, I'm free, as Anthony says, but I don't mean sell myself in that way, is that I view myself as a bridge. I understand enough tech to be able to talk to people who don't, and bridge that divide. And I think that's really important, because some people want to go too far into the tech. You go on some of the websites doing AI courses and they start off with Python, and I'm like, "teachers don't need to know that". They just need to know what it is, how it works, and how they can get something out of it that can help them. And you can do that fairly quickly.

Shahneila Saeed  25:15
We've got time for one or two more questions. 

Audience Member 4  25:24
Can I ask where you see the curriculum opportunities to give the students a chance to engage with AI? Because since they messed around with the computing curriculum and made it all about programming and computer science rather than computing more generally, it seems to me there are remarkably few opportunities in the curriculum for student engagement.

Sir Anthony Seldon  25:47
Do you want to just get one or two other questions, and we can answer them all together?

Shahneila Saeed  25:50
Yes, we can do that. So there's a suggestion here: let's take one or two more questions, and then we can give a collective response to them.

Sir Anthony Seldon  25:57
Someone at the back had their hand up.

Audience Member 5  26:05
Bandura's experiments, from some time ago now, showed that we learn from role models. So why is it that with EdTech and AI we now place so much emphasis on what's wrong and what's bad for you, rather than looking at what a good role model is and a good way of actually teaching it to our students, so it doesn't get swept under the carpet into some underground world where they're finding things out for themselves and maybe getting the wrong end of the stick?

Shahneila Saeed  26:39
Would you like to address those questions collectively?

Sir Anthony Seldon  26:41
So many people here want to ask.

Audience Member 6  26:49
Very similar to the curriculum question that you had, but at what point do you think we should be exposing students to this? I'm in primary, and I'm looking at our computing curriculum at the moment. So at what age, or at what time, is it appropriate for them to be exposed to it? And how?

Chris Goodall  27:06
Shall I take that one? I can only go from what we've done. We've focused very much on teachers and training staff; we haven't gone there with students yet, and we're about to do some small trials with students. The first thing to pay attention to is the ages of consent: you need to make sure you're following those. But I don't think exposing students to it means that they've got to be sitting there using it in their hands. There's a lot you can get without that. When I was using it in the classroom, I was having constant back-and-forth conversations with the kids about it. They're all using it, so they don't necessarily need to use it in the classroom; they're all using it at home. So get teachers using it, and then get those conversations going, constant conversations. And I don't think it should just be in the computing curriculum. It's not computing; it's AI in society. Every subject and every subject teacher should be using it to support learning, and talking to the kids about it, because the kids are using it. They're handing in homework done with it. We've got to just keep talking. When we had Covid, we talked about it with everybody we met; we banged on to everybody about their experience of Covid. This is bigger than Covid in society, yet we only seem to want to talk about it in the computing curriculum or in school. It's got to be taught and spoken about everywhere; it's a societal thing.

Sir Anthony Seldon  28:20
So, the formation of human relationships. If those early-years relationships, those early attachments, don't form, because the parents are just off on their own gadgets and the children have got the gadgets too, and if we don't have those relationships in primary school between the children and the teachers, they are foundational for life. We know that our early relationships model our future relationships for life. So I think we have to be very careful about the way that we introduce this, and we always have to put the child at the heart. The test is: is it in the interest of the child? Is it deepening the child's humanity, understanding, curriculum knowledge, or is it not? And we have to work together to develop these tests and then apply them. And it works very differently in different subjects. In science, it gives incredible abilities to experiment with and see scientific ideas in ways that you never had when you were at school. We have to stress the positives. There are a lot of very doom-mongering people out there who are not seeing the positive, and we have to. We have to shape it. Because if we don't shape it, as Jonathan Haidt said, this guy who is trying to ban phones, for very good motivations, maybe not the right solution, AI is going to be geometrically many times bigger and potentially more harmful. To get back to the first point, we are on the cusp of something so enormous. We have to grasp it. We have to run towards the fire, as every great teacher does, not run away from it. This is the fire. We can do it together.

Shahneila Saeed  30:27
And on that amazing note, I'm going to have to wrap this session up, which just goes to show, I think we could so easily have added another half an hour to this. Thank you both so much for such an amazing discussion.

Sir Anthony Seldon  30:39
Thanks for coming, everybody, and good luck.

Shahneila Saeed  30:41
Thank you. Thank you very much.