Generative Artificial Intelligence: Schools Debate
I beg to move,
That this House has considered the use of generative artificial intelligence in schools.
It is a great pleasure to serve with you in the Chair, Sir Jeremy. You will not be presiding over a heated political debate this afternoon, and I hope this is a good opportunity to openly discuss the enormous change that is upon us. Throughout the education world, a lot of thinking is being done about artificial intelligence and its implications. Most of that thinking is being done by teachers, and I hope we will contribute to some of the wider system and policy questions today.
In a November 2024 omnibus survey, half of teachers said they had already used generative AI in their role; about a quarter said they had no plans to; and the rest either planned to or did not yet know. The most common uses of generative AI are creating resources, planning lessons and communicating with parents.
In the same survey, 6% of teachers said that pupils were permitted to use AI in the work they are set. It is hard to pinpoint an exact number, but it is fairly safe to say that the proportion of pupils actually using AI to some degree in their work is rather more than 6%. The survey data that we have, incomplete as it is, suggests that somewhere between one in three and three in four children are using AI to some degree, with some frequency, in their homework.
In this rapidly changing world, I commend the Department for Education for its guidance update in January and the materials that came out on 10 June, made by the Chiltern Learning Trust and the Chartered College of Teaching. Those materials are easily accessible, but that does not mean that masses of people have seen them. This is an area in which a lot of communicating remains to be done.
The DFE guidance talks about balancing the safety considerations of AI with the opportunities. Those are definitely two considerations, but they are not the only ones, nor the most obvious. We always have to remember that whatever pace we work at in this place, or Whitehall works at, kids will work at about six times that pace.
The four areas I will briefly cover today are two areas of opportunity and two areas of risk, where we need some caution. The areas of opportunity are workload and enhancing learning, particularly for children with special educational needs. The areas of risk are the need for discerning and careful use of generative AI by pupils and, finally, the impact on homework, assessment and exams.
Workload is a big issue for teachers. Historically, along with pupil behaviour, it has often been the No. 1 issue in teacher retention. In a 2019 survey, it was encouraging to see that the reported workload had reduced by about five hours a week, or an hour a day. However, workload remains a stubborn issue, and the biggest contributors have been planning and preparation, marking, and data entry and analysis. There is also communicating with parents, dealing with behaviour issues and so on, but those first three all lend themselves to artificial intelligence.
In particular, the creation of teaching materials seems to be an enormous area of opportunity. Too many teachers spend their Sunday evenings at home, trawling the internet for resources to use during the working week, when much of that work could be done for them. I commend Oak National Academy for its work on the AI learning assistant.
As the Education Committee, chaired by the hon. Member for Dulwich and West Norwood (Helen Hayes), discussed just this morning, the national curriculum is, of course, a framework. It is not a precise set of things that children will learn, because we need diversity in provision. We therefore need to think about how AI can support that diversity. I hope the Minister will give us an update on the content store announcement of August 2024.
AI has plenty of other potential uses, such as for timetabling, for letters and emails home—although my special request would be that we do not add to the volume of communications that have to be consumed—and for report writing. But we need a clear code of practice because, to keep the trust of parents, and indeed of pupils, there needs to be clarity about when and in what ways generative AI has been used. Care will still be needed. How does a teacher tell children that they must write their own work, if that teacher is generating text through a machine?
The second area of opportunity is in supporting learning. There is clearly a lot of potential for learning personalisation, especially for certain types of special educational need, alongside the role of assistive and adaptive technology. For some subjects, AI will also be useful for dynamic assessment. But schools will have the same issue with AI as with all educational technology, which is that they generally do not have much idea which bits of it are any good. There are both great products and pretty low-grade products, and discerning one from the other is notoriously difficult. As with edtech in general, but even more so with AI, product development cycles do not lend themselves to randomised controlled trials to try to establish what is good. I suggest that what schools need is an extension of the principle established with the LendED platform, made with the British Educational Suppliers Association: a sort of forum where teachers can see these products, sort the good from the bad and, crucially, get recommendations from other schools and teachers, because teachers tend to trust teachers.
With the input of biased data, large language models are liable to produce inappropriate, biased or factually incorrect outputs. Does the right hon. Member agree that if generative AI is to be rolled out to schools, actions such as introducing statutory regulations must be taken to limit any bias or misinformation taught to children and young people?
Funnily enough, I agree with the hon. Member, though not necessarily about statutory requirements. It is certainly true—in fact, he inadvertently leads me on to my next point—that we need to be careful and discerning in using these products. There are many risks, including the safeguarding risks inherent in technology, hallucinations, dud information and, as the hon. Member rightly says, biases.
There are some very direct and sharp risks to children. I am afraid that many misleading, unpleasant and cruel things can be done with AI. They can be done already but, as with so many other things, AI magnifies and turbocharges the problem. Some of those things can be done by adults to children; some of them are done by children to other children. We need to be very aware of those risks, some of which relate to existing practices and policy questions, such as how to deal with intimate image abuse and sexting. The problem further supports the case for a comprehensive school-day phone ban, to take cameras out of schools.
More generally, there is a need for media literacy and general discernment. I am reluctant and nervous to talk about media literacy, and more so about the phrase “critical thinking,” because it is too often conflated with the false dichotomy that occasionally comes up in the educational world: knowledge versus skills. Clearly, we need both in life, and we need to have developed both in school, but knowledge precedes skills because we can only think with what we know. However, it is really important in this context that children know how AI can make mistakes, and that they learn to look out for, and come to trust, the correct primary sources and trusted brands—trusted sources—rather than just stuff on the internet.
In the 2019 guidance on teaching online safety in schools, since updated, a fusion of the computing, relationships and citizenship curricula was envisaged. Children would be guided through how to evaluate what they see online, how to recognise techniques used for persuasion, and how to understand confirmation bias, as well as misinformation and disinformation. The new edition of “Keeping Children Safe in Education”, which came out yesterday, lists disinformation and misinformation as safeguarding concerns in their own right for the first time. The online safety guidance also included the importance of learning why people might try to bend the truth on the internet and pretend to be someone they are not. That was a start, but at this technological inflection point, it needs a huge scaling up.
The hon. Member for Mansfield (Steve Yemm) should not misunderstand me, as I am not against regulation. His points about data protection and privacy are really important, although they are probably too big to fold entirely into this debate. His first group of points and what the NSPCC talks about are the same risks that I am talking about.
There is an even broader point, as there is already a lot of blurring between fact, fiction and opinion online. There are all manner of news sources and influencers, network gaming, virtual reality and augmented reality, the metaverse—the whole concept of reality is a little hazier than it once was. With these machines, which in some cases almost seem to have a personality of their own, there is a danger of yet more blurring.
We all shout at our PCs sometimes. Indeed, adults using AI may start to give human form to the machine they are interacting with, which is called anthropomorphism; I occasionally try to be polite when I interact with one of these interfaces. Apps such as character.ai take that to another level.
We have to think about the impact on children in their most formative years—on their sense of self, their understanding of the world and their mental wellbeing. That includes the very youngest children, who will be growing up in a world of the internet of things and connected toys. It will be that much more important to draw a line between what is real, what is human, and what is not. In time, when the system has had enough time to think about it—we are not nearly there yet—that may be yet another area for regulation.
Finally, I come to the most immediate risks, around homework, assessments and exams. Colleagues may already have had a conversation in which a teacher has said, “Isn’t it brilliant how much so-and-so has improved? Oh, hang on—have they?” They now cannot be absolutely certain. There are AI detectors, but they are not perfect. They can produce false positives; in other words, they can accuse people of plagiarising with AI when they have not. In any event, there is an arms race between the AI machine and the AI detector machine, which is deeply unsatisfactory. Of course, that is where the teacher’s skill comes in, because there is always classwork to compare. Most importantly, there is always the exam itself, and we need to keep it that way.
The safest way to protect the integrity of exams is for them to be handwritten in exam conditions, with a teacher walking up and down between the desks—not quite for everybody, but for the vast majority of children, except where a special educational need or disability requires another arrangement. There are also subjects, such as art, design and technology and computer science, where it would not be appropriate.
There is already a big increase in access arrangements for exams. A particular type of adjustment, called a centre-delegated arrangement, does not need approval from the exam board, so no data on it is available. One such centre-delegated arrangement is to allow the child to use a keyboard—in the rubric it is called a word processor, which is a delightfully archaic term.
Where children are allowed to use a keyboard, spellcheck and AutoText are disabled to ensure safeguards are in place—but it is still true that most people can type faster than they can write, so there is a disparity between the two formats. The regulations require a school’s special educational needs co-ordinator to decide whether a child is able to use that facility, but they are still quite loose, in that they refer to the keyboard being the child’s
“normal way of working at school”.
I would love the Minister to say a word about that. The Department for Education should be clear that, where such arrangements are made, it should be because of a special educational need or disability.
One concern I am beginning to feel is that, while the technological development is important, an over-reliance on generative AI runs the risk of limiting open-mindedness, independent thinking, literacy and creative skills. Does the right hon. Member agree that we must protect key critical thinking and reasoning skills in children and young people, for their future and ours?
The hon. Gentleman makes his point lucidly and well, and I think it stands on its own feet.
The bigger issue with more children taking exams on a keyboard rather than on paper is that exam boards would like to move entire exams online for all children. In a sense, that would be better because it would be equal; there would be no disparity between the speed of writing and the speed of typing.
Some might ask what is wrong with that, as long as it is the same for everybody, and as long as the internet, spellcheck and autocorrect are disabled. I suggest there would still be multiple types of security risk in having exams done en masse online. There is also a wider problem: if a GCSE is done online, how will students do a mock GCSE? They will do it online. How will someone do a year 9 exam? Hon. Members can see where I am going with this. It cascades further and further down the age range, until eventually people question why they are learning to write with a pen at all. Some are already asking that question.
By the time my child is an adult, people will not even be using a keyboard, but other types of communication and interface, and this will seem very archaic. There are important advantages to learning to write by hand, however. Handwriting and writing are not the same thing. The way someone develops their handwriting, learning the strokes and patterns and how letters join together, is an important part of learning the skill of wider writing. There is also plenty of evidence that making marks on a page by hand aids visual memory. Handwriting helps us to understand things because, as we write, we synthesise what we are reading or hearing into our own words. There is even evidence to suggest that people do better in tests and have better recall as a result. Maintaining handwriting is therefore important in its own right, quite apart from maintaining the security and integrity of examinations.
DFE guidance states that teachers should keep up to date with this rapidly changing world. That is a tough ask. Over the months and years ahead, the Department will have to do a lot to provide teachers and school leaders with bite-sized, easily digestible chunks of information to keep them up to date with this rapidly changing area. A recent Ofsted report, on 27 June, said that there was not yet enough evidence to conclude what constitutes a good use of AI in schools, but that one common approach among schools that seemed to be using AI successfully was to have a champion who spreads good practice throughout the school. That seems to me a good approach.
Sarah Hannafin of the National Association of Head Teachers stated:
“The technology should be introduced gradually…to maximise its potential and mitigate the risks.”
That is an important point. Most immediately, I implore the Minister not to allow all exams to go digital en masse, except for certain subjects where that makes sense, and except, of course, for an individual child for whom that is the right thing because of their special educational need or disability.
I contend that there should be no rush to move to online exams. There might be opportunities, lower costs or easier administration involved, but there are certainly also risks, some of which are immediate and some of which would manifest only over time and might take us a long time to spot. If we do move en masse to online exams and away from pen on paper, I promise hon. Members that we would never go back. A cautious approach is what is required.