
The next gener-AI-tion

Our children are growing up in the age of AI. But how is this generative technology impacting their learning? We speak to some local schools and discover how they are utilising the positive potential AI can bring to education, while at the same time managing ethical issues.

How many times have you interacted with artificial intelligence today? It’s probably more than you think. Even if you’re not actively using AI tools in your professional or personal life, AI is being used to generate the adverts we see, provide customer service, conjure artistic works, assess risk in insurance applications… its capabilities are growing each week. AI is certainly nothing new, but the way we are using it has changed dramatically in the last two years. Especially in the education sector.

While traditional AI is best used for spotting patterns and problem solving (proving popular with companies handling data), generative AI (GenAI) can create new content – be that audio, visual or language. The use of GenAI exploded in 2022, when OpenAI launched an early version of the GenAI chatbot ChatGPT. Since then, anyone with an internet connection has had the power to generate any content they like using this technology.

According to the January 2024 report Generative AI in education: Educator and expert views by the Department for Education and The Open Innovation Team: “Teachers and experts acknowledge that GenAI could have a transformative impact on education. From helping teachers save time by automating tasks, to improving teaching effectiveness by personalising learning for students, there is significant potential for GenAI to benefit the sector. At the same time, there is considerable concern about the risks it presents, as well as scepticism about whether these can be mitigated.”

Research from the report shows that GenAI use among students and teachers has soared over the last year or so. A TeacherTapp survey cited in the report found that by November 2023, 42% of primary and secondary teachers had used GenAI in their role – up from 17% in April of that year. Pupils and students may be using GenAI even more than their teachers: 74% of 16- to 24-year-olds online in the UK have used a GenAI tool.

Meanwhile, in February 2024 The Guardian shared the results of a survey of more than 1,000 UK graduates conducted by the Higher Education Policy Institute. This found that 53% were using AI to generate material for work they would be marked on. One in four are using applications such as Google Bard or ChatGPT to suggest topics, and one in eight are using them to create content.
Bath’s pupils are preparing to enter a world of further learning and professional work that will, without doubt, involve interacting with AI on an increasingly regular basis. So, to ensure that they are aware of these tools’ positive potential (boosting efficiency and supporting pupils with special educational needs), as well as the ethical and intellectual risks they pose (such as factual inaccuracies, plagiarism and an overreliance on technology), schools have an important role to play.

That’s why we asked our local educational institutions three key questions to see how GenAI is impacting their curriculum.

The big AI questions:

How is AI software being incorporated into your curriculum for creative and problem-solving purposes?

Does your school offer any AI training for students, and is there information available for parents?

Is there a system to monitor student use of AI in submitted work – and to safeguard against its misuse?

Kingswood School
Dr Rachel McIlwaine, Deputy Head Academic

• With the boom in AI, we are keen to harness its exceptional power, yet also educate our pupils about possible pitfalls and dangers. We use an excellent AI tool in Maths, which is able to tailor the difficulty of the question to the user, and we also use AI in our Middle School curriculum in Computer Science, Global Goals and more to develop pupils’ learning. We educate our pupils on the ethics of AI through our Kingswood for Life programme, which we have adapted to reflect this new technology.

• We ran a very popular parent engagement evening last half term in which we discussed the origins of Large Language Models (LLMs), a type of programme that can recognise and generate text, and ChatGPT, a chatbot launched in November 2022. We were fortunate to have some very qualified parents in the audience who understand this world more than we do, and have since involved them in our AI strategy and development.

• We ensure that while pupils are able to harness the power of AI they understand the implications for plagiarism. Through assemblies and our Kingswood for Life lessons, pupils are educated about the importance of referencing and checking the use of AI in coursework and homework, and we have a staff and pupil policy on the use of AI in assessment.

Millfield School
Gary Henderson, Director of IT

• AI software is used in the curriculum at Millfield for problem solving, ideation, summarisation, translation and recommendation – just some of the benefits AI can provide.
There are logical points for including AI in the Computer Science curriculum and in lessons on research skills, where we can teach students both about AI and how to use AI tools. We also look to include it across all subjects, given that it has the potential to impact all areas of knowledge.

• Students are already using Generative AI tools independently, and so it is important to engage with students about the appropriate use of these, along with having discussions around the risks and challenges. We want students to benefit from the potential of AI tools, but equally they must be aware of the risks and constraints such as those related to coursework, where the use of AI means the work has not been produced independently. At the same time we need to provide information to parents around what is acceptable, balancing the benefits and the risks.

• When it comes to monitoring student use of AI in submitted work, AI detection platforms exist, but these have not proved reliable. As such, the key is building awareness with students of what is acceptable and what is not. We also foster high-quality teacher–student relationships and rely on teachers’ professional judgement to help identify where misuse may have occurred.

Royal High School
James Moyle, Assistant Head, Curriculum

• Our ongoing commitment is to support students in acquiring practical skills and to educate them about the advantages of AI, which helps to foster the growth of their critical thinking abilities.

• We hold assemblies and workshops for students to encourage academic honesty and appropriate use of AI. The use of AI has also been embedded in our teaching. In Year 8 Computer Science, for example, we have created a new teaching module about AI, where our students learn about chatbots, how to spot bias in AI output and how to critically assess the quality of what it produces.

• We currently use systems (Turnitin and GPTZero) to check whether essays that have been handed in are likely to have been created by generative AI. At the moment, our teaching staff are still the best detectors, but these technologies help teachers with this process.

Beechen Cliff School
Tim Markall, Headmaster

• We are exploring how AI can be used to support our processes and teaching and learning. Over time we expect this to develop, though for us it is still relatively early days.

• We have safeguards in place to reduce the risk of AI misuse, including the blocking of chatbots and AI tools on school systems to protect the integrity of assessment.

• Students completing coursework are provided with information around Exam Board Regulations on the use of AI, and JCQ (Joint Council for Qualifications) guidelines are provided to students and parents. We have updated systems around the checking of NEA (Non Exam Assessment) work to safeguard against misuse of AI.

Prior Park College
Chris Gamble, Deputy Head Academic

• At Prior Park, we nurture AI literacy and creativity. As part of our Computer Science programme, all Year 7–9 students delve into AI fundamentals, distinguishing supervised and unsupervised machine learning, grasping neural network concepts and pondering the bigger questions of consciousness and sentience. We also encourage creative exploration of generative AI – the students learn about the architecture of commercial AI systems by building simple neural networks of their own, and are taught how to use industry-standard AIs like GPT-4 in the most effective ways.

• Running through all this, though, we prioritise ethical AI use, instilling responsible practices and a good sense of what AI can, and cannot, do, so our students never see it as a substitute for their own spark and creativity.

• While systems like Turnitin claim to ‘detect’ AI in students’ work, at Prior we find it’s far better to rely on those personal relationships between students and teachers, who have an intimate – and often uncanny – understanding of their students’ work and writing styles. Interestingly, with that approach and our focus on ethical, positive use of AI, we see very few issues with students looking to misuse the technology.

Stonar School
James Cole, Head of Computer Science

Stonar School in Wiltshire is challenging the misconceptions surrounding artificial intelligence (AI) through its participation in the innovative hi!ai project, offering pupils a deeper understanding and practical experience of this transformative technology. This ambitious programme, involving Stonar and 11 sister schools in the Globeducate group, is engaging pupils of all ages (Reception to Upper Sixth) in a collaborative journey to explore AI research, design and engineering, with a final goal of creating their own AI ‘brain’.

The project’s vision is to see pupils from across the globe co-create and innovate using AI. They will use their creations to bring historical figures to life in a virtual AI studio, showcasing the potential of this technology.

“It’s really exciting to be part of this project because we are recognising the importance of AI developments and how they could impact our education and our future,” says Charlotte, a Stonar pupil participating in the project. “AI has become a hot topic, but it is often shrouded in misconceptions. It can’t just be seen as a tool for homework – it’s a constantly evolving field that we need to learn to integrate into our lives.”

The project is now fully underway, with pupils chosen for working groups dedicated to various areas including technology, ethics and design. Enthusiasm around the school has been particularly high following a recent test print on the 3D printer. James Cole, Stonar’s Head of Computer Science, emphasises the unique opportunity hi!ai presents for Stonar pupils, saying: “The project provides pupils with a valuable foundation in AI technology, fostering critical thinking and problem-solving skills crucial for success in the ever-evolving job market.

“AI is undoubtedly coming to education, however, this project serves as a prime example of how educational institutions can bridge the gap between the hype and reality of AI. By fostering a culture of exploration and understanding, hi!ai empowers pupils to become responsible and informed citizens in a world increasingly shaped by artificial intelligence.

“We hope that all our pupils involved will become our digital leaders of the future, guiding the school on further AI use in teaching and learning.”