For the final TeachMeet of the year, it felt important to discuss Artificial Intelligence, not only for the impact it will have on education (and is already having), but also for the impact it could have on the many disciplines and careers that will ultimately affect our students. This time, I took a slightly different approach: I ran some ‘provocations’ via Mentimeter, then took staff through some of the research, as well as sharing some practical tips for using AI. As always, I want to give full credit where it is due. Most of the research and ideas have come from Daisy Christodoulou’s book ‘Teachers vs Tech’ and from the AI in Education Conference organised by Anthony Seldon at Epsom College in May. For those of you who would like to watch the TeachMeet back, here is the recording: AI in Education – Friend or Foe (Recording).mp4
Provocations:
It was positive to see that no one was looking for a career change yet, and it is interesting that assessment was viewed as the area most likely to be impacted. This probably comes from the fact that tools like ChatGPT have already started causing issues for coursework and homework. But rethinking how we assess our students isn’t necessarily a bad thing.
Although the awareness of AI is there, given how ‘time poor’ the teaching profession is, it’s probably not surprising that most have not yet had time to use AI in the classroom (or have not necessarily realised that they already are). One of the main goals of the TeachMeet, and this blog, is to help equip teachers to use AI well in the classroom.
Concerns
To start, we can’t ignore the elephant in the room: for some, AI seems concerning. For us as teachers, the main concerns are probably around ‘plagiarism and fake news’, regulation and, pedagogically speaking, the potential loss of knowledge.
Fake information is nothing new, and neither is AI. However, the past few months have seen a huge increase in awareness of AI, and with it concern about how quickly it is growing. This is mostly down to the easy (and mostly free) access to ChatGPT. For those of you who haven’t heard of ChatGPT, it is an AI chatbot built on a language model that generates human-like text and responses. For this reason, plagiarism is one concern, and JCQ have recently issued guidance on expectations around its use. They don’t expect it to be banned (and I’ll go into why that’s a good thing later), but they do expect it to be referenced. However, with the battle of the AI plagiarism checkers (where AI generators claim they can’t be traced and new checkers claim they can), this is still a difficult field to navigate.
Another reason the ‘fake information’ stemming from ChatGPT is concerning is how convincing it can be. It’s important to pause here and explain exactly what ChatGPT is. GPT stands for ‘generative pre-trained transformer’, which is essentially a type of large language model that makes predictions using huge data sets. This can lead to a phenomenon called ‘hallucination’: the model generates responses that sound plausible but are factually incorrect, unreliable or misleading. These outputs often emerge from the model’s inherent biases, lack of real-world understanding, or training data limitations (for example, ChatGPT has a knowledge cutoff date of September 2021). But as a generative language model, it will give an answer to pretty much anything you ask, which means it is very good at being ‘confidently wrong’. This is particularly concerning because, although AI models may be correct 90% of the time, if a student sees and believes the 10% that is wrong, this leads to misconceptions in their learning, which of course is incredibly dangerous. It also exposes students to the biases of these models.
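For the more technically curious, here is a minimal sketch of what is going on under the hood when you type into ChatGPT. It uses the openai Python library as it existed in 2023, with a placeholder API key and a made-up example prompt (both are assumptions for illustration, not a recommendation); the point is simply that the model returns the statistically most plausible continuation of your prompt, whether or not it is true, so every answer still needs checking.

```python
# Minimal sketch of querying the model behind ChatGPT, using the openai
# Python library's pre-1.0 interface (as available in 2023).
# The API key and the prompt are placeholders for illustration only.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; never hard-code real keys

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user",
         "content": "Give me three academic references on working memory."}
    ],
)

# The model predicts the most plausible next words, so the 'references'
# it returns may be fluent, convincing and entirely invented - this is
# the 'hallucination' problem described above.
print(response["choices"][0]["message"]["content"])
```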
At the moment, there is little regulation around AI. Again, we are seeing concerning headlines, including Sam Altman, the chief executive of OpenAI (the company behind ChatGPT), admitting that AI could cause ‘significant harm to the world’. In March, the Department for Education released a document, ‘Generative Artificial Intelligence in education’, which was mostly positive about its use. One particularly important area to note within the Government document is around GDPR: personal and sensitive data should not be entered into generative AI tools, to protect privacy. This seems sensible; however, there are certain platforms out there which advocate using AI to write reports, which could encourage people to enter personal data and thus potentially breach this. It is important that schools give staff guidance about its appropriate use.
The final concern, which I also hope to counter with ‘benefits’, is the possible loss of knowledge and the risk of widening inequality. This stems from issues already raised, such as ‘hallucinations’, but also from digital literacy. We saw with COVID how differently online learning affected different areas of the UK alone. Pre-2020 we never would’ve thought that ‘wifi poverty’ was a ‘thing’, whereas now we need to think not only about this but also about the impact of AI literacy.
Research shows a correlation between literacy rates and GDP (a measure of the size and health of a country’s economy), so we know that literacy affects economic success. As we move forward, the same is likely to be true for AI literacy. By this I don’t mean just having access to and using AI; I mean the ability to understand when AI is correct and when it is not, and having the wider context and understanding that can then be applied to AI. This idea of context is something that Daisy Christodoulou looks at in depth in her book Teachers vs Tech, and it brings me neatly on to the research behind embedding technology effectively, and thus some benefits of AI.
Benefits
So the first bit of good news: we will not be replaced by robots or ChatGPT! Panic over. One of the reasons for this is the ‘novice to expert’ dilemma. Just as teachers weren’t made redundant when Google first appeared and people assumed students could ‘just Google it’, AI (and Google) needs to be used by those who already have a certain level of understanding of the topic they are using it for. In other words, you cannot be a novice in the subject matter you are applying it to. Students who try to use it as a shortcut will not really learn anything and will inevitably get caught out, as they won’t have the base knowledge to use it or to spot when it is wrong (AI being very good at being ‘confidently wrong’, or hallucinating). Therefore we as teachers are actually better equipped to use it and to verify what it is putting out there.
So firstly, I’m going to explain some of the key ideas from cognitive science that Daisy examines in her book, to better explain why AI will not replace educators. It’s important to remember that cognitive science underpins AI. If we are impressed by how these tools work, it’s because they have ‘learnt’ in ways modelled on how we do. They are quicker and have a lot more data stored, but let’s not forget that the reason they learn at all is that there is a cognitive architecture common to humans, and they replicate it. Humans still need to learn in traditional ways, and the reasons why are rooted in cognitive science. AI cannot be a shortcut for our own learning, only something supplementary.
In the book, Daisy explains how our working memory limits our ability to learn. Learning is like building blocks: to learn effectively, we need certain information already in place (stored) in our long term memory before we can build on it and learn more. If we try to learn too much at once, we are overloaded and nothing really stays in our long term memory. The classic ‘magic number’ for how much we can hold in working memory is seven items (plus or minus two), but this changes when we already have related information rooted in our long term memory.
Take these 18 letters (you have 5 seconds to memorise them all):
SBB CNH SGM TRA FIT VFA
How many?
Now, you have 5 seconds to memorise the following 18:
S BBC NHS GMT RAF ITV FA
How did you do?
They are exactly the same letters; however, the second grouping (after the initial S) matches acronyms familiar to most British people: BBC, NHS, GMT, RAF, ITV and FA. We therefore remember more, because we are now using our long term memory to simplify the task.
Another example of why having information already stored in our long term memory matters is context. We need context to understand, so again, there is certain knowledge that we simply need in our long term memory. For example, the word ‘bank’ and its context:
Bob stood in the lush meadows and listened to the sound of the boat oars. He could see John was standing by the bank.
OR
Bob stood outside the train station and watched the pinstriped financers rushing by. He could see John was standing by the bank.
The last sentence in each description is exactly the same, ‘He could see John was standing by the bank.’, but because of the context of the prior sentence, we picture two different images in our minds.
A final example of why we need certain knowledge in our long term memory is this very common scenario: who has ever asked a student to improve their writing by using more complex vocabulary? Take the sentence “I am a good footballer”. Have you ever had results like “I am a congenial footballer”? Congenial means pleasing or pleasant, so you can see the similarity to ‘good’, but the sentence makes no sense.
Background knowledge in our long term memory or prior knowledge helps explain the difference between novices and experts. If AI is used as a shortcut in knowledge acquisition, students will not learn anything.
The good news is that any novice can become an expert, but only by developing these knowledge structures in the right way, and the way we do this is by building knowledge through direct instruction. (This is why teachers won’t be replaced.) Hattie’s 2008 meta-analysis ‘Visible Learning’ shows direct instruction is one of the most effective approaches. This creates the building blocks of learning.
Now some may ask, ‘What about constructivism?’, the idea that we construct meaning for ourselves. Yes, this is important, but it doesn’t mean guidance is harmful. Structured approaches help us to construct meaning, whereas unstructured approaches can leave us confused or our working memory overloaded. This is one of the main issues that Daisy highlights with project based learning that isn’t well structured or that expects ‘real world application’ too soon in the learning process.
This in turn can lead to the ‘knowing-doing gap’, which is more likely if students are taught in an unstructured way: for example, when students know they should use capital letters for proper nouns and at the start of sentences, but don’t do so. This is because they have been taught how to write with capital letters by ‘doing writing’, e.g. by practising stories, letters and lengthy articles with a ‘real world purpose’. This overloads working memory, because they are not just thinking about the different technical aspects of writing but also about the details of the topic. They need direct instruction on the smaller skill of writing a sentence that starts with a capital letter until it becomes a habit; only then can they move on to these ‘real world’ application tasks. This is why we need certain knowledge in our long term memory.
So why can’t we just ‘Google it’?
Applying this to internet searching (and ChatGPT): looking something up on the internet or in any reference source takes up valuable space in working memory. Also, in order to look something up successfully, you need to have some idea both of what it is you are looking for and of what you expect to see.
We need facts to think, which means we need facts in our long term memory: everything we see, hear and think about is critically dependent on and influenced by our long-term memory (Kirschner et al., 2006). This is also true as we start considering the use of AI and ChatGPT. As we saw earlier, ChatGPT isn’t always correct, and it can be very convincingly wrong. Therefore our students still need to learn facts and hold them in their long term memory in order to fully understand what they are looking at via AI.
Engagement: The multimedia principles
So now we have established that teachers are here to stay, how can we, as teachers, embrace this new world of AI? Firstly, we can ‘create memorable content’ that is grounded in cognitive science, such as by applying Richard Mayer’s principles of multimedia learning.
The multimedia principle is the idea that presenting words and images together enhances learning. Our limited working memory has two channels, one verbal and one visual, so presenting both uses the full capacity of working memory and allows us to build more sophisticated mental representations of the concept we are trying to learn. There are actually 30 multimedia learning principles. They do not depend on particular technologies, but they should be applied to how we create resources, including those made with technology or AI.
But many of these principles get violated; for example, images chosen for decorative purposes can end up confusing or distracting learners. Presentations and video content therefore need to be carefully constructed. Some of the best ways to do this are by embedding questions in videos that let students check understanding, and by frequently using mini quizzes or end of unit quizzes (EdPuzzle and Google Forms).
One very simple example of AI is a platform called https://www.d-id.com/ (although it is not free), a website that can animate faces and use text-to-speech to bring images to life (quite literally). You can even use ChatGPT to generate the speech, or to simplify text that you already have. To make it really effective, on the slide with your AI creation you could have key terms followed by some quiz questions. But even as a ‘hook’, or as a revision task to set students (once you are confident they are not novices), AI can bring some ‘life’ into our lessons.
Engagement: How can we use technology to personalise learning?
According to PISA, personalised learning is one of the biggest indicators of academic success (second only to wealth). However, this is a challenge when we have ‘one teacher, many students’. Who here has marked a set of assessments and found some students with one knowledge error, some with another, one student missing due to illness and some ready to move on… what do you do? Technology, and AI, could be the solution to this problem.
But, there is also the bigger question of ‘what do we actually mean by personalised learning?’
3 examples that Daisy looks at:
1. Individual learning styles
2. Student choice
3. Adaptive learning
Individual learning styles is a theory that doesn’t just claim we each have a preferred or best style; it claims that if we are taught in our preferred style, we will learn better. And that is not the case. When we are learning, what matters most is not our preferred learning style but the best learning style for the content. For example, if you want students to learn the locations of countries on a map of Africa, then a visual presentation of the material will be best, and it will be best for ALL students, regardless of their preference. Similarly, it can be useful for material to be presented in more than one way, which benefits all students (this is why it’s important to combine words and images). Another issue with ‘learning styles’ is that they can reinforce stereotypes: if a child prefers watching to reading, they will keep choosing that style and never get any better at reading. Therefore, rather than providing students with instruction exclusively in the style that appeals to them, we should look at the specifics of the content and the research-backed principles about how we manage information. AI platforms that simply adapt to the learning preferences of students can limit students’ ability to actually learn; we as teachers, with pedagogical knowledge, need to select the appropriate tasks and technology.
Student choice, or a ‘self paced environment’, can be viewed as a way to stop students getting bored or lost by the pace. However, the research literature shows that learners are not well equipped to make good decisions about their learning, precisely because they are still learning. As with the expert/novice example, to make good decisions about our own competence in a particular area, we need to already possess a degree of competence in that area (the Dunning-Kruger effect). Equally, we may feel that repeating an easy task over and over is boring and not helping, but sometimes ‘overlearning’ can be valuable as it builds fluency (and reduces cognitive load). We can also struggle when learning something for the first time and think this is a sign to slow down or stop, but that’s not necessarily right either, as Bjork’s ‘desirable difficulties’ show. Therefore students, especially novices, should be guided in their learning choices until they can make good ones. Once again, AI platforms that encourage student choice are only appropriate once students are already ‘experts’.
Adaptive learning systems change the questions they present based on how students answer, and they provide hints and tips in the same way a teacher might. As prior knowledge is key, and students are not always able to accurately assess their own progress towards learning goals, personalised data is needed to help teachers and students make better decisions about what each student needs. These adaptive systems change in real time, based on this data, as students use them, and they are capable of providing a different pathway through the content for every student. The evidence in favour of such programmes is fairly positive: although not completely consistent, the impact is always positive, with only the size of the impact varying. This is therefore the ‘best’ form of personalised learning, and one that AI can support. So the question becomes: how should it be used? Should it replace classroom instruction completely, be integrated with classroom instruction, or only be used as a separate homework programme? I think COVID has taught us that we cannot simply remove the interactive and social element of physically being in the classroom; however, this is where AI could really enhance, and level up, education.
Workload
The million pound question: how can we reduce teacher workload? There are many ways in which AI and ChatGPT have the potential to do just that.
One such way is by automating administrative tasks, for example letters and emails. As report writing is still heavily debated, I’m not going to suggest that (this is for individual schools to decide under the Government’s GDPR guidance). But by automating other time-consuming tasks, teachers can focus more on the right instructional activities and on building student engagement.
ChatGPT can also help teachers create personalised learning experiences for their students, drawing on their expert knowledge. For example, you can ask ChatGPT to create revision summaries, or scaffolds that would support the needs of specific students. It can even create quizzes with multiple choice answers. Again, teachers have the knowledge of their students’ individual needs and of the most appropriate tasks, so this simply speeds up the way teachers tailor resources to them.
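As an illustration only, the same kind of request you would type into the ChatGPT website can also be scripted, which might be useful if you want to generate several quizzes at once. This is a hedged sketch using the 2023 openai Python library; the topic, year group and question count are placeholder values you would swap for your own, and every generated question still needs a teacher’s check before it reaches students.

```python
# Sketch: asking the model for a multiple-choice revision quiz via the
# openai library (pre-1.0 interface). All values below are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def make_quiz(topic: str, year_group: str, num_questions: int = 5) -> str:
    """Request a short multiple-choice quiz with an answer key at the end."""
    prompt = (
        f"Write a {num_questions}-question multiple-choice quiz on {topic} "
        f"for {year_group} students. Give four options per question and "
        f"put the answer key at the end."
    )
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return response["choices"][0]["message"]["content"]

# The teacher remains the expert: check every question, because the model
# can still be 'confidently wrong'.
print(make_quiz("the water cycle", "Year 7"))
```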
Finally, lesson planning and resource recommendations. There are some new (and some free) AI platforms that can assist teachers in creating lesson plans by providing resources, suggesting relevant activities and offering a wide range of teaching strategies. With our ‘expert’ knowledge of cognitive science, we can select the most appropriate ones, especially when suffering a bout of ‘teacher’s block’ when sitting down to plan a lesson.
Opportunities
Out of this AI ‘enlightenment’ revolution, there are many exciting new opportunities and tools. What is important is not to forget the cognitive science when looking at them. Equally, remember that a lot of new platforms want to make money (which, given how little money there is in education, is a concern), so check out the free versions or demos beforehand to make sure a platform is worth it.
- ChatGPT prompts: one way we can reduce workload and get more familiar with applications like ChatGPT is by asking the right types of questions. On Twitter, there are lots of recommendations around ‘prompts’; one particularly helpful account is Zain Kahn (@heykahn). He also has a newsletter you can sign up to called ‘Superhuman’, which gives frequent tips on prompts for ChatGPT and other AI platforms. It’s free to sign up too.
- Another useful and free tool for ‘prompts’ comes from Daniel Fitzpatrick: 40 Proven AI Prompts for Educators [THIRD EDITION] by Dan Fitzpatrick – Issuu
- Daniel Fitzpatrick is also the creator of the website ‘The AI Educator’, where you can see lists of different platforms. You can also filter them, and one filter includes ‘free’ or ‘freemium’. It’s important to note here that ChatGPT is listed as ‘freemium’; however, there is not really any need for teachers to sign up to the paid version, as we can do so much in the free one. Equally, D-ID, which I referenced earlier, is also ‘freemium’, but this is only a 14 day free trial, so not a very good ‘freemium’ unless you’ve got 14 days to just sit down and create. AI Educator Tools
- Teach Mate AI: this has been created by Mr P ICT, who you might recognise from social media. He often posts memes about education as well as giving lots of practical teaching and learning advice. This new platform has a suite of AI-powered tools capable of generating bespoke teaching resources and simplifying various elements of a teacher’s job. It does have tools that are free, so you can get an idea of what TeachMateAI is capable of, but for full access you need to sign up for a Pro account. TeachMateAI
- Study Hall AI: this is a platform I saw at the Epsom College conference. It’s not free, so I do not have extensive knowledge of it, but it really appealed to me because it uses AI to support reading and maths, and its ‘Deep reading’ feature enables students to build knowledge and, most importantly, engage with the text they are reading. The tasks it promotes are clearly rooted in cognitive science. Studyhall AI
As well as these newer platforms, Daisy Christodoulou also highlights several edtech platforms which were already embedding cognitive science to enhance student learning:
- Cerego: this is an adaptive platform which doesn’t come loaded with content, but instead allows teachers or groups of teachers to add or create their own content.
- Eedi (Diagnostic Questions): this looks at thousands of student responses to questions to help teachers learn more about their students’ thinking
- Anki or Quizlet: these are apps for creating your own flashcards
- Up Learn: this is an independent study programme designed for A-level students in certain subjects which has videos and quizzes linked to specific exam boards.
- Memrise and Duolingo: are both online language learning platforms
- Hegarty Maths: has videos and thousands of practice questions which can be set and automatically marked as maths homework
- Smartick: is an adaptive maths programme designed to be used for 15 minutes a day (ages 4-14)
Final provocations:
Before rounding off the TeachMeet, I wanted to poll teachers on the first provocation as well as a new one.
Positively, no one thought it would replace teaching. With regard to the benefits, views were more split, but at least only two felt that teaching would become more like ‘crowd control’. I think, as professionals, it is fair to say that we are not fully there yet. There is a lot out there, and the time to find the right tools is still limited. However, it is good to see that some are excited about how much more creative we can now be in the classroom.
Conclusions
Schools aren’t going anywhere, and neither are teachers. This is because:
1. Education is concerned with more fundamental skills and knowledge that are less likely to be overturned by technology. Self-driving cars may remove the need for driving instructors, but it’s hard to imagine a world that doesn’t need people who can read, write and do basic mathematics.
2. Skills which are economically valuable depend on knowledge. You won’t be getting a job based on your ability to recite the times tables, but this basic mathematical knowledge allows you to understand more complex ideas. The same is true of creativity and innovation. Think of the recombination innovation we looked at in the last TeachMeet on Rebel Ideas: you need knowledge of two different concepts or ideas to create a new one.
3. And finally, schools should not be solely concerned with the economic value of students. We teach literature not because it has a direct economic benefit, but because it helps students understand the world.
So schools just might look a little different in the next few years, and this could be amazing. We still need teachers to guide students in how they use technology and AI to ensure that they are still building memory, being inspired and encouraged to think creatively, and we need AI to help liberate teachers’ time, not to replace it.
References
Christodoulou, D. (2020). Teachers vs Tech?: The case for an ed tech revolution. Oxford University Press
Fenichell, S. (2022). StudyHall AI: https://studyhall.ai/
Fitzpatrick, D. (2023). 40 Proven AI Prompts for Educators [THIRD EDITION]: https://issuu.com/theaieducator/docs/13_ways_chatgpt_can_reduce_teacher_workload_1_
Fitzpatrick, D. (2023). AI Educator Tools, Thirdbox Ltd: https://aieducator.tools/
Parkinson, L. (2023), a.k.a. @ICT_MrP. TeachMateAI: https://teachmateai.com/
Seldon, A. (2023). AI in Education Conference, May 2023, Epsom College: https://www.epsomcollege.org.uk/whats-on/ai-in-education/