Daniel Emmerson 00:02
Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a non-profit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a non-profit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI-infused world.
Today we have two amazing guests with us. Firstly, we have Becci Peters, who was a computing teacher and head of department in schools for 10 years. Since then she's been a lecturer teaching trainee teachers, an educational consultant for the NCCE, an author of textbooks and resources, and is now studying for a master's in data science and AI as well as being the secondary education lead at CAS. Ben Davies is the computing subject lead for primary at BCS, the Chartered Institute for IT, where he focuses on empowering teachers with the tools and knowledge they need to deliver outstanding computing education. Ben is passionate about inspiring both teachers and pupils to explore the exciting world of computing, and he brings with him over 23 years of experience as a primary classroom teacher. Ben has held key positions such as lead practitioner for ICT for the local authority and specialist leader of education for computing and ICT. He has supported teachers through initial teacher training sessions, developed educational resources and co-authored books on primary computing. Thank you both very much indeed for being here today.
You have very different roles and responsibilities and I'd like to speak with you about the work that you're doing on a day to day basis, particularly your work with teachers and schools. But as an opening question, I think it would be great for our listeners to have a better idea of what the British Computer Society is and why they should be aware of it. Becci, can I ask you that first of all?
Becci Peters 02:08
Yeah. So BCS is the Chartered Institute for IT, the membership body for professionals in IT. That's the other side of the organisation that Ben and I have absolutely nothing to do with; everything Ben and I do is the teacher-facing side. We run two major programs: Barefoot Computing and CAS. Barefoot Computing is our primary offering, and that's all Ben, I have nothing to do with that. And CAS, Computing at School, is essentially the computing subject association for teachers. That covers everything from early years all the way through to HE, so anybody who is involved in teaching computing. Over the past 12 to 18 months we've been expanding our offering beyond just computing teachers, focusing also on digital skills and AI.
Daniel Emmerson 02:56
Excellent, thank you, Becci. Ben, could you tell us about Barefoot Computing, what that is and how it fits in here? That would be great.
Ben Davies 03:04
Yeah. So that's another program run by BCS, and it started when the national curriculum changed: computing became a subject, and primary teachers were required to teach more than the old joke about making a PowerPoint and using floor turtles. There was a more focused and structured curriculum, making it more fit for the 21st century, with an element of computer science in it, and lots of teachers who had no experience of that from their own education were being asked to teach these things. So it started off offering support for that, demystifying some of the language used in that part of the curriculum, terms like variables and selection, and trying to identify where people already used those ideas in their normal practice; it's just that they didn't know what they were called. Over the past 10 years it's evolved to look more at digital skills, and more recently we've been providing resources for primary school teachers so they can introduce the concepts of AI to children. So it's not seen as this magical thing, and people aren't talking about it with a reverence that makes it sound like something that can't be challenged or mastered. And again, it's about helping teachers understand what it is so they can make informed decisions about how they use it and how they talk to pupils about its use.
Daniel Emmerson 04:43
So I'm really interested in the organisation and the work that you're doing and we'll come more specifically to artificial intelligence in a little bit. But could you tell us a bit about the size of the organisation and the demographic? Because we're not just talking about the UK here, it goes way beyond that. I think listeners will be surprised to learn how substantial the organisation is and the membership that it covers. Becci, could you tell us about that?
Becci Peters 05:10
So CAS has 27,000 members, and CAS is completely free to join, which is part of the reason we have so many members. We're industry funded, and that's how we're able to provide all of our services to teachers for free. Most of that membership is UK based, but we do have members from all around the world; there really are people everywhere. We do live webinars, and there are regularly people joining from America. They have their own computing subject association over there, but members pay a fee to join it, so the two work in quite different ways. So you've got people getting hold of everything we're offering without necessarily being in the country that's providing it.
Daniel Emmerson 05:50
Okay, and how do they find you and what are they typically looking for if they're a new teacher who's come across this opportunity?
Becci Peters 05:58
Yeah. So you can visit the website, computingatschool.org.uk, and you can find us on socials as well. Most teachers that come to us are looking for either CPD or resources. Resources are one of our main things: there are over four and a half thousand on our website, and any member of the community can upload resources to share. So anyone who's made something and thinks somebody else will find it useful can add it to the website, and people can search through it all. We also do commissioned resources. Ben and I have regular meetings where we talk about what we think people need, or what people have specifically asked us for that doesn't exist, and then we'll commission somebody to create that resource and put it out to the community. So resources are probably the main thing, but the other thing is CPD, and there are different forms of CPD on offer. Ben and I lately have been turning our attention to very short videos that just show something, like a two-minute video: how do you do this thing, or what is this concept? But we also do live webinars. We have 12 different online communities, and most of those run a monthly webinar, again all free to attend. And then there are also local communities, where teachers in different areas around the UK meet up in person and discuss different things to do with computing education. But the other big thing we have is an annual conference, and we've just announced our date for 2025: it's in October this year in Birmingham, and again completely free for teachers to attend. Last year we had 150 teachers attending, and we would have had more, but the building's capacity said no. We've already got nearly 200 teachers registering interest for this year, and we only announced the date two or three weeks ago.
And last year there were about 65 workshops for teachers to choose from on the day. It's just a really great place to share ideas and speak to other people. A lot of heads of computing in secondaries are the only computing teacher in the school and can often feel quite alone, so having that community of other teachers that they can speak to is really important.
Daniel Emmerson 08:07
And thinking about the role of these individuals in their schools: Ben, would you say that there's an increase in demand on what a computing teacher might need to facilitate within their school? Is it just about teaching a specific subject, or has it become increasingly more than that?
Ben Davies 08:22
Oh, I think you have the demands of the subject, but in many respects you're also the expert in that building, in that institution, for anything that comes under the umbrella term of ICT. The example being e-safety, or online safety, which is a safeguarding issue, but quite often that join-up isn't made: if it's happened in the digital realm, then the first person it gets reported to is the head of computing, or in a primary school the computing subject lead, and they become a bit of a jack of all trades for anything to do with the digital space, regardless of where their expertise lies. The added complexity in a primary school is that the person who's leading might not be somebody with an interest or a background in computing. When I taught, I led computing because I wanted to: when the role became available I put myself forward, and I was the only person who did, so it was mine by default. But it was something I was passionate about, something I wanted to get more and more involved in. Equally, Becci and I talk to people coming up to us at conferences, or emailing us, who have been given the position and don't know where to start. It's having that support as well, and I think it's possibly a unique subject in that there's quite a wide spectrum of support that's needed. Part of the challenge for us is to make sure that we are inspiring the people who need inspiring, but also supporting people and being aware of a digital divide. And although we want to show everything the technology can do and how great it can be, we need to be mindful of the fact that for some people it's not their cup of tea; it's something that increases tension and can make them quite anxious. But we still want to make sure that their pupils experience a quality computing curriculum.
So we need to work with those individuals and think how we best interact with them to give them the support that they need.
Becci Peters 10:56
I know I always used to get it when I was in school: you'd be in an assembly and something went wrong with the tech, and it was always, oh, Becci will help, because she knows how to use a computer. And it's like, well, yeah, I probably do, but it's not actually part of my job. And then obviously, with the rise of AI, in schools that have chosen or appointed an AI lead, in most cases that is the computing subject lead or the head of department or something like that. Occasionally it's not, but more often than not it's, oh, well, you know about tech and you're probably interested in this, so you're the person that's going to get that job. And as Ben said, online safety is always something that ends up sitting somewhere within the computing department, even though it's a safeguarding thing. It's taught in PSHE; it's not specific to the computing department. It should be covered by all teachers everywhere.
Ben Davies 11:47
One thing that I'd like to think we offer, and that we try to pride ourselves on, is that Computing at School is a community. Becci mentioned people uploading resources, but we also keep abreast of what people are doing, spot the people who are creating interesting stuff, see what members of the community are up to, and harness them: that's a great resource you've shared, would you come and talk about it? So teachers know that other teachers out there are doing things and creating things, and they can be inspired by them. It's that idea, going back to the early days of Twitter, when sharing was caring: if you've made a resource that you think somebody in another school could use, regardless of where that school is, just give people access. The reason you created it was the learners in front of you, and if it's having an impact on them, it could have an impact on learners in another school, regardless of how the teacher there decides to use it: as a resource, an idea, or some support that can be utilised.
Daniel Emmerson 13:01
And we keep hearing about support, I think, through this conversation so far, and we've also mentioned the digital divide. The amount of support that's required, particularly as AI has become more mainstream in education, is only set to increase. So I'm wondering, when thinking about that digital divide, what are you seeing in terms of patterns here? What types of school are actively looking for the resources that you provide? Are you having to make more of an effort to get out to schools? Because I imagine that headspace and thinking capacity around a lot of these subjects are the limitation for a lot of schools; they just don't have the time to engage with it. How are you addressing this? Becci, if I could maybe ask you first of all.
Becci Peters 13:50
Yeah. So we did some research last year where we surveyed over 5,000 teachers across the UK in secondary schools and asked: what's your attitude towards AI? How is your school approaching it? What are your thoughts? And so on. A lot of schools were still saying they don't have an AI policy and they don't really know what to do about it. Some schools have buried their heads in the sand, thinking it'll go away soon, when I don't really think it will. So we're doing as much as we can to support those schools. We've got an AI policy template on our website that schools can use, which gets you to think about the different areas that are applicable to your school. There isn't one you can just go and edit, because it's going to be so different for every school. The other thing we're seeing is that some schools are going all in, and it's the schools that have got the budgets, predominantly private schools, that are able to buy the pro version of Copilot or whatever it is, which they can then use with staff and students. You've got some schools buying wrapper apps like TeachMate AI, and then you've got some schools who just can't afford any of that and are stuck with the free versions. So when we've been making our short how-to videos, we've made a point of saying we're using the free versions here; we're going to show you what you can do without spending a penny. Because it's nice to have those extra things that cost money.
But we appreciate that a lot of schools can't do that, and we need to be providing support that everybody can access. And as you mentioned with the resources, in some schools the AI lead has sufficient knowledge to plan the resources and teach the students the things they need to know. But there are so many schools where the AI lead has been appointed or chosen because they're the head of computing, for example, and they're thinking, actually, I don't really know what I should be thinking, or things are changing so fast that they don't know what to do. So the moment we say to people, whether we're talking at a conference or just talking to our community members, that we've got these resources, with different versions for different age groups, and especially if there's a QR code on screen, you can just see everybody with their phones; they want it straight away. Especially when it's stuff they don't have to pay for and, hopefully, they think is good quality. We do. So it's definitely needed.
Daniel Emmerson 16:12
Can you just unpack that a little bit more, Becci, for an audience that perhaps might not be aware? You talked about the pro versions of certain products, then the wraparounds that you mentioned, and then the free versions. What are we talking about here in terms of differentiation of experience?
Becci Peters 16:29
Yeah, so whether it's Copilot; Gemini, which is the Google one; ChatGPT; or Claude, which is one that's not very well known but that the pro AI people tend to use, they all have a free version. With most of the free versions, it's just that you can only ask so many things in a day. Sometimes you'll get a slower response; sometimes you won't get the latest model, so it's not the most up-to-date or polished response, but they all do exactly what they need to do. For most of them you can pay for a pro account, so that you don't have a limit on how many things you can ask in a day, and you get the latest models and the latest add-ons, which are obviously being updated all the time. It partly depends on how often you're using AI as to whether or not it's worth paying for those. And then, as I say, there are the wrapper apps. These are basically websites, sometimes apps, that are built on, for example, ChatGPT, but done in a way that makes them easier to use: the prompting is done for you, you don't see it, and you just see this lovely interface that gives you suggestions of things you can do. Whereas if you go into the free version of, say, ChatGPT, you're just given a box to type in and nothing else, and you're like, where do I start? So the wrapper apps are great for people who are starting out and don't know what to do with it; they can give them ideas of the things it can do for them. But once you've mastered that, I personally think it's better to go into the free versions, or the pro versions if you can afford it and you're using it enough, and then you'll know how to write the prompts. That's one of the things we go through in our little videos: these are the sorts of prompts you want to use, and how you can tweak them for your situation.
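The "wrapper app" idea Becci describes can be sketched in a few lines: the user sees a simple menu and fills in a couple of blanks, while the app supplies the detailed prompt behind the scenes before sending it to an underlying model. This is a minimal illustration only; the template wording, task names and `build_prompt` function below are all invented, not taken from any real product.

```python
# Minimal sketch of a "wrapper app": the teacher picks a task and fills
# in a couple of blanks; the app builds the detailed prompt the model
# actually sees. All template wording here is invented for illustration.

TEMPLATES = {
    "lesson-plan": (
        "You are an experienced teacher. Write a lesson plan on {topic} "
        "for {age}-year-olds, with a starter, main activity and plenary."
    ),
    "quiz": (
        "Write a five-question multiple-choice quiz on {topic}, pitched "
        "at {age}-year-olds, with the answers listed at the end."
    ),
}

def build_prompt(task: str, topic: str, age: int) -> str:
    """Turn a simple menu choice into the full hidden prompt."""
    return TEMPLATES[task].format(topic=topic, age=age)

# A real wrapper would now send this string to a model's API; here we
# just show what the hidden prompt looks like.
print(build_prompt("quiz", "variables in Scratch", 10))
```

Going straight to the free version of a chatbot means writing that whole prompt yourself in the empty box, which is exactly the skill the short how-to videos aim to teach.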
Daniel Emmerson 18:21
And is that something that is, or perhaps should be, covered more in the curriculum, thinking about how AI is explored in schools? Ben, I'd love to know your thoughts on this, particularly around prompt engineering.
Ben Davies 18:33
Well, I'm in a slightly difficult position because I'm primary based. So the caveat here is that for primary-age children, and up until around Year 8 (Becci, in secondary, would that be right?), with the generative AI tools pupils need to be over the age of 13 to use them. So whilst some of the resources we have introduce the knowledge of what generative AI is, and try to explain it by having children act out the model themselves, a technique or style of teaching called unplugged, where the pupils assume the role of the computer and act things out, thinking about the processes, we tend to stick to machine learning, where we can control the data and the data stays within a controlled environment, certainly in the examples that we provide for primary schools. But I think it's crucial, so important, that people understand what it is: teachers, adults, children. For me, it's about making an informed decision about using AI; it's understanding how you interact with it. With the prompts themselves and the wrapper apps built on OpenAI that Becci was talking about, you can almost see them as the training wheels, the stabilisers on a bike. But when a parent puts their child on stabilisers, ideally they want that child eventually to ride a bike, and there's that progression. I think those tools have allowed lots of people to interact with AI without needing much understanding of how it works or how to prompt engineer. And I think it's important that, in the same way we don't teach children in school how to use a specific coding language or a certain presentation tool, we teach them the underlying things: how to make a good presentation, how to think about your audience, or in terms of coding, what construct is being used and how you're going to apply computational thinking to it.
In the same way, we'd want people to think about AI in an informed way, even down to: is AI the right tool to be using here? We're seeing now that children are using AI as a search engine, just putting questions in because it's easier; they get one response back. But from our point of view, you have no idea what data that response has been drawn from. Whereas with a search engine, you've got 25 responses, and each of them tells you where the information is coming from, so you can cross-reference. It's those skills, we think, that underpin a lot of what we need to be teaching in school, so students can make their own choices. And I suppose a lot of this links in with e-safety, in that a lot of the work we do in schools on e-safety is to make pupils aware of the potential, aware of the fantastic things that can happen, but also aware of the dangers, and to give them the skill set, the tools, the knowledge to minimise the impact those dangers can have on them and to know what to do if things are getting dangerous. In the same way with AI, we want them to be informed so they can make the right decisions: so they understand bias, so they know about training data, so they know about facial recognition. So they're not surprised to find out that, depending on the colour of their skin, their face is less likely to be recognised by some facial recognition software, and that that's because of the models it's been trained on.
And I think we do a lot of good work as teachers around making sure that children see representations of themselves, so that whether it's a scientist or an author, any child in the classroom can look up and see themselves as one of those people. Yet they could do an image search, type something in, and have every stereotype we've battled against for the last 20 or 30 years reinforced. Put in "successful scientist" and, well, I'm not going to say, but you can fill in the blanks of what you'd imagine the person to look like. I think teachers need to know that, and I think pupils need to know it too: actually, that's because of the data it's been trained on. Then they can challenge it and think about it, even if it's just making an informed choice about which platform they use. And I know this is blue-sky ideology, me sitting on my high horse, but that's really what we want. We want them to be informed, we want them to be able to make decisions, so that AI isn't seen as this magical thing sitting in a black box whose secrets you don't know, but something where you can think: no, I understand how it works, I'm going to use it the way I want to use it, and I'm going to make decisions about which platform I use because I understand how it works.
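Ben's point that stereotyped results come from skewed training data can be shown with a toy example. This is not how a real image search works; the dataset below is invented and deliberately lopsided, and the "search" is nothing more than counting, but that is enough to show that a frequency-driven system can only reflect what it was fed.

```python
from collections import Counter

# A toy "image search" ranked purely by frequency in its training data.
# The dataset is invented and deliberately skewed, the way web-scraped
# data often is: the system can only reflect what it was fed.
training_data = [
    ("scientist", "man in lab coat"),
    ("scientist", "man in lab coat"),
    ("scientist", "man in lab coat"),
    ("scientist", "man in lab coat"),
    ("scientist", "woman at microscope"),
]

def top_results(query: str, data, k: int = 2):
    """Return the k most common image descriptions for a query."""
    counts = Counter(desc for label, desc in data if label == query)
    return [desc for desc, _ in counts.most_common(k)]

print(top_results("scientist", training_data))
# → ['man in lab coat', 'woman at microscope']
```

The over-represented image dominates the results not because it is "correct" but because of the training data, which is exactly the kind of thing Ben suggests pupils should be equipped to recognise and challenge.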
Daniel Emmerson 24:27
Do you think, Becci, that we've got to a point now where the majority of the teachers or schools you're working with are thinking about AI from the perspectives Ben was referring to here, about ethical use, about data bias? Or are we still at a point where people are overwhelmed by what the technology can do, and perhaps a little frightened even to use it?
Becci Peters 24:51
Based on the research we did last year that I mentioned, the vast majority of teachers are not using it. They don't know what to do, they've not had the training, and they know that the kids probably know more than them. I think for most people, and this is definitely not everybody, the mindset is still, oh, my kids are cheating on their homework with AI, even though we're, what, two years down the line now since the big release? So there's definitely a lot of work that needs to be done. And I'd like to think that the resources we've made start to help people have those conversations. Even if the teachers aren't necessarily thinking about those things themselves, by using the resources and having those discussions with their classes in an age-appropriate way, they can start to have those conversations with their kids and they can all learn together.
Daniel Emmerson 25:41
And how much of this do you think sits at a senior leadership level where decisions are being made about what teachers can and can't do with AI versus a personal approach that a teacher might take to using the technology?
Becci Peters 25:53
I think that every school and every MAT needs to have a person who is the AI lead; that role needs to exist, and ideally it should be somebody on senior leadership. I think there are a lot of things that could be decided as a whole-school approach. For example, most schools are either a Microsoft or a Google school these days, so a lot of schools have gone down the route of: right, we're a Microsoft school, so we're going to use Copilot, or vice versa for Google. And I think having an approach that says we recommend you use this platform is one thing, and saying we're only allowing students in school to use this platform is fine, I think. But dictating to staff that you can only use these platforms is different. If a teacher has found a platform they want to use for their lesson planning, or whatever other work they're doing, then as long as they've been trained to look for the inaccuracies in the data and the biases Ben mentioned, they should be free to choose their own, as long as they're aware of the risks of doing so. And I know that in some multi-academy trusts the AI lead will look at all the different options and say: we would highly recommend you use this one, but we appreciate you might also want to use this, this and this; please don't use this one, though, because we've done the research, we've looked into it, and we don't trust the information it's giving out.
Daniel Emmerson 27:17
Perhaps for the people in schools who are making those decisions, Ben, even at primary: how much do you think the school needs to be thinking about, as you referred to earlier, the training of language models and where the data is going, particularly if a school is using free versions?
Ben Davies 27:35
I think it's really important, and I think schools need to be informed to make these decisions. I worry that some schools are vulnerable to being sold things just because they have AI. Becci and I have been at conferences this year where everything, if it's not AI, is powered by AI. Every stall had some reference to AI, because it felt like, from their point of view, if they weren't mentioning it they weren't on the same playing field as everyone else. So there's a lot of, I'm not saying mis-selling, but possibly bending, or artistic licence, about how much AI is actually being used, whether it's really just OpenAI underneath, and whether some of it could be done without using the platform in the first place. So I think, yes, have a team who investigate what you use, but certainly invest in some training, whether that's coming on our training or just finding out, so teachers and SLT members are in a position where they're making a decision because they know what they're looking for: because they know about the ethics, because they know about the training data, because they know about bias, because those are the things they're thinking about. And looping back to what Becci said about we'd like you to use this platform, but you can use this, this and this: if the staff understand, not to a granular level, but if they've had some training and know the basics of how it works, then they can transfer that to the different platforms. We're not training them how to use Gemini, we're not training them how to use Claude; we're giving them an understanding of how generative AI works. And then when something new comes on the market, the next shiny thing that everybody jumps towards, they're in a position where, one, they understand how it works, and two, they know what questions to ask and how to think about whether it's the right thing to be using.
As well as that, they understand prompt engineering, so they've got that literacy side; they know how to use it in that way. I know from conversations with members of SLT, from schools I know but also from people we met at conferences and over email, that some schools are not doing anything because they don't want to do it wrong, or they see it as a potential minefield. It's not so much a case of burying their head in the sand, although one person at a conference did say to us: AI, it's just the new VR, it'll be over and done with in 18 months. We'd have to differ on that one. But people know that something could go wrong, and from their point of view it's better to drag your heels and not commit, and therefore not make the mistake and avoid the negative publicity that would bring for the school, rather than taking the plunge. And I think at CAS what we're trying to provide, and trying to do, is educate people in the possibilities, in terms of the impact it can have on teachers' workload and what to do. Also making sure people don't see AI as cheating, because in some schools people are allowed to write reports with AI and in other schools they aren't. Even within a school, some teachers are allowed to use AI to write reports because they know what a good report looks like, having written them for the last 10 years, whereas people who've just started aren't allowed to use it, because they don't yet have the critical evaluation to look at what comes back and say, that's not right, or to make those tweaks to it. So there are interesting conversations there, but I think it's all about knowledge, and that knowledge being power, so people can make informed decisions. And just looping back, I seem to say informed decisions a lot, but that's the point: we don't want to say use this, use this, use this. We want to empower people to say, we're going to use this because.
And we've made this decision because we think it does this, this and this, and we know that we thought about this, this and this. Or we didn't use that one because of this, this and this.
Daniel Emmerson 32:21
Becci, I'm just mindful of time, if that's okay. I did want to come to you, though, off the back of something Ben said there with regard to the questions that teachers and senior leaders should be asking of the tools they might be investing in. Have you got any pointers or go-tos for teachers when it comes to those questions and what they should be asking?
Becci Peters 32:45
Yeah, so obviously there's a saying, isn't there: if you're not paying for a product, then you are the product. So if you are using one of these free versions, be very careful what information you're putting into it. You should be careful what information you put into any generative AI platform, but especially if you're not paying for it, because what you put in can go into the training data. Whereas if you're paying for a pro version, some of them will not use your data to train the model further. So it's about looking at the terms and conditions for each AI platform: at what point is the information I'm giving this platform going to be used to train the model? That's one of the really important things to look for, and if you're not paying, your data almost certainly is being used for training. So think carefully about which platform you're using, whether you're on the pro version or the free version, and what information you're putting in. And obviously it goes without saying that no confidential information should go into any generative AI platform at all. But if you've got, say, some student data with no names on it whatsoever and you want to do some data analysis, and you definitely don't want that data going into the training model, then the pro version is more likely, though not necessarily definitely, going to be the better option. So having a look at the terms and conditions is definitely the way forward. And I've seen people saying, oh, well, we're a Google school, so we're going to use Google Gemini in my school. Well, if you look at the terms and conditions, unless they've changed since we came back to work after Christmas, the situation was that the only way a 13-year-old can use Google Gemini is on a personal account.
If they're using a work or school account, they can't use it until they're 18, and that rules out practically all students, because students aren't going to be using a personal Google account in school. So it's really about looking at the terms and conditions and double-checking: what are the age ratings, who's going to use this, and what's going to happen with any information I put in? They're the two things I think people need to look for more than anything.
Daniel Emmerson 34:48
With that, I think we'll wrap up our conversation for today. Becci, Ben, it's been wonderful to hear about the incredible work you're doing for so many educators and so many schools around the world. Phenomenal work, and it's been phenomenal having you on Foundational Impact. Thank you ever so much for being a part of this episode.
Becci Peters 35:06
Thanks for having us. It's been great.
Ben Davies 35:07
Thank you. It's been an absolute pleasure.
Voiceover 35:10
That's it for this episode. Don't forget, the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here and we'll see you next time.