Daniel Emmerson 00:02
Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a non-profit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a non-profit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI-infused world. This podcast series sets out to explore the trials and tribulations of building a non-profit from the ground up, while also investigating the changing world of technology in life, learning and work. Laura Knight is founder of Sapio Ltd, a consultancy specializing in digital strategy and innovation. She was previously Director of Digital Learning at Berkhamsted School and worked as a teacher for 20 years across both the maintained and independent sectors. Laura is an expert in leading the strategic development of technology for teaching and learning, one-to-one devices, artificial intelligence, staff training in digital skills and online safeguarding. Laura is a disruptive thinker, international keynote speaker, coach and consultant, and is passionate about creative problem solving. Laura supports leaders in schools, universities and commercial organizations with exploring the art of the possible, leveraging technology to transform outcomes, systems and teams. Her new book, “The Little Guide for Teachers on Generative AI”, will be published by Sage in the autumn of 2024. Laura works with the Independent Schools Council Digital Advisory Group and the Edutech Europe Advisory Board, and is a leading member of the BORN Epsom Protocol, a national cross-sector working group on AI in education. She has also worked as an expert advisor to international governments on educational policy and provides training and input for associations such as the Independent Schools Association, the Independent Schools Teaching Induction Panel, the Society of Heads and the Independent Schools Bursars Association. Laura, it's amazing to have you with us on Foundational Impact.
Laura, as you know, we are exploring the world of artificial intelligence in education through this podcast series and I can't think of a better person to have on the show than you, actually, with the amazing work that you're doing. Laura, for the benefit of our audience, could you perhaps start by telling us a little bit about your journey to getting where you are in the AI and education space? That'll give us a good platform to leap from.
Laura Knight 02:50
Certainly, I'd be delighted, and thank you so much for having me here, Daniel. It's a pleasure to be with you. So I have a background in education and I was a teacher for 20 years until last year, in fact. I'm originally a linguist but was benevolently adopted by the computer science department at my last school. So I've taught children essentially for a very long time and have really enjoyed working in education across both the maintained and the independent sectors in the UK. I have worked in middle leadership extensively at a sort of head of department level, but then really enjoyed leading on digital at Berkhamsted School, which was my sort of final perch, where I did lots of work on digital strategy, on online safeguarding and digital wellbeing, and on one-to-one device strategy, the pedagogy of digital teaching and learning, that kind of thing. And latterly, of course, AI has been a part of that story. And interestingly, over the past few years, as ChatGPT and its friends arrived in our awareness, there has been a hunger for insight and wisdom and understanding of the nuance around this technology, this evolution of artificial intelligence, in a way that is embedded in an understanding of what happens with teaching. Teachers are a wonderful and tricksy audience because we have a very high bar, actually, about being convinced on things. And I think that's exactly as it should be, because we think very hard about what our students need from us. And so it's with great enthusiasm now that I have embarked on a new stage in my career where I work in an advisory and consultative capacity supporting schools, leadership associations, training organizations and indeed the commercial sector. So I work with publishers and ed tech companies and all sorts of others.
Daniel Emmerson 05:00
So you're right in the thick of it?
Laura Knight 05:02
Right in the thick of it, exactly. Doing lots of work, advisory work and inputs and practical work and training, if you like, a bit of everything. Thinking about digital strategy and innovation more holistically, and thinking often very specifically about generative AI. So just before this call, I was working with a client in their school, where they're struggling with some sixth form challenges around boundaries on the use of AI in terms of academic integrity. So we were just unpacking that and how we need to consider it in terms of policy, for example. So yeah, I bring a depth of educational knowledge and experience, a sort of pedagogical pedigree if you like, but also a real technical understanding of what the business of teaching and learning needs from technology, and a sense, if you like, of strategic direction about how to use technology well in the education space.
Daniel Emmerson 05:54
There are a couple of really interesting transitions there, Laura, first of all from your work as a linguist to a computer scientist and then from working in schools to the advisory and the consultancy space. Can we go into a little bit more depth with those? I'm really interested in the first one to begin with, just going from modern foreign languages to the computer science department. What were you able to take away from the world of linguistics and bring to computer science?
Laura Knight 06:24
That's such an interesting thing around learning, isn't it? So I'm really proud of being a lifelong learner. I'm geekily enthusiastic about learning new things, and you only have to look at my Audible playlists and stuff to see that. But I find, and this is also something to do with my fairly neurodivergent profile, that novelty of challenge and new learning is core to my wellbeing. So my career has been a succession of missions and projects and development. The idea of staying and doing the same thing year in, year out for 20 years, back to back, would be bananas and would not be good for me, I don't think. So there is, I think, something really interesting around the intersections between languages and computer science, which is often to do with things like pattern recognition and creative problem solving, attention to detail, and being able to reverse engineer something or work back through and spot errors. There's a lot of commonality there. But also, actually, when I'm standing there delivering some work on Python to Year 8, it's enormously similar to delivering some work on how I might teach a grammar point in French or in Spanish. So actually I think there's much more commonality to that than you might expect. And I would say that there's a really interesting pathway for linguists who are looking to diversify to explore computer science, because there is a lot there for them in terms of neural pathways that are already very definitely well carved.
Daniel Emmerson 07:54
Where did that inspiration come from to move across? I mean, you said you were benevolently adopted by the computer science department.
Laura Knight 08:02
Well, I would say necessity is the mother of invention on that one. And as is, I think, a very common story in this country and beyond, there is a dearth of high quality computer scientists; we just don't have vast numbers of them. And in the first part of the 21st century, obviously, with the governmental changes and Gove's enthusiasm around computer science and the shift away from ICT, which was often taught by a ragtag band of folks who could kind of manage their way through Microsoft PowerPoint, you know, and organise a nice poster, people were actually challenged in new ways. And of course, so many of our primary and prep school colleagues worked their socks off on upskilling for that new computing curriculum in a way that I think has rarely been adequately recognised, because my goodness me, what a step change for them. But it left a gap in so many senior and secondary schools, where there just aren't very many people with that skill set and that background. And we know, don't we, that there is an economic problem here. If you're a computer science graduate of any calibre and you want to go get a job, if you're in any way motivated by money, goodness me, you're not going to go anywhere near education, are you, when you could earn five times as much in a more corporate environment. So yes, necessity is the mother of invention. And I was really proud, first of all, to be able to take ICT as a department into its own departmental identity. So we built that first, and from there we were able to establish a Key Stage 4 and a Key Stage 5 program with computer science, which is, I think, a story that is so important for the economic wellbeing of the country. We cannot just permanently outsource these things, especially given our political temptation, I suppose, to become more isolationist when it comes to the brains of the country and inputs from abroad. But I also feel like there are so many opportunities available to youngsters in terms of breadth. Actually, when you're looking at the academic profile of young people and the options that we give them, there is more to life than just one set of approaches to what academic success looks like. And I think if we've learned anything from dot com onwards, it's that there is a future for the geeks of this world that can be beautiful and fulfilling, and there should be no apology for that. So it's absolutely right that there should be a wonderful curriculum opportunity for them.
Daniel Emmerson 10:39
At what point, then, and all admiration for that; as a fellow geek, you know that's music to my ears. Laura, at what point would you say that generative AI started to encroach upon your thinking and your practice when teaching computer science?
Laura Knight 11:01
Oh, interestingly, it wasn't via computer science that it arrived in my world. So I make it my business to stay up on the latest news from a technology perspective, and have done for a long time. Goodness me, I was deep in Twitter from about 2009, and that's been part of my intellectual and social landscape, and the networks I've built, for a really long time. So I was interested from the moment we started hearing about ChatGPT becoming available for public testing. Right, that was like, oh, this is interesting. There was a lot of hype building, and there were quite some interesting questions actually, right from that early stage, about who owns this. Is this computer science? Is it like social media, which sort of wafts about in the digital ether somewhere and we don't quite know who it belongs to? Is this an academic question or is it a commercial one? Is it like Bitcoin, something that nobody will ever really understand here, and we just know that it happens outside of our world? At that point, it was quite tricky to tell. And so from a computer science perspective at that early stage, we didn't have a sense, first of all, of the coding possibilities that would be available in the very near future. It was much more about, well, how can we make this useful? And that really makes me quite itchy as an educationalist, because my view is always that the technological tail should not wag the educational dog. And I find it very uncomfortable when there are lots of folks jumping on a bandwagon and saying, here's a really cool thing, let's now try and find some retrofitted use cases for it. Because quite often those don't come with any kind of pedagogical grounding or strategic motivation. It's just jazz hands evangelism. That's not got the depth and substance I want, frankly. Where's the rigor, friends? So I'm really interested in how we started to evolve the conversation. And it took six months to a year, right, to evolve the conversation out of those boring binaries: at one end, we've got to be doom and gloom, it's the end of the world and we're all going to just become thoughtless blobs; and at the other end of the spectrum, as I say, that kind of blindly enthusiastic, all-accepting evangelism, which again is not useful. We started to evolve the conversation to understand the nuances in the space in between, and that our job in education is to hold those difficult tensions, as you've heard me speak about, Daniel: it is both an opportunity and a threat, both a way to encourage and support academic success and a way to undermine it and cut it off at the knees. So we need to be able to see all of those nuances and to be able to navigate a path through them, not only for our own professional wellbeing and the integrity of the establishments and organizations and institutions that we work in, but also to model that path, because never has our pillar-of-the-community status mattered so much, actually. It's for us in education to lead the experience of the youngsters that we work with so that they can understand how to thrive in a world which is going to be, and already is, massively disrupted by this technology.
Daniel Emmerson 14:25
And at what point, then, did you decide, or did you start to think, that perhaps working with schools almost exclusively in this area and having, I guess, broader coverage in this space was going to be the right trajectory for you, as opposed to, for example, working in a school?
Laura Knight 14:43
So I think one of the things that I had noticed, which had been a trend for a little while, is that there is nothing quite like walking the walk to attract interesting questions. If you are visible in a space, for example an educational landscape where you speak at conferences and you network with people, and you say, actually, hey, we're doing some quite good work on this, you're not just talking about the possibilities, you're actually doing something good. I think I had been very fortunate and very blessed to be able to build a reputation for, let's say, having a bias for action and for delivering high quality outputs in terms of strategic decision making, like all the stuff we did on one-to-one devices. And you know, I didn't just say, hey, there are some issues with social media and online safety; I did a whole master's dissertation about it and then talked extensively about our program around digital wellbeing at Berkhamsted. This was very much part of the rigor of it; it was very much part of what I wanted to do. And so it kind of became quite organically normal for folks to just arrive in my inbox or to get in touch and say, hey, we're looking for some help. Could we come and visit? Could we have an hour on the phone with you? Could you help us with this? Could you come and talk to our staff? And that sort of organic snowballing happened. And I've never been shy about sharing, because I feel very passionate about the importance of community and networking in education, and how we are better together, fundamentally. The generosity of spirit that I have always had, even long before I started my own business, around sharing good practice for the betterment of children everywhere. God, that sounds pretentious, but you know what? It comes from a good place is what I'm trying to say. I felt even then that if I could do some good work and it could have bigger impact and help make things better more broadly, without being too much of an idealistic, naive sort of soul, that was a worthwhile thing to do. And also it brings me great intellectual joy to be able to come up with connections and solutions that perhaps other folks haven't. Now, I want to, in parenthesis, mention my neurodivergence at this point. There are lots of things in my life I find quite difficult as an autistic woman, a late-diagnosed autistic woman, too. But one of the things that I find easy is understanding the complexity of a problem and then being able to have deeper empathy for the people experiencing it and find solutions out of it. It's one of the gifts of my particular sort of autism. And it seems to be something that folks find valuable, and for that I am very grateful and always humble.
Daniel Emmerson 17:28
I'm thinking about your journey there, Laura, and how initially you were getting emails and people were contacting you specifically about better understanding these conversations and what's happening with the evolution of artificial intelligence tools and technology. How would you say those initial interactions compare to where we are today? Did you see a shift at any point in terms of the level of depth and understanding? What does that journey look like for you?
Laura Knight 18:04
So it's really interesting here, right? Yeah, it's more than that now. And I think that there has been a huge evolution in terms of the types of questions that are being asked, the sorts of practical applications that people are looking for, and a sense of growing understanding, actually. Because I think it's very understandable that in the first instance there was a combination of denial and fear around generative AI. There were a whole raft of people who were saying, oh, it's just some nonsense fad, it's like those sort of 3D curved TVs, people will have forgotten about them in six months' time. There was some of that, but actually there was also a lot of fear. And we saw, for example, what happened in Italy with the complete kind of blocking and banning. And this was also landing in, I suppose, a digital landscape which is fraught with quite a lot of ideological battles, I think it's fair to say. Just now we have the up-in-arms WhatsApp groups full of people trying to ban mobile phones altogether from children's lives, the better-late-than-never Online Safety Bill and lots of people arguing that it hadn't gone far enough, lots of conversations around when is this 18-and-over pornography legislation ever going to arrive. You know, there's been a whole series of conflicts around young people and digital life, the chasm between parents and schools that has been created around mobile phones, and the messy landscape in between, which all sorts of different organizations and charities and people have, very understandably, tried to fill, doing the best that they can do. But it's a really challenging space, because the minute that you start telling parents how to parent their children, you have all of the colours of the rainbow in terms of responses. There will be, at one end of the spectrum, folks who are saying, you're not doing enough, and at the other end of the spectrum, how dare you even consider doing so much? And so it's very difficult to navigate that in among all of the budget challenges, the staffing and retention issues, the busy curriculum, the pastoral care, the safeguarding, the busy life of school. And so it's a complex space that is evolving. People are frightened of things moving fast. It has already moved very fast. And so there's also a sense of fear around, well, when is the point at which I jump on this particular train that seems to be just accelerating, when I don't feel confident about getting to know a technology where the sands are shifting so much? Certainly the questions are evolving away from some of those old binaries that I mentioned and more into the nuance, which pleases me greatly, and also into applications and practicalities. There are now so many more useful conversations about how different segments of the school can benefit: how do we harness opportunities for streamlining, for effectiveness, for reducing duplication, for considering how we think about data. I've had some wonderful conversations about data that I never would have expected to have three years ago. I mean, how often do you talk about data lakes with schools? Before, that was not a thing, was it?
But now we're talking about how data is almost the currency of the school; it's the lifeblood of decision making and insight. And the more that we can use generative AI and other AI tools and adaptive learning processes and all of this to really benefit the members of the learning community, by reducing the bias and taking out some of the finger-in-the-wind sort of judgments that might have been made before, I think that's incredibly powerful. So, yes, we're seeing more of that. I'm seeing more application of specific use cases. So we want to talk about safeguarding and wellbeing; we want to talk about administration, marketing, admissions, communications. There are groups who are saying, what can we do from a senior leadership perspective to help optimize things like record keeping and appraisal processes and target setting? I think that's where we've arrived in the conversations I'm having now. And I'm delighted, because that for me is understanding that there are challenges, working within them, and then looking to build constructive solutions that actively solve a problem that the school has, rather than tail-wagging-the-dog problems.
Daniel Emmerson 22:48
Do you think there's a difference, then, between the conversations, or the level of depth in those conversations, for example in the UK between the fee-paying sector and the non-fee-paying sector? And are those differences in conversation, say from 18 months ago, happening exclusively in one setting over another, or is it a little more nuanced than that?
Laura Knight 23:12
I think it's a lot more nuanced than that, and I think it's naive actually to assume that any particular group of schools is ahead or behind here. To give you an example, in the last month, let's put it that way, I have spoken to colleagues teaching in a school that has 70% free school meals in one of the poorest areas of Birmingham, and in one of the top 10 most expensive schools in the country. And they're both at exactly the same point in their thinking, and asking exactly the same sort of excellent questions about the best uses of this technology for their school community. So I don't think it splits that way; I think this is all about the attitude of leadership, about engagement in the concept of innovation and embracing opportunity for positive change, and seeing a chance to use technology as a lever for school improvement rather than, you know, some sort of decorative, nice-to-have luxury thing. Now, you and I have had some great conversations before about capacity, and this I think is a really interesting thing. There are plenty of schools of all sorts of different types, I would say, whose agendas and priorities will not align with something that has technology in it. Right now, if attendance is your biggest worry or you've got major behavior issues around the physical environment in school, it's not surprising that AI will not be your first port of call as a solution. And I don't think that schools should feel bad about that. They need to attend to their context and the priorities within it. What I do hope, and I hope the work that we're doing with Good Future feeds into this, is that there will be an increasing agility and an increasing fluency from school leadership, the confidence to know that where there is a particular set of challenges that can have an effective response provided by generative AI, or indeed that whole umbrella of technologies, they know where to go to ask. I talk a lot about choosing the right tool for the job: in the same way that, you know, if you're building an IKEA wardrobe, you don't reach for a chainsaw, there will be plenty of times when the right tool isn't generative AI. If it's not the right thing, don't force it. But there will be times when it is the right tool, and what we want to do is give people the skill, the knowledge, the understanding and, most importantly, the judgment to know how to best meet the needs of the school community and the priorities at hand.
Daniel Emmerson 25:57
Thinking about those tools, though, right? We talked about different areas of school life where you might reach for generative AI as a possible solution, when it comes to time saving or productivity or looking at data. The number of tools, or quote unquote solutions, in this space is increasing substantially on an almost daily basis, by the look of things. As a teacher, particularly as a teacher or a school leader in an environment that is impossibly demanding, where do you look, or where do you begin to look, in finding what best practice might look like, how to experiment, perhaps, with one of these tools, but responsibly, thinking about data privacy, security and safeguarding all at the same time?
Laura Knight 26:52
It's a lot of things, isn't it? There is a lot to think about, right, because first of all we've got quality of output. Like, do we actually respect what the thing turns out, and are we happy with that? That's a very low bar. Then we need to add into the mix all of those questions around data and privacy and security, the issues that are key to us, particularly when we're thinking about working with a school. Never mind user interface and simplicity and all of those sorts of things. There are all sorts of different functionalities too. And increasingly we're also seeing tools that are embedded within other tools. So you'll have something that may have been familiar that now has AI kind of built into it in some way that may or may not be obvious or transparent to the user, or may just look like a jazzy new kind of twiddle on the logo with very little clarity about what the actual functionality is. And this is one of the challenges inherent in working with generative AI: transparency and explainability. It's really hard, especially when a lot of the language revolves around mysticism and magic, which I'm going to argue is unhelpful. It's unhelpful because it absolves the human of responsibility if you can blame it on the pixies and the fairies and the goblins, right? And it's very human to ascribe magical characteristics to things we don't understand. And honestly, it's also not very sexy to say I was using this really big statistical engine that's very good at making predictions. So we have to have a little bit of critical thought in our own minds as users when we're thinking about this from an educational perspective, as teachers especially: what is it that we're choosing, and how is it working, and how is it trained, and where's our data going? We need to ask big questions. And a lot of folks do not have the media literacy skills, the technology literacy skills, to do that effectively. So I think one of the useful things to do is to look to role models and peer networks and support groups, and particularly nonprofits, particularly charities, particularly those who have nothing to gain whatsoever other than love for the sector when they're sharing good practice. Because it's so important for us to find the good filters, the people who filter out some of that stuff and share the good stuff, who try them all out and give you the best one. That's wonderful. However, I would urge caution that abundance isn't always helpful. Abundance can actually be a real problem here. I talk about cornflakes when I talk to teachers about this. I say, you know, when you go to the big Tesco's and you want a box of cornflakes, you can stand in front of the cereal aisle and there are literally like 18 types of cornflakes, and that decision takes a long time. If you walk into Aldi or Lidl and there's one box, there is one type of cornflakes, your decision is very straightforward. Off you go, job done. It might not be the most glamorous, but that decision fatigue is reduced for you, and you know it's going to do the job. So sometimes we need a little more Aldi and a little less big Tesco's in our approach to these things. And we need to find people who will do that filtering for us and say, here's the workhorse tool that will do the job just fine. You don't need the high glamour, you know, £45-a-month thing. Just use this one. And sometimes I think that's a really helpful way in.
Daniel Emmerson 30:24
And how have you approached that? Because, full transparency, I have not read your book yet.
Laura Knight 30:28
Oh, that's all right, because nobody has. Nobody has. But would you like an advance copy? I can see if I can still get you one.
Daniel Emmerson 30:34
I would love an advance copy. Depending on when this podcast is released, it might be out in the world, ready for reading. Laura, could you just sort of tease a few bits and pieces from the book, on generative AI in the classroom?
Laura Knight 30:49
Yeah. So I was really honored when Sage, the publishers, got in touch with me and said, look, we feel really strongly that there's an opportunity to add to their Little Guide for Teachers series, which is designed to be accessible and short, a maximum of 12,000 words, and really affordable, as a form of CPD that everyone can reach, if you like: something that you can dip into and just read a chapter on its own, and it's got some real actionable inputs in it. So some opportunities for reflection, some experiments to try, some tips and tricks; really non-academic, not highbrow and, most importantly, not at all technical. This is not written for people with high technical confidence. This is written for everyone who's a classroom teacher, from, you know, early years and reception to sixth form colleges and beyond, who would love to have some insight into how they could use generative AI in their practice, and how they can evolve their thinking so they can make good decisions, exercise good judgment and understand how to enmesh this new technology into their existing practice and help it to make them "them, but better", not to undermine or replace them in a way that isn't helpful. So it's got lots of practical stuff; it's got some thinking about teaching and learning, some thinking about adaptive teaching and SEND, some conversations around things like creativity, which I think is really important, and also some thinking about the challenges, how we reflect on academic integrity, and how we think about assessment and task design, and some of those things too.
Daniel Emmerson 32:24
Wholeheartedly encourage folks to scope this out. If you're a teacher in the classroom, or anyone with an interest in AI and teaching and learning, this has got to be the place to go. Laura, I'm incredibly grateful for your time, as I'm sure our listeners are today. Thank you so much for sharing, for your insights and for your passion for this work. It's incredible to be able to speak with you. Thank you so much for being here.
Laura Knight 32:51
Thank you.
Voiceover 32:52
That's it for this episode. Don't forget, the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here and we'll see you next time.