Daniel Emmerson 00:02
Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a non-profit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a non-profit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI-infused world. This podcast series sets out to explore the trials and tribulations of building a non-profit from the ground up while also investigating the changing world of technology in life, learning and work. As the CEO of ASCD, the Association for Supervision and Curriculum Development, and ISTE, the International Society for Technology in Education, Richard Culatta is focused on creating the next generation of innovative learning leaders. Richard has also served as the Chief Innovation Officer of the State of Rhode Island and was appointed by President Barack Obama to lead the U.S. Department of Education's Office of Educational Technology. His book “Digital for Good: Raising Kids to Thrive in an Online World” aims to help create conditions for healthy tech use at home and at school. Richard, it is such a privilege to have you with us on Foundational Impact. Thank you for being here.
I think, first of all, for our listeners' benefit, it would be wonderful to get a bit of context from your side on your role at ISTE and the work that you do with ASCD, just to tell us a bit about those organisations and how your interest in AI might fit into that?
Richard Culatta 01:45
Sure. So I am very lucky and honored to lead ISTE and ASCD, two really influential education organisations that have come together to become one organisation, really focusing on helping to create an amazing learning experience for every student every day that they go to school. That's our mission. Our goal is to make sure learning feels awesome for every kid every time they go to school. And so that's the work that we're doing. We work in about 100 countries around the world in some way, shape or form, and so we're very aware that there are different opportunities and different challenges faced by educators in different parts of the world. But we can also learn a lot from each other as we share back and forth, and we try to facilitate that as much as we possibly can.
Daniel Emmerson 02:38
How much of that is dedicated to, for example, professional development for teachers, particularly when thinking about new technology as well as things like global citizenship?
Richard Culatta 02:49
Yeah, the vast majority of our work is really focused on helping teachers and school leaders learn. We feel like if we can help increase capacity and inspiration for educators and educational leaders, that's the greatest impact that we can have. So that's where we focus, and we do quite a bit of that around thoughtful use of technology, whether that is how to create healthy conditions for using technology in a school, or whether it's actually thinking about new technologies like AI themselves. Interestingly enough, certainly in the United States and maybe in other parts of the world, we're the largest provider of professional development around AI for educators. So we do a huge amount of training on how to think about AI. And the way we think about AI is quite different than, I think, a lot of other places. We are far less concerned with the functionality of different tools. It's fine, there's cool stuff you can do, but we are much more interested in thinking about what skills young people need in order to thrive in an AI world in the future. What are the skills that they need to learn that are different from what their predecessors needed to learn in a world that was not infused with AI? That's really where we spend most of our time when we talk to leaders and teachers about AI.
Daniel Emmerson 04:14
What do you think is the general makeup of those leaders or those educators who feel comfortable in this space already? What sort of school environments are they coming from? Or is it really, really diverse across the board?
Richard Culatta 04:28
Well, it is. It is diverse, depending, but I would group it into three buckets if we can. The first bucket is a group of educators that are still in the "maybe if I wait long enough, this will just go away" bucket, right? We're just going to avoid dealing with AI. Maybe it's a fad, right? A fad like other things that we've seen. And let's just say that this is not a fad. It is not going away. So that's one bucket that we have to deal with. The second is the group of educators who have realized that this is not a fad, who realize that this is going to happen. These are in schools where there's some level of support for it from school leaders, but AI is largely being used as a kind of glorified search engine. Instead of searching for something on Google or something else, we're just going to search on this fancy tool that's a little more conversational. Fill out forms a little faster, maybe create a rubric a little faster. And again, that's fine. It's a step into it. The third bucket are the leaders and teachers who are really thinking about how we fundamentally design schools differently for an AI world. And that's the really exciting part. That's where we start to say, what are things that we should be teaching now that we weren't teaching before? What are things that we should stop teaching, because we could probably just hand them off to AI now and use our big human brains for other tasks? So our goal is to try to move as much of the education community globally towards that third bucket, towards really thinking about the structural changes that we need to have in learning in an AI world.
Daniel Emmerson 06:19
There are so many shifts that need to happen, I suppose, in order to get to that bucket, though, right? It's not just a case of learning about new technology and learning how that technology works. It's a complete shift in mindset in a lot of cases. So I'm wondering, Richard, how much time do you spend thinking about the role of the teacher when it comes to understanding future possibilities for teaching and learning?
Richard Culatta 06:43
Yeah, it's a great question. I spend a lot of time thinking about this, as you correctly guessed. You know, there are these interesting balances. On one hand, I think we need to make sure educators have time to explore. In general, with all kinds of things, not just AI, we don't give educators enough time to explore, and that's really, really valuable.
Daniel Emmerson 07:08
Are you talking about sandboxing there, or are you talking about something else?
Richard Culatta 07:12
So some of it is just saying you have some time to try using some of these tools in ways that may or may not work, and that's okay, right? We are sort of always in production mode in education, and I think we need to have some space where it's okay to try something that may not work, to learn where the value is. So ISTE and ASCD put out a guide recently on how to think about bringing AI into schools. It's really for school leaders thinking about bringing AI into schools. And one of the things that we said is that you need to have some time as the adults in the school, right, the teachers, leaders, community, to try some things out and then be able to report back on what's working and what's not and explore it together. And that has to be modeled by the school leader. So a good school leader will say, hey, I'm going to block off some time. Maybe there was a two-hour faculty meeting you were going to have, and instead you say, we're going to use these two hours to dive deep into what are some of the things that AI can do well and what it can't, and then we're going to share some of that out. Just showing that there's some safe space to think about AI as more than just a search engine is really, really valuable. So that's one of the things that we do. It sounds very simple, but just having some dedicated time that is sanctioned, approved, allowed to be used for exploring what AI can and can't do, I think is really critical. Another thing, though, is that it's also important that there is some explicit training on what AI is. I do worry that a large part of the education community still doesn't really understand what AI is. When I'm presenting, I often say AI is not magic, right? Magic is mysterious and uncontrollable. AI is neither of those things. And, you know, Daniel, I'm a pilot, and so I like to fly planes. And there are some people who think planes fly by magic. And you know what? It is okay if you think a plane flies by magic. It defies gravity, after all. I can understand that. But I do not want to be designing a plane or flying a plane with somebody who thinks planes fly by magic. And in the same way, it's okay if some people think that AI is magic, but educators cannot think that. It is very dangerous if the people that we're expecting to teach the next generation think that this tool is somehow magic. We need to know what it is and what it isn't, and therefore where it makes sense to use it and where it doesn't. There are some areas where AI performs far better than humans. Because of the way it is designed, it will give better results. It is better at certain tasks than humans, and we need to know what those are. There are some areas where humans, I believe, should have the monopoly on certain skills, things that we particularly do better. But most of the time when I walk into a school and I say, have you thought about what the uniquely human skills are and what skills we should really be handing off to AI, most of the time I get a sort of glazed-over look, because we haven't taken the time to even have that conversation.
Daniel Emmerson 10:21
But taking that time, I suppose, is such a big factor for a lot of schools. I mean, as I'm sure you're aware, there are day-to-day issues around teacher retention or student behavior, or things like addiction and even self-harm and safeguarding matters, that take up the headspace and the time resource that teachers need in order to do a lot of that experimentation and work, in addition to everything else they have to do day to day in the classroom. How do you frame that in a way that senior leaders understand, right, that this is something I really need to make concrete time for?
Richard Culatta 11:00
Yeah. I mean, there are two ways to look at it. One is the sort of emotional plea, which is: whether you like it or not, this may be the most important skill that your kids have when they graduate, and so you've got to prioritise it. That's one way to do it. But the other way that's a little easier to make the case for is to say that, in this particular tool's case, it also has the ability to buy back some of your time. A lot of the types of activities that you're doing, summarising information, drafting emails, drafting rubrics, drafting assessments, lesson planning, a lot of those very time-consuming tasks, if you're using AI appropriately, can be reduced, therefore giving you some of the time back. And so I think it's both of those. It's, hey, like it or not, this is one of the most important skills you're going to have to deal with, you've got to prioritise it, and guess what, as the cherry on top, you can actually get some time back by doing it effectively.
Daniel Emmerson 12:01
Where do you sit on that emotional piece, that first piece that you mentioned there, that this is the skill they will need moving forward? What's your view on that, particularly when it comes to how rigid curriculum is, in the US of course, but also in the UK and elsewhere around the world, where it's all about the exams and it's all about the structure and the rigidity of how we've been teaching and learning for the last 100, 200 years?
Richard Culatta 12:31
Yeah, yeah. This is a really tricky question, and you're right. School systems, education systems anywhere, are known for their rigidity, particularly around curriculum. The UK, I think, is even farther down that spectrum than other parts of the world. It's just one of the realities. There are pros and cons of all education systems, but great flexibility is not something that the UK system is known for. That makes it challenging. And so we're going to have to address some of these issues if we want our young people to remain competitive in a global world. The other thing that is shifting with our global digital world is that where talent comes from has been democratized in some really interesting ways. Because of AI, and just because of the increases that we've had in infrastructure in general, thanks largely to Covid, where it has been prioritised, there is an ability to identify and hire talent in parts of the world that were never really on the talent pipeline. And so what that means is there's a lot of great opportunity for countries that in the past seemed like they were just out of the game. But what it also means is that countries that have maybe grown a little complacent, and again, the UK and the US fall into this bucket, that have sort of taken for granted that we will always have job talent pipelines, are now going to realise that if we don't start to innovate at the levels that other countries are innovating, we will find very quickly that opportunities that were once available to our students are no longer available. And so that, I hope, will bring some urgency to making some changes to the system.
Daniel Emmerson 14:29
When it comes to that student experience, I know ISTE in particular focuses a lot of energy on digital citizenship and thinking about responsible use of technology. How far along are you in thinking about where AI fits into that conversation?
Richard Culatta 14:45
Digital citizenship is hugely important for us. Personally, it's one of the things that I care most deeply about. I've recently written a book called “Digital for Good” that is focused entirely on how we help create healthy tech use for young people. I talked a minute ago about our rush to increase access during COVID, and I think that was great. There's some real value that we got out of that increased connectivity, but because we sort of rushed to do it, we didn't pause and take the time to really think about setting up the conditions in the way that we probably should have. There are movements in schools now, right around the world, to ban access to technology, to ban access to phones in schools. And what's interesting is that in almost all of those cases, when you look at schools that have what I would call digital dysfunctions, where the use of technology is distracting and problematic, what they have not done is take the time to create healthy norms for tech use. They haven't created healthy conditions for when it's appropriate and what types of tools are appropriate to use. They've just sort of let technology come in haphazardly, and then they go, oh my gosh, the only thing we can do to get it under control is just ban all technology. History will show that banning the technology that kids need to have to be successful in their future is a terrible idea. But look at schools that have taken the time and said, let's set norms. I used to be a teacher, and at the beginning of the school year, every year, we'd say, let's talk about behavior norms for our class. If I didn't set behavior norms, it would be chaos. And I couldn't then go, oh, these kids, they're just crazy, all we can do is ban them from coming into my classroom. No, it's that we need healthy norms. And if schools say, hey, let's talk about what's appropriate, let's talk about it in language that kids understand, involving kids in that process of deciding what healthy use of technology looks like, that's the foundation for creating healthy future digital citizens. If we aren't teaching those skills in school, frankly, I don't know where we're going to be teaching them. Yes, some of these skills need to be taught at home, but there are certainly skills about using technology to support your learning and to support your curiosity that must be taught in our schools. And that's my biggest concern. It's the thing that probably most keeps me awake at night right now: because we haven't set healthy tech use conditions, we feel like we're resorting to banning these tools that actually are critical for the future of our students.
Daniel Emmerson 17:25
I think that's definitely one side, the school's responsibility for the behavior, but there's also responsibility on the side of the people building this technology, right? When it comes to creating a tool or a solution that's going to be used in a K-12 setting, thinking about the guardrails that are in place, particularly when it comes to AI tools and solutions, what might you say to people who are building this technology, Richard, with a view to working with schools?
Richard Culatta 17:52
Yeah, 100% right on that. So just about two years ago, we created something that we call the ISTE Seal. The ISTE Seal is a review that products can come through, and we do a deep review into whether we think this is an app that is safe and appropriate to put in front of young people. And it's not just what we think, it's actually based on years of research. We have a deep research base that's involved with this. And part of that is because we've been far too haphazard with what we're willing to put in front of kids, right? We're like, oh, we've got this cool app that looks really shiny. Let's buy it and put it in front of our kids. And while we're criticizing education systems, the US has been particularly notorious for doing that. Oh, this seems like a really cool, flashy app. Let's buy it, right? And there hasn't been enough review of whether it makes sense. And so we are trying to shift that. We have a couple of tools that we provide. One is, again, the ISTE Seal that I mentioned. We also have a free website called edtechindex.org, which lists all of the EdTech tools that we are aware of that schools are using. It's changing and updating every day, but we try to keep on top of that, and we share what we know about them. And so if a tool has demonstrated, and they have to demonstrate it, they can't just say they've done it, that they are meeting accessibility requirements, we will flag that. If they have demonstrated that they are based on solid, research-based pedagogy, we will flag that. And so our hope is that it can make it a little easier for schools, because it's a huge burden on schools to have to do all these reviews. They often don't have the expertise to know if an app is really accessible or if the data is private. Data privacy is a big one. And so what we're trying to do is say we will review them, and we do this with partners. This is not just ASCD and ISTE. We have a bunch of partner organizations whose methodology for reviewing apps we trust, and if they can show that an app has performed at a certain level, then we will flag that and hopefully make it a little easier for schools to choose apps that really deserve to be put in front of kids.
Daniel Emmerson 19:59
What about when it comes to the world of work, and to ASCD and ISTE themselves? Are you using AI tools? Do you promote the use of AI tools? What does that look like on a professional level for you?
Richard Culatta 20:10
Yeah, absolutely. We look at this in two ways. Obviously, one part of the world of work is the work of future educators. And so we're doing a lot of work with pre-service teachers, helping them, as individuals who have chosen to go into education, think about what AI and other technology look like. We have a coalition of over 100 of the top educator preparation programs that we're working with to try to redefine how we prepare future educators. But then we also have the broader workforce, the people that are going into a whole variety of other fields, and what does this look like for them? We have an initiative that we call SkillRise, where we're asking, what are some of the most critical technology skills? And AI is certainly part of it, but it's broader than AI. It's how do we think about using technology in ways that help us be more efficient and effective in our work? How are we helping people who may have been in roles that are now shifting dramatically? Think about it: there were roles or professions that for years, for generations, were sort of off limits to technology. Take the creative arts. My wife is a violinist, right? If you were a violinist, if you went into music or music composition or music performance, you didn't really need to have much background in effective technology use. Today, if you're going to be a musician, you absolutely have to know how to use technology to help compose and score, to help record, to collaborate virtually, similar to how we're recording this, right? And so skills like basic computer science, skills like using AI, skills like knowing how to be creative and analytical in a digital world, those are skills that all careers need. And so that's really what we're pushing on. Before taking this role at ISTE and ASCD, I was the Chief Innovation Officer for the state of Rhode Island in the United States. And we decided that one of our key initiatives was that we were going to be the first state to teach computer science in every school. And we did that, by the way. Part of the way we did it is we went out and we talked to parents, and we talked to teachers, and we talked to communities, and we said, in order to thrive in this future world, no matter what profession your kid goes into, CS is the language of future problem solving. That was sort of pre-AI; I would now say CS and AI are the language of future problem solving. In any career, if you want to be able to solve a problem, you have to have the skills of using that technology. Think of any tough problem that we're dealing with, any problem in our community, in our world. At the root of it, the solution to any of those problems is going to involve smart use of technology. All of them. And so if we aren't teaching ourselves, either as young people or as professionals out in the field, how to use technologies in these ways, we are losing our ability to be problem solvers. And that's why it's so critical that as a workforce, and a future workforce, as I like to think about it, we think about using technology for problem solving. We will be in a situation where we will have to outsource our problem solving to others if we aren't able to stay on top of how to use technology.
Daniel Emmerson 23:35
What might your advice be, Richard, as a final question, for a teacher who understands that and grasps the implications of it, but finds themselves at a school or in an environment where they aren't supported in making the changes they feel need to happen?
Richard Culatta 23:53
Yeah. And that is unfortunately the case, right? And that is largely why we exist. ISTE and ASCD exist as a nonprofit organisation to help be that community for educators if they aren't finding it at their local school, and even when they are. So we have a couple of opportunities. Obviously, we have a great membership program, and educators can join and be part of a deep community. We have a whole variety of certification programs that happen in cohorts, so you can engage with other educators. But I want to flag for people that might be listening to this, we have a tool, our online community, that we call Connect, ISTE and ASCD Connect. You can find it online, you can download an app for it, and it is completely free. It is part of how we give back to the education world. And so it's a great ongoing conversation with really thoughtful educators from around the world about how they're using technology and how they're thinking about the future of learning. So please join in and be part of that conversation. If you are part of an education community where you're doing really cutting-edge stuff, where you feel like you're really being supported, awesome. Please join, please share what you're doing. If you're in an area where you feel like you don't have that support, please join and get the support. We want to support you. We want you to feel like you're part of a broader community. So just go check out Connect. If you search ISTE or ASCD Connect, you'll be able to find it, and come be part of the conversation.
Daniel Emmerson 25:23
Fantastic, Richard, some awesome advice there. Super grateful to have you with us today on Foundational Impact. Really appreciate you being here, and I look forward to seeing what happens next.
Richard Culatta 25:33
Daniel, thank you so much for having me here. It was such a pleasure to talk with you and I really appreciate the great work you're doing.
Voiceover 25:39
That's it for this episode. Don't forget, the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here and we'll see you next time.