I haven’t written a lot about AI here at Eduwonk. And where I have, it’s been more questions than answers. That’s where we are on the tech right now. And I haven’t done that thing where I have AI write the post and that’s the big reveal. AI did not write this. My colleagues Amy Chen Kulesa and Alex Spurrier recently wrote a great piece – again, actually wrote it – about AI and schools for Fordham.

AI will impact the education sector, as it will most walks of life, and it will over-promise and under-deliver. That’s always a safe bet. Yet the velocity around AI is intense right now. Here are a few things to pay attention to:

1) I’m old school around teaching and learning, and so is the human mind. At some level the core technology of teaching and learning has not changed that much since Plato sat with Socrates, no matter how much we might wish otherwise or tech enthusiasts might try to convince us. Deeper learning, 21st Century Skills, etc., etc. – the teacher-learner relationship, content, and knowledge are what matter. AI is already setting off a new round of people putting “mere” in front of “facts” (which is truly astounding given the times we live in). Don’t fall for it.

As with previous versions of ed tech, the most transformative applications may be around support and tools for teachers, students, and families, and embedding AI in existing tools. PowerSchool just rolled out its AI-embedded product. Look for more of that rather than blue-sky ideas.

Yet when you made this point about earlier ed tech – that the biggest impacts might not be instructional applications – it upset the enthusiasts. It seems like the same thing is happening again. Lesson planning, data, system and predictive analytics, coaching, tutoring – those seem like more promising applications than letting an AI app just teach the kids. And of course the basic equity question remains: the affluent will make sure their kids get the richest instruction. The job of policymakers and education leaders is to ensure everyone has access to that kind of instruction.

I get asked a lot: will AI result in fewer teachers? Yeah, it seems like it might result in fewer adults overall, including in some instructional roles. You can certainly see some productivity enhancements. But teachers aren’t going anywhere, for good instructional reasons and because they’re powerful in the political process.

2) Similar to the point above, technology can enhance teaching and learning, and the personalized applications of AI are exciting. It can also make teachers’ jobs easier. But AI doesn’t fundamentally change how we learn. Shoutout to Ben Riley, who is trying to put a dent in this problem and increase understanding on the consumer side, but the bottom line is that you should listen as much to Dan Willingham as you do to Sal Khan about what’s desirable here and how it fits with what we know about learning.

3) Yes, of course there is a gold rush. And that’s not all bad. Innovation costs money, and at the end of the day a lot of these solutions will, of course, come from the private sector. But it’s not as much of a gold rush as it might seem. During the last big ed tech bubble it was astounding what was getting funded. It doesn’t seem that frothy now. Investors say they are hearing a lot of pitches, but they are moving deliberately. Rather than complaining about bubbles – again, innovation costs money – complain about schools not being savvy consumers. That’s the real problem. A few years ago Curriculum Associates CEO Rob Waldron literally made a video for districts saying don’t do these things. People still do them. AI can create a nice image of leading a horse to water, but…

Also pay attention to the role of free and open products. In general this part of the sector underwhelms, but this technology could be different given its nature as a platform for enhancements. And keep an eye on policymakers: speaking of gold rushes, there is one on for jurisdiction. The White House is taking a more aggressive stance via its executive orders, led by staff who think policy was behind the ball on social media and other tech, and the Trump Administration had a reasonably well-regarded AI policy. The Hill and regulatory bodies are all trying to carve out influence.

4) Don’t confuse policy and regulatory fights with land grabs. For years you had big companies battling for market share in schools but doing it via various claims about privacy and data. A lot of folks were happy to join in, perhaps not even realizing they were picking sides in a larger fight for market share. You’re starting to see the same thing with AI as various vendors, especially large incumbents, try to figure out how to use regulatory power or policy to fence off competition. Parse the claims beneath the label and figure out what’s really at stake.

5) Bias is a problem with AI: address it, but don’t over-index on it. AI learns from data, and if those data have problems, they will show up in the AI. And some of the data do have issues – many things are keyed to medians that exclude a lot of users. This shows up around gender a lot but is also an issue with race. One need not be a woke scold to realize that there is bias all around us. As with the regulatory point above, some of these issues will be used in bad-faith and weaponized ways to score points rather than solve problems. AI adherents must be attentive to addressing bias – and the good news is there are increasingly tools and practices for doing so – but it can’t be a conversation stopper, and the education sector has not been particularly sophisticated when it comes to thinking about bias.

6) Prepare to be surprised. This is a novel technology that even its boosters don’t entirely understand. There are some broad contours, sure, but anyone telling you they know exactly how this will or won’t go is selling you something. It’s still dancing baby time in terms of where this technology might go. Pay attention, consume information broadly, and enjoy the ride.

If you want to get Eduwonk.com in your inbox when it’s published you can sign up for free here.
