The Staggering Promise of AI Tutors

Chatbots can help students understand concepts when their teachers can't.

It’s election day, but I’ve already voted, and already set out the case for choosing liberalism. So, to avoid political doomscrolling, today’s piece is about something other than politics—and something that is, I hope, cause for optimism. Namely, how emerging technologies can dramatically improve the lives of underprivileged kids.

This is a story about artificial intelligence, and specifically large language models (LLMs), the technology behind ChatGPT, Claude, Gemini, and so on. Lots of people are very down on LLMs, and they’re typically down on them for one or both of two reasons:

  1. LLMs, regardless of their usefulness, are bad because they are trained on stolen data, take remunerative work away from people who need it, destroy creativity, or raise other moral concerns.

  2. LLMs, regardless of whether they are bad, are useless. They solve no problems, have no utility, and never will.

I’m unpersuaded by (1), but it’s not my focus here. Instead, I want to tackle (2) by talking about how, last week, I watched what I think will be one of the most consequential uses of this technology—and one of the most extraordinarily positive.

My son is in sixth grade. His science class is studying light and specifically reflection and refraction. This is relatively complex stuff, bound up in a bunch of unfamiliar terminology. It’s exactly the kind of topic a lot of kids his age need help understanding.

He and I had played around before with Google Gemini’s natural conversation mode. There are similar features in other chatbots, and the basic idea is that you interact with it not through text but through speech. Further, the bot is tuned to speak naturally, handle interruptions well, and respond in a particularly convincing voice. It really does feel, in the moment, like talking to another person on speakerphone.

My son asked if he could use Gemini to help him with his homework, which involved answering a bunch of questions about reflection and refraction. I was happy to give it a go. So I opened the Gemini app on my phone, put it in conversation mode, and set the phone on the table in front of him. The first question contained a term he couldn’t quite remember the meaning of, so he asked about it. Gemini kicked back an answer, but it was a bit more detailed than he was looking for, pitched closer to an undergraduate level. So he said, “Can you tell me that again, but appropriate for a sixth grader?” And Gemini adjusted, including coming up with a clear and helpful analogy involving tennis balls.

As the exchanges went on, it quickly became clear that he wasn’t using Gemini just to give him the answers. Besides not helping him learn, that would just be cheating on the homework. Instead, this felt like a tutoring session, like sitting down with a teacher and working through the concepts in the homework until he understood them well enough to answer the questions himself. And because Gemini retains the whole conversation, including the kinds of questions he asked and where his knowledge gaps seemed to be, it was able to give him answers keyed to filling those gaps, all at a level that worked for him. That tennis ball analogy appeared again and again, too, updated for each new concept, building a metaphorical framework for understanding.
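If you’re curious what a session like that looks like under the hood, here’s a minimal sketch of the same kind of exchange in text form, using Google’s google-generativeai Python library. The model name, the system instruction, and the student’s questions are placeholders I’ve made up; the key point is that a chat session carries its full history, which is what lets the model remember earlier questions and keep adjusting its level.

```python
import google.generativeai as genai

# Assumes an API key from Google AI Studio; the key and the model name
# below are placeholders, not a recommendation of a specific model.
genai.configure(api_key="YOUR_API_KEY")

model = genai.GenerativeModel(
    "gemini-1.5-flash",
    system_instruction=(
        "You are a patient tutor for a sixth grader studying light, "
        "reflection, and refraction. Guide the student toward the answer "
        "rather than giving it outright, and keep explanations at a "
        "middle-school level."
    ),
)

# start_chat() keeps the full back-and-forth in memory, so later answers
# can build on earlier questions and the gaps they revealed.
chat = model.start_chat()

print(chat.send_message("What does 'angle of incidence' mean?").text)

# The student can ask for the same idea pitched at a different level.
print(chat.send_message(
    "Can you tell me that again, but appropriate for a sixth grader?"
).text)
```

The voice conversation mode in the Gemini app adds speech on top of this, but the tutoring behavior (a framing prompt plus a running conversation history) is the same basic idea.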

It was, frankly, astonishing. And the whole time, I was thinking, “This is going to change kids’ lives.”

Here was an on-demand, always-available, and free (or quite inexpensive) tutor with wide knowledge of the kinds of topics covered in middle school, infinite patience, and high adaptability. It was a fun and helpful game for my son to play, but imagine what this can do for kids who don’t have the resources he can draw on. Lots of kids need this sort of back-and-forth instruction but can’t get it, because their teachers don’t have the time in the classroom, the child can’t stay after school, the parents work too much to provide it themselves, the child is embarrassed to seek help, the teacher just isn’t very good, and so on. For kids like that—for kids who could succeed in school, and get the life benefits such success brings, but who can’t access the help they need to achieve it—being able to talk to Gemini could very well be life-changing.

Still, we might have some reasons for skepticism. So here are what I see as the most likely objections to wider use of AI tutors.

“Isn’t it your job as a parent to help?” Yes. And in this case, I have the luxury of both the time to help and sufficient knowledge to do so. But I’m not every parent, and not every parent shares those luxuries. So to say, “This is the parents’ job” means, for a lot of kids, “You don’t get help.” At the very least, AI tutors are better than what many kids unfortunately have as the alternative at home: nothing, or close to it.

“AI makes things up. Won’t students risk getting taught incorrect information?” There are two responses to this. First, teachers make mistakes, too. My kids routinely tell me things they’ve learned from their teachers that are, in fact, quite inaccurate. So the question isn’t “Are AI tutors perfectly accurate at all times?” but “Are they at least as accurate as the alternative any given child has access to?” That leads into the second response: yes, they’re quite accurate. LLMs are trained on vast quantities of text, and because they work by prediction, they are more likely to be accurate on topics that are extensively documented and that don’t require delving into weird edge cases. This means I wouldn’t recommend a graduate student rely on an AI tutor to learn about obscure interpretations of Hegel. But a middle schooler learning the basics of refraction is on pretty solid ground.

“Isn’t this just yet another example of computers taking away jobs?” I can imagine that, on the margins, this might cut into the business of human tutors. But most kids’ parents never pay for a tutor, either because of the cost or because of the time commitment, which means human tutors weren’t getting their business anyway. So, again, we’re not in a situation where the alternatives any given child faces are “use an AI tutor” or “use a human tutor.” They’re instead “use an AI tutor” or “have no tutor.”

“A computer can’t know the student the way a good teacher can, so how can it meet the student’s specific learning needs?” Again, consider the alternatives. First, as I discussed above, it’s really quite remarkable how good the AI is at zeroing in on gaps in understanding and adapting its answers to the student. In fact, one of the reasons LLM chatbots are in many cases more persuasive than humans at talking people out of conspiracy theories is that the technology so quickly and precisely adapts the bot’s answers and questions to the particular person it’s “speaking” to. Second, go back to the question of “Compared to what?” A thoughtful teacher with time to tutor can do wonders. But many kids don’t have access to that. And it’s very clear AI tutors are much better than nothing.

I find this technology—and this clear use for it—so exciting because it promises to give children a new way to access one of the most valuable resources we can give them: an education. The fact is, plenty of kids around the world are in situations where learning is challenging because they can’t access the kind of help they need to succeed. But this technology, and the ubiquity of cell phones, means many of them now can. And the technology is only going to get better. Imagine the possibilities.

If you enjoyed this post, sign up to get all my new posts sent to your email. You can also add my RSS feed to your favorite feed reader, or follow me on Threads and Bluesky.
