Spencer, John. “Using AI to Spark Deeper Learning.” Spencer Education, 29 Oct. 2025, spencereducation.com/ai-deeper-learning-update/.
AI represents both a dream and a warning for education. On one hand, it offers personalized feedback, instant support, and creative possibilities that can make learning more dynamic than ever. On the other, it tempts us toward speed over depth, efficiency over understanding, and convenience over curiosity. The real opportunity lies in the space between utopia and dystopia, where teachers and students use AI not to replace human thinking but to extend it. When we design with intention, AI becomes a catalyst for deeper learning rather than a shortcut around it.
If you enjoy this blog but you’d like to listen to it on the go, just click on the audio below or subscribe via iTunes/Apple Podcasts (ideal for iOS users) or Google Play and Stitcher (ideal for Android users).
One of my favorite social studies writing prompts requires two separate days to complete. On the first day, students design their own utopia. They begin with the question, “What is the ideal society? What values should it be built upon?” From there, they dream up their ideal society. Students complete a concept map connecting every aspect of society, from transportation to education to energy to entertainment to city planning. Next, they describe a day in the life of this dream society, using the five senses to create vivid imagery.
The next day, they begin the second half of the writing prompt. This utopia has become a dystopia and the dream is now a nightmare. What happened? What is it like to live in this society? What were the hidden weaknesses of the systems, structures, and even values embedded in that initial idea? What happens when we take a good idea and move it to an illogical extreme?
While I designed this as a social studies thought experiment, I actually think we can apply the same process to our dream scenarios in education. In many ways, AI represents a kind of educational utopia: instant feedback, personalized instruction, limitless access to information. But just like those student-created worlds, our idealized version of AI has cracks beneath the surface that we need to explore.
At the same time, we tend to have a glut of dystopian stories about the dangers of AI. From The Terminator to The Matrix to Ex Machina, our culture treats artificial intelligence as an inevitable descent into a dystopian future. Machines rise up, humans lose control, and the technology we created to make life easier ends up erasing what makes us human.
In this vision, schools become untouched sanctuaries, sealed off from algorithms and automation. Yet even that version can slip toward its own dystopia: one where students are unprepared for the world they actually live in, where fear replaces curiosity, and where learning becomes nostalgic rather than transformative.
I mention this because I recently led an all-day AI workshop with secondary teachers. They actually spent twenty minutes in teams creating their dream scenario (I didn’t call it a utopia) with AI based on their previous beliefs about rejecting or embracing AI. Afterward, they analyzed what might cause their dream to become a nightmare.
Together, we discussed how the real opportunity isn’t about creating a perfect system powered by AI. It’s about asking deeper questions: How might AI make learning more human? How might it help students think more deeply, not just work more efficiently? In other words, we were able to discuss, through the lens of story, how we might avoid the two extremes of Lock It and Block It and Techno-Futurism.
The question isn’t whether AI will shape learning, but how deeply it will shape thinking. Between the dream of effortless perfection and the fear of total collapse lies a quieter, more human possibility. It’s the chance, albeit messy, to use AI as a tool for deeper learning. The challenge for educators is to design learning experiences where AI doesn’t replace curiosity or struggle but helps students go further into both.
We live in an era of rapid change. Technology is moving at a breakneck speed and our lives feel more crowded than ever. We can access rapid-fire video on multiple apps and stream on-demand shows on multiple platforms. It’s no surprise, then, that the number one thing I hear when I ask someone how they are doing is the short answer, “busy.”
This hurried pace is a broader cultural trend, but teachers are feeling it more acutely than many other professions. Every shift in technology, every new expectation, and every “time-saving” innovation seems to add another layer of urgency instead of relief. The role of the teacher continues to expand beyond teaching and learning into zones once owned by specialists (just look at the pushback I got in this Facebook post for suggesting that secondary content experts might not need to be experts in early literacy), counselors, curriculum designers, etc. We keep piling more onto teachers’ plates. New standards for the already packed curriculum map. New unfunded policies and initiatives. Even the most dedicated educators feel stretched to the edge of what’s possible.
So, given this reality, many teachers have looked to AI as a time-saving device. That rubric that once took an hour to make? I can revise an AI-generated rubric in five minutes. That newsletter that used to take two hours to write? Let’s use AI to craft a new one that can then be differentiated for parental preferences (a short, “just the facts” version, a typical version, and a detailed version). That station rotation activity? Let’s design it complete with handouts, instructions, and leveled readers in half an hour.
Students are feeling this same pressure to move faster. Many of them spend hours completing homework while participating in regimented after school activities. Suddenly, AI arrives as a way to save time and take shortcuts. On the surface, that makes sense. AI tools offer instant feedback, personalized support, and quick answers that can genuinely help students learn more efficiently. And when the curriculum map moves quickly, this instant feedback and quick tutoring can make a huge difference.
But there is also a cost. When students rely on AI to do the hard thinking for them, they miss the slow struggle that builds understanding. Over time, that convenience can lead to cognitive atrophy, where curiosity fades and learning becomes something to complete rather than explore. In its worst iterations, students end up using AI to do all of the thinking and the work for them. This cognitive off-loading can erode thinking over time.
However, we can also take a different approach that focuses on deeper learning. Here, students engage in deeper problem-solving that leads to mastery. They focus on meaning rather than speed and output. They engage in deeper curiosity and critical thinking. In other words, they use AI in a way that gives them the depth advantage in life.
So as we think about integrating AI into our schools at the student level, we might want to start with the question, “How does this promote deeper learning?”
I am sitting here staring at the cursor blinking back at me in silent judgment at my slow, foggy brain and my clunky human hands. It is 2:56 pm. In three hours, I’ll be teaching my assessment course. But for now, I am sluggish. One part circadian rhythm. Another part being a desert boy in a cold and gray climate. But mostly? I stayed up until midnight watching an exciting baseball game.
You know who doesn’t get tired? Generative AI. It might gobble up electricity and use a fair share of water (though this is actually a really complicated environmental issue that goes beyond the binaries of good and bad). So, right now, I am tempted to pass the baton from my hands to a complex algorithm built on predictive analytics.
But I don’t.
Part of it is my love of writing. I love the process even when I don’t enjoy it. I find value in the process even when it moves slowly. I define myself as a writer and I am convinced that part of being a writer is continuing when you are tired. It’s slowing down and being deliberate with words. It’s the process of learning through writing rather than just writing after you’ve learned. I don’t want to lose any of that even if it saves time and energy.
On some level, though, we all need to negotiate what we gain and what we lose by using AI. For some videographers, AI-generated jump cuts take away the slow, methodical eye that film production requires. To others, cutting is an unpleasant task they can happily outsource to a machine. For some, AI-generated spreadsheet formulas allow them to analyze data faster. For spreadsheet nerds (the self-proclaimed “freaks in the sheets”), that takes the joy out of making spreadsheets.
As a classroom teacher, you might love to make slideshows, design leveled readers, or write a newsletter. But you might just want to spend more time in small groups or engaging in professional learning. You might hate data analysis and love the idea of an AI tool that helps you get real-time data to pull students for intervention. Or that might just be something you love to do in order to get a sense of how each student is doing.
In other words, it is idiosyncratic. We all decide when and how we want to use AI tools. But we do need to recognize that we lose something human every time we use AI.
This is why it helps to take what Ethan Mollick calls the “cyborg” approach, where we treat AI as a form of co-agency. We use the AI but we modify what it creates based on our own voice, contextual understanding, and knowledge of our students.
I use the metaphor of ice cream. AI creates the vanilla but we add our own unique flavor.
Co-agency means we are not outsourcing our thinking but partnering with it. Mollick describes two main approaches to this relationship. The centaur approach treats AI like a teammate. The human and the machine divide tasks based on their strengths, alternating between intuition and computation.
The cyborg approach goes a step further. It is fully integrated, where human and machine thinking blend in real time. In this model, AI does not replace creativity or judgment. Instead, it extends it. We remain the authors, the composers, the teachers, but with an extra layer of insight or speed or perspective. The challenge is to stay grounded in our humanity while still exploring what this partnership can make possible.
Deeper learning with AI requires intentional friction in this co-agency process. It involves doing unpleasant but important tasks. It focuses less on efficiency and more on extended learning. It means designing moments where students wrestle with ideas instead of rushing to answers. When AI can instantly summarize, explain, or generate, the teacher’s role shifts from delivering information to creating conditions for curiosity and productive struggle.
Students might use AI to test their understanding, to compare interpretations, or to revise their reasoning, but they still have to think, question, and decide. In this way, AI becomes less of a shortcut and more of a mirror that reflects how well students can apply, adapt, and extend what they know. But this requires us to view AI as a thinking partner rather than an efficiency machine.
I really struggle with the idea of AI as a thinking partner because machines don’t think. They generate text. They aren’t intelligent. They simply generate responses using predictive analytics. True, an LLM loosely mimics the human brain, but it is not sentient, intelligent, or affective. AI cannot understand anything. It can generate questions but it cannot become curious. It doesn’t wonder. So, in a sense, AI is a tool and not a thinking partner.
Yet there is still value in using the metaphor of a thinking partner, because it captures how AI can extend and challenge human reasoning even without genuine understanding. When we treat AI as a thinking partner, we are really using it as a reflective surface that pushes our thinking forward. The goal is not to share cognition with a machine but to use its responses as prompts for deeper human thought.
When students use AI as a thought partner, they can improve their metacognition. They can determine what they know, what they don’t know, and what they need to do next. Along the way, they can move through this metacognition cycle.
Moreover, AI can actually strengthen student agency when students use it in a slower, more intentional way. They can ask better questions, explore new ideas, and take more ownership of what and how they learn. Here they move beyond compliance and engagement and toward deeper empowerment.
A student might use AI to clarify a confusing concept, plan a project timeline, or get feedback before revising their work.
The teacher’s role is to guide those choices, not to script them.
When students learn to use AI as a tool for curiosity and reflection, they begin to see themselves as active creators rather than passive receivers of knowledge.
In other words, students can use AI as a partner in curiosity or as a coach for reflection.
So, let’s explore a few practical ways students might use AI for deeper learning tasks.
AI can become a powerful tool for inquiry when students use it to spark curiosity instead of just finding quick answers. It often starts by asking questions and follow-up questions. Students can move in a slower, more methodical way by using the FACTS Cycle for prompt engineering.
Here, students might ask questions that help clarify a misconception they have about a mathematical concept. They could interview a fictional historical figure, like a woman who lived in ancient Egypt. Students might even use an AI chatbot as the starting place for a Wonder Day project.
But that’s only one element of inquiry. If students want to develop deeper curiosity with AI, they can ask a chatbot to challenge their assumptions, offer counterarguments, or reveal perspectives they might not have considered. Here, it becomes a way to explore ideas more deeply.
At the same time, students also need to spend time analyzing the answers that generative AI produces. While LLMs continue to improve, they still hallucinate, producing plausible-sounding but inaccurate information based on faulty prediction. Information will continue to be biased as well. This is part of why students need to explore the nature of AI rather than just focusing on how to use it.
When students use AI to wonder rather than to finish, it turns learning into an open conversation instead of a checklist.
One of my favorite uses of AI is as a study aid. Instead of having students re-read or underline the text (strategies that do little to improve metacognition or long-term retention), students focus on having the AI test them on the material. Here’s how the process works. Note that it still incorporates a human element built on personal agency.
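As a rough illustration, here is a minimal sketch of what that self-quizzing setup might look like as a reusable prompt. The function name, wording, and default question count are my own hypothetical choices, not a required template:

```python
def build_quiz_prompt(topic: str, notes: str, num_questions: int = 5) -> str:
    """Build a retrieval-practice prompt: instead of summarizing, the AI
    quizzes the student one question at a time and withholds the answers."""
    return (
        f"I am studying {topic}. Using only my notes below, quiz me with "
        f"{num_questions} questions, one at a time. Wait for my answer "
        "before revealing the correct one, and tell me what I got right, "
        "what I got wrong, and what I should review next.\n\n"
        f"My notes:\n{notes}"
    )

# A student pastes the result into any chatbot and answers from memory first.
prompt = build_quiz_prompt("photosynthesis",
                           "Plants convert light energy into glucose.")
```

The key design choice is that the student, not the AI, decides what to study and must retrieve the answer before seeing it — which is where the learning happens.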
In the future, students will need to solve what are called “wicked problems.” These are the types of problems where the solutions are imperfect and often lead to new problems. These problems require human wisdom and deeper contextual understanding. They often involve empathy as well. In other words, they are the kinds of problems that require human skills in the age of smart machines.
Here’s where AI can help with the problem-solving process by running simulations that allow students to test ideas. In science, students can explore what happens when they adjust one variable in an ecosystem or change the rate of carbon emissions. They get to see the ripple effects and talk about why it happened, which builds real understanding instead of memorized facts.
In social studies, AI simulations can drop students into complex situations like peace negotiations or debates over economic policy. They can see how one decision affects another and how competing interests collide. This kind of work helps them think like historians and civic leaders rather than test-takers.
In math and engineering, students can use AI to design bridges, rockets, or roller coasters and test how small design changes affect outcomes. They learn through trial and error, reflection, and revision. Across subjects, AI simulations make problem solving more active and give students a safe place to explore, fail, and try again.
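As a rough sketch of the idea, here is a toy predator-prey model in the spirit of these simulations. The parameter names and values are illustrative assumptions, not a calibrated ecosystem model; the point is that changing one variable ripples through the whole system:

```python
def simulate_ecosystem(steps=100, prey=40.0, predators=9.0,
                       prey_growth=0.1, predation=0.01,
                       predator_gain=0.005, predator_death=0.1):
    """Toy discrete predator-prey model (Lotka-Volterra style).
    Returns the (prey, predators) population at each step."""
    history = []
    for _ in range(steps):
        history.append((prey, predators))
        eaten = predation * prey * predators
        prey = max(prey + prey_growth * prey - eaten, 0.0)
        predators = max(predators + predator_gain * prey * predators
                        - predator_death * predators, 0.0)
    return history

# Change one variable -- remove the predators -- and compare the ripple effects.
with_predators = simulate_ecosystem()
without_predators = simulate_ecosystem(predators=0.0)
```

A student might plot both runs side by side and then argue about why removing one species changes everything downstream — the conversation, not the code, is where the understanding builds.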
We are still in the early stages of these deeper simulations. However, this is one of the rare moments where we use AI in a way that moves past integration and into true transformation. See the continuum below.
As mentioned earlier, the student use of AI should actually improve rather than reduce metacognition. One way to do this is to use AI as a metacognitive coach. Students start by sharing an initial goal and then ask AI for feedback on that goal using a SMART Goal approach. Generative AI tends to do really well in estimating how realistic a goal is, especially in terms of time frame. As humans, we get pulled into the planning fallacy, where we overestimate how much we will accomplish in a given time, but generative AI can draw on larger predictive trends to make more sober time estimates. Similarly, students can use AI to help break down goals into smaller tasks and action plans. Then, over time, the AI takes on the role of a coach, asking questions that lead to deeper reflection.
Another option might be for students to ask for specific feedback. If we think about our receptivity to feedback, we tend to react emotionally to feedback based on whether we expected it, whether the feedback can lead to extrinsic punishment or rewards (impacting our grades, for example), or whether we asked for it.
When feedback is unexpected and unsolicited, most people get angry.
When it is expected but unsolicited, we get anxious.
When it is expected and solicited, we tend to embrace it more enthusiastically.
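That pattern can be summarized in a tiny lookup table. This is just a toy encoding of the three cases above; the labels are my own shorthand, not a formal taxonomy:

```python
# Toy encoding of the feedback-receptivity pattern described above.
FEEDBACK_REACTIONS = {
    (False, False): "anger",    # unexpected and unsolicited
    (True, False): "anxiety",   # expected but unsolicited
    (True, True): "embrace",    # expected and solicited
}

def likely_reaction(expected: bool, solicited: bool) -> str:
    """Map whether feedback was expected/solicited to a likely emotional response."""
    return FEEDBACK_REACTIONS.get((expected, solicited), "varies")
```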
So, students can ask for specific feedback on their work or their ideas and they can actually ask specific questions that lead to action. They might create a set of fictional avatars that offer immediate feedback. They can then engage in a fictional conversation with the avatars where they ask for feedback that is both positive and critical.
Subscribe to my newsletter and get A Beginner’s Guide to Artificial Intelligence in Education.
You can also check out other articles, videos, and podcasts in my AI for Education Hub.
Join over 90,000 educators who receive teacher-tested tools, fresh ideas, and thought-provoking articles every week, straight to your inbox.
My goal is simple. I want to make something each day. Sometimes I make things. Sometimes I make a difference. On a good day, I get to do both. More about me.