Episode 63
Slowing Down to Move Forward: AI Done Right in Schools
Brief description of the episode
Nathan Holbert, Associate Professor at Teachers College, Columbia University, explores the integration of AI and other technologies in education, urging educators to move beyond the ‘wow’ factor. He challenges the efficiency-driven mindset, emphasizing that AI should support—not replace—the social, cultural, and relational aspects of learning. The conversation highlights the importance of intentionality, aligning technology with pedagogical values, and ensuring AI enhances teachers’ ability to notice and respond to student learning patterns rather than merely automating tasks.
Key Takeaways:
- Learning is complex, social, and contextual: it is built through interactions, relationships, and shared context. Any AI solution should enhance (not replace) these human dynamics.
- AI tools, especially chatbots, tend to mirror a “question-and-answer” model of education — an old, inaccurate view that sees learning as simply transmitting knowledge from teacher to student.
- AI is frequently marketed as a way to make classrooms “more efficient,” but efficiency is often misinterpreted as speed. True efficiency should balance speed with meaningful learning outcomes. Simply delivering information faster doesn’t mean students are grasping or engaging with it deeply.
- AI tools may strip away the context, history, and politics behind knowledge, presenting facts in isolation. This undermines students’ ability to connect what they learn to real-world experiences, making education feel sterile and disconnected.
- AI chatbots may appear to allow students to ask any question, but in practice, these tools often nudge users toward pre-programmed answers or pathways. This limits genuine student agency and critical thinking, despite claims of “student-centered learning.”
- AI should be designed with a strong foundation in established learning theories. Instead of treating education as a transactional process of delivering information, AI should support constructivist, inquiry-based, and social learning approaches.
- AI should be developed as a tool to assist educators in noticing learning patterns, identifying student needs, and enhancing instruction—not as a replacement for their expertise. The goal should be to empower teachers, not automate their roles.
- AI should be used to enhance student agency by allowing for open-ended exploration, creativity, and personalized learning pathways rather than forcing students into rigid, pre-programmed learning experiences.
- AI tools must retain the cultural, historical, and social contexts of knowledge rather than reducing information to isolated facts. This ensures students engage with learning in meaningful ways that connect to real-world experiences.
- AI tools should be designed to integrate smoothly into the existing practices, workflows, and values of schools rather than forcing institutions to change their methods abruptly. This helps ensure effective adoption and long-term use.
- EdTech solutions often fail (or get watered down to mundane uses) if they don’t embed seamlessly into a school’s existing culture. Leaders hear about “culture” but may underestimate just how strongly these day-to-day norms resist change.
- AI is frequently praised for delivering speed or automation, but if it undercuts deeper learning or teacher-student relationships, it creates more problems than solutions. “Efficient” doesn’t always mean “better” in an educational context.
- A lot of AI hype revolves around frictionless knowledge delivery (“just ask the chatbot”), yet actual learning is far more complex. This episode reiterates that teachers and students need space to explore, dialogue, and contextualize information.
- Before piloting or purchasing a new AI tool, clarify your institution’s core learning goals and cultural values. Then check if the technology aligns or if it undermines those goals in practice.
- Focus on AI as a “second set of eyes” rather than a “teacher replacement.” Start small with tools that surface real-time insights for teachers (e.g., identifying student misconceptions) and let educators decide how best to act on that data.
- Involve teachers in product development or pilot phases. Their intimate knowledge of classroom reality is essential. This fosters buy-in and ensures the solution fits the actual teaching culture.
- Don’t roll out a tool district-wide without feedback loops. Schedule regular check-ins to evaluate if the tool is driving deeper engagement, meaningful learning outcomes, and cultural fit.
- AI-related PD shouldn’t just show “how to click buttons.” It should help teachers understand the rationale, best practices, and limitations of using AI so they can integrate it organically into their lessons.
- Evaluate success by looking at evidence of genuine student engagement, critical thinking, and collaboration, not just “time saved” or test-score bumps.