
The Philosophy of Education in the Age of AI

May 15, 2026

Podcasts


Quantum Leap

In this engaging interview, LightLeapAI’s Storyteller in Residence, Dr. Rod Berger, sits down with Priten Soundar-Shah — educator, entrepreneur, and author — to discuss his new book Ethical Ed Tech: How Educators Can Lead on AI and Digital Safety in K-12 (releasing May 19th). What begins as a conversation about the book’s opening line — “I worry” — unfolds into a wide-ranging dialogue on philosophy, institutional agency, the three purposes of education, and what it truly means to slow down in a world that won’t stop accelerating.

Key Topics Covered

  • “I Worry” — the choice to open with vulnerability: Why Priten chose to write as a citizen rather than an expert, and how transparency shapes the reader’s experience.
  • Let’s Talk — ending the book with an invitation: How the conclusion reflects the book’s core premise: not to prescribe answers, but to spark honest conversation.
  • Priten’s background: From founding an education nonprofit in high school, to philosophy at Harvard, to self-teaching code, to bridging humanities and technology.
  • Running fast to get others to slow down: The irony of being an urgency-driven person calling for intentional pause — and why the urgency is real.
  • Intentional vs. reactive speeding up: The shift from wanting educators to move faster with agency to recognizing that institutions are now moving fast without it.
  • The three buckets of education’s purpose: Economic preparedness, civic participation, and personal fulfillment — and why all three must be addressed together.
  • Ethical decision-making requires defining education first: You can’t determine what is ethical without first deciding who you’re being ethical toward and what you want for your students.
  • Mission statements vs. actual decisions: Why so many institutions operate under implicit assumptions that contradict their stated values.
  • The gamification of learning: How high-stakes measurement has displaced genuine curiosity, from reading comprehension to broader academic culture.
  • Backwards design in practice: The difference between defining what you want for students long-term versus optimizing for end-of-semester metrics.
  • Philosophy and political philosophy on reform: The North Star approach vs. incremental improvement — and why both have a role.
  • The liberal arts and democracy: Reclaiming the original meaning of “liberal” in liberal arts, and why civic education is foundational to solving today’s problems.

Guest

Priten Soundar-Shah is an educator, entrepreneur, writer, and civic advocate. He founded his first education nonprofit in high school and has since worked across nonprofit organizations, universities, and ed-tech ventures. He teaches part-time, runs a middle school civics camp, and writes on the intersection of philosophy, technology, and learning.

  • Website: priten.org
  • Book: Ethical Ed Tech: How Educators Can Lead on AI and Digital Safety in K-12 — available May 19th wherever books are sold
  • Book website: ethicaledtech.org
  • Newsletter & Podcast: Available at priten.org

Host

Dr. Rod Berger — Host of Quantum Leap in Higher Education, brought to you by N2N and Lightleap AI.

Memorable Quotes

“I worry.” — The opening of Priten’s book, chosen deliberately to speak human-to-human with educators.

“I’m running to get somebody else to stop. That’s what it feels like — I’m chasing somebody and trying to get them to slow down, and that requires partially catching up to them.” — Priten on the urgency behind his work.

“A breath is oftentimes very powerful. Am I asking the right question? Something as simple as that can really change what you do by tomorrow.” — Priten on intentionality in the AI moment.

“Until we fully define why we teach, what we want for our students — you can’t decide what is ethical without deciding who you’re acting ethically toward.” — Priten on the foundational flaw in ed-tech adoption.

“When I tell someone to slow down, it’s not from a place of ‘I don’t know what it feels like to want to speed up.’ It’s ‘I wish we could speed up, but I don’t think we can afford to right now.’”
