AI is coming for education. Here's how parents and educators should think about it
There are some very human things kids need to thrive.
Photo: Jacob Padilla for Unsplash
Whether you like it or not, artificial intelligence will play an increasingly large role in both learning and educating. We’re early enough in this seismic change that we can advocate for and design technology to help our kids become better thinkers and better humans. But how do we, as parents and educators, think about the issue?
The research and writing of The Disengaged Teen helped us think about what kids need most right now—in school, at home and in life. Months of talks with students, teachers, parents, entrepreneurs and leaders since the book's publication confirm for me that we need to focus on the very human things kids need in order to learn and thrive, and design relentlessly for those. They are attention, connection and agency.
Specifically:
The development of sustained attention, which requires struggle
The ability to connect (to oneself and to others), which requires being in human relationships
The practice of agency: the ability to set meaningful goals and develop strategies to meet them, which requires chances to make real decisions
Rebecca and I recently spent a week talking about the book at the #ASUGSV2025 conference in San Diego, a massive annual gathering of education technologists (from companies to investors, educators to funders, and many more). The conversation was dominated by AI: its promise for learning and its potential perils. (There was also, per usual, a lot of hype.) This year, the conference was subtitled Learning at the Speed of Light.*
I found myself reassured by some people’s messages and the tech they were developing around kids. I also found myself deeply worried.
Attention and thinking
Young people at this moment in history — with all its epic challenges from climate to tech to the future of democracy — need better critical thinking skills than any previous generation. But there’s a real risk that AI could diminish these skills rather than enhance them.
Why? Because thinking and learning are hard. As Daniel Willingham, author of Why Don't Students Like School?: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom writes, “Thinking is the hardest work there is, which is the probable reason why so few people engage in it.” According to the Wall Street Journal, one New Jersey high school student used generative AI to “cheat her way through” her junior year. She passed all her classes and was caught a grand total of once.
Learning requires friction, and much of tech looks to remove friction. The more AI solves for kids, the less kids have to solve for themselves. Kids do not learn at the speed of light.* A lot of AI innovation is aimed at making thinking easier. I am worse at reading maps than I used to be, and at tapping into my sense of direction. I don't mind; maybe I don't need map-reading skills. But I do need thinking skills.
I think of the issue kids are facing as the Blank Page Problem. Looking at a blank page is actually part of learning to write. It sucks. It’s hard. I am all for new and creative ways to get help, but humans look to make things easier, and if AI is there to solve the Blank Page Problem, kids will take it. When AI writes all of our first drafts, we will lose our ability to face that blank page. This very clearly risks hurting creativity, hurting thinking, and impairing the development of good writing skills. The Blank Page Problem can, of course, be adapted to almost any skill.
Edtech insiders developing tools obviously know this challenge. “AI in the workplace is an assistant, making things easier for you. AI in learning is a tutor, not an assistant. The only way it can work is to not give you all it has,” said Caleb Hicks from SchoolAI on a panel at the conference. In other words: How do you hold the AI back, getting it to be a coach but not to give the answers away?
When designing AI for education, we need to keep this point front and center: The audience here is young humans in formation, not adults working and seeking maximum efficiency.
“The rapidly advancing capabilities of AI may require a new north star that combines the concept of future-readiness for work and human readiness for life,” Babak Mostaghimi said at the conference. “AI ought to be built to be pro-human, i.e. to expand and augment our uniquely human capacities and capabilities rather than dull and replace our human advantage. In other words, I believe we need to center the ambitious notion of Pro-Human AI.”
Keep human relationships and connection at the center
Love is essential to learning. That sounds corny but it's true. In infancy, it is cuddles and warmth and responsiveness that build babies' brains. “To love to learn, we must first learn to love,” said Isabelle Hau, author of the beautiful book Love to Learn.
The Disengaged Teen makes the case for love as well, in the context of teens. To support your teen you need to connect with them: who they are, and who they are trying to become. As they get older, more of that happens in school and via learning, so understanding what’s happening there is critical. Parents support this through conversation. “Discussion is to teens what cuddles are for infants: necessary for brain development,” we write. To have good discussions, you need good connection.
So, when we design AI for education, we need to keep in mind that it must augment, not diminish, the role of relationships in learning. The best AI will supplement rather than replace teachers or tutors.
Learning is first and foremost social, and young people need to feel safe to learn. Relationships are the best way to do this. Relationships are slow and messy, awkward at times, and, ultimately, they are the skill we need to make our way through a fast-changing and volatile world—both because being adaptable learners helps, and because having good relationships buffers us from the vicissitudes of life. It seems too obvious to type, but here goes: being in relationships teaches us how to manage relationships. Reading body language, managing conflict and emotions, asking for help, mucking through the awkward stuff of silences and uncertainty. These are learner qualities we want to develop. They are human qualities too. Tech can play a role, but it can by no means replace them.
“The real question isn’t just ‘How smart is the technology?’ it’s ‘How human is the system we’re creating?’” Hau wrote on LinkedIn. “As AI companions and AI agents enter classrooms, homes, and daily life, we must ensure that technology doesn’t replace relationships but reinforces them. At the intersection of relational intelligence and artificial intelligence, we’re being called to design systems grounded in equity & deep human connection.”
Tech holds plenty of promise here. Lots of kids who won’t ask a question in front of a group of peers will ask a bot. This was a lesson from Covid: when classes were online and kids could DM their teachers, a wider range of students asked questions, not just the go-getter achiever types. Hicks from SchoolAI told the panel that kids trust their bots and share far more than we might realize. How do we build the support and scaffolding they need without creating a dependence on an ever-present, infinitely patient non-human? (I will publish another post on some of the best tech I saw for learning. Stay tuned.) This chart shows we are relying more and more on AI for emotional support. How do we make sure that’s additive to the human experience and not replacing it?
Practice agency
Rebecca and I defined agency as the ability to set a meaningful goal and marshal the resources to meet it. That means having the self-knowledge to know what one cares about; setting a goal that reflects it; and deploying strategies and adapting when they don’t work, e.g., finding friends, family, teachers and tech to help meet the goal.
Kids need opportunities to practice having agency. Sitting passively in class does not work for most older kids most of the time. Yes, we need knowledge, and we need ways to apply it that center what we know about what motivates teens to learn. We need new ways to let young people participate meaningfully in their learning and make some—not all—of the decisions around what and how they learn. If we want kids who can make decisions, we need to give them chances to practice real decision-making.
Tech can be a force for good here. But only if it is designed to push young people to think and challenge themselves, not simply to make everything easier or more digital. A design challenge, for sure.
In sum: we need to NOT eliminate the struggle in learning; we need to focus deeply on helping young people get connected: to each other, to their learning and to their communities; and we need to find ways to let young people develop their agency—with tech and without.
*The conference title was certainly snappy, and apt for deploying AI in workplaces (maybe). Less so for learning.