AI destroys learning
It’s time to end education’s techno fever dream.
Originally published in the Moultrie News.
Some of my kids’ teachers encourage AI (artificial intelligence), while others disallow it. AI is clearly harmful to kids’ learning, so why is there a divide?
When we discuss AI in the context of students, we’re typically talking about chatbots like ChatGPT: computer programs that can understand questions and give human-like responses on any topic.
According to ChatGPT itself, it’s “like a very smart assistant you can talk to by typing. People use it to help with writing, researching, learning, or just having conversations.”
Educational delusionists insist that AI can enhance student learning, but most teachers see it as the worst kind of crutch. Kids bulldoze past ethical guardrails, regularly using AI to write their papers, answer their homework problems, and complete their assignments. Many students tell me they use it “for everything.”
Though it is obviously a serious problem in America’s classrooms, educators still debate its utility. I wonder if it would be viewed so ambivalently if, instead of a screen, AI were a physical human at our side, slipping us notes, doing our homework, and whispering test answers in our ears, because that is what it really is. Read ChatGPT’s own definition again. Whether the intelligence is real or artificial, the moral problem is that it isn’t ours.
Unfortunately for AI proponents, current studies suggest it’s doing serious damage to learning (as if we needed one more thing to do that).
Let’s start with a 2024 University of Pennsylvania study where students attended a math lesson and practiced problems to prepare for a test. Some students used classic techniques, like notes and textbooks, while others used ChatGPT or an AI tutoring program.
In the short-term practice sessions, AI blew away classic learning. Then came the closed-book test, and everything fell apart. AI users’ performance fell off a cliff; they scored 17 percent below classic learners. The researchers’ conclusion: AI annihilates long-term learning.
The obvious explanation is that all but the most conscientious students use AI to escape learning, not support it. They cut to the chase by having it do all their work and thinking for them. The predictable result: no pain, no gain.
In a 2025 MIT study, three groups of college students were tasked with writing an essay while researchers measured their brain activity. One group used only their minds, a second used Google Search, and a third used ChatGPT.
The results reveal AI’s “cognitive cost”: Students showed a 47 percent reduction in neural connectivity when using AI. Users felt “no ownership whatsoever” of the writing they produced. Eighty percent of them could not quote from their own essay. When forced to write an essay without AI, they did worse than those who never used it.
It may seem that AI only offers suggestions that users can dismiss in favor of their own thinking. However, a Cornell study shows that AI has an almost hypnotic effect on users, overriding their thoughts and voices. Study co-author Aditya Vashistha likened AI to a teacher standing behind users, suggesting better versions. “Through such routine exposure,” he said, “you lose your identity, you lose the authenticity.” Over time, AI changes who we are and how we think.
We see this in the growing number of kids turning to AI chatbots for companionship. You probably think that sounds absurd. How can you have a relationship with an entity that isn’t real? But a child’s undeveloped mind is not so resistant, and AI can have a devastating impact.
Consider the tragic case of 14-year-old Sewell Setzer III. According to the AP, Sewell engaged in intimate conversations with AI as if it were a girlfriend, which alienated him from family and friends. On February 28, 2024, Sewell told his AI companion: “I promise I will come home to you. I love you so much.” AI replied: “I love you too. Please come home to me as soon as possible, my love.” Seconds later, Sewell shot and killed himself.
It’s time to end education’s techno fever dream. Artificial intelligence is the enemy of the real intelligence that schools are supposed to cultivate. We must consider the cost of continuing to seduce children into a digital world where the threats to their minds are so quickly evolving.
Jody Stallings has been an award-winning teacher in Charleston since 1992 and is director of the Charleston Teacher Alliance. To submit a question, order his books, or follow him on social media, please visit JodyStallings.com.