This article was co-authored by Pablo Garcia Quint, a Tech and Innovation Policy Fellow at Libertas Institute.
The use of AI in schools and universities has ignited widespread concern about its potential to undermine learning. Critics argue that AI is making cheating more acceptable and undermining students’ ability to develop core academic skills.
But fears about new technology as a crutch for learners are hardly new. Shortcuts for avoiding deeper learning have always existed. The internet itself triggered similar alarms, and long before AI, students used tools such as SparkNotes, group chats, smartphones, and paid homework services.
So we should ask ourselves: is AI the scapegoat or the whistleblower in education?
Education in the United States has been in decline for decades. Concern over falling SAT scores, for instance, dates back to the mid-1970s. More recently, between 2016 and 2023, before AI became widespread, the percentage of students failing to meet benchmarks in reading, writing, and math rose to 23%, and overall scores declined by 3%.
On top of that, student disengagement has become a growing concern. Gallup polling shows that between 25% and 54% of students find the material they are learning either unimportant or uninteresting. Fewer than half say their classes help them learn.
AI is now exposing a fundamental weakness of traditional education: shallow, standardized assignments that lack meaning. Generic essay prompts, fill-in-the-blank worksheets, canned lab experiments, and surface-level history projects do little to promote curiosity and critical thinking.
Students using AI to complete such busywork are not lazy; they are rational. If a chatbot can complete a generic assignment in seconds, the assignment was not a good one.
Schools often claim to value critical thinking. Yet the structure of modern schooling—bell schedules, rigid pacing guides, and one-size-fits-all grade levels—makes genuine intellectual exploration the exception rather than the norm.
AI is opening the door to a new type of learning. Rather than forcing all students through a standardized curriculum, we can create more personalized and meaningful educational experiences. This is already happening in small private schools that leverage AI without sacrificing a sense of community. One AI company, for example, is developing a tool that allows students to engage in dialogue with simulated historical figures such as Aristotle.
This kind of innovation is not hypothetical. One of the authors has a 17-year-old daughter who is an aspiring writer. She uses AI to help develop character backstories. The tool does not replace her writing; it helps her refine and strengthen her ideas.
Some schools are embracing this opportunity to rethink education. Alpha School in Texas has students use AI to complete traditional academics in about two hours a day. The remainder of the day is theirs to explore personal interests. One student is developing a dating app designed for teens. A younger student fills her time with wilderness survival training, swimming lessons, and cooking classes. The results speak for themselves. Alpha students’ test scores are in the top 2% of the country.
AI is not undermining learning. It is promoting it.
Whether we like it or not, AI is now part of the school experience. The goal should not be to ban it or permit its unchecked use, but to teach students how to use it well.
There are many ways to start adopting AI without drastic reforms. One step schools can take is replacing lectures with open-ended class discussions. These discussions do not rely on right or wrong answers, but instead invite students to contribute ideas, test hypotheses, and deepen their understanding collaboratively.
Equally important is replacing worksheets with real-world problems. Members of Gen Z want to make an impact. Teachers should let students bring problems they care about into the classroom and help them research and develop solutions.
Lastly, schools should invite students to design experiments and interpret messy, real-world data. Too often, textbook problems have tidy solutions. Life rarely works that way. Students should learn how to navigate uncertainty through their own experiments, even if the results are inconclusive.
These aren’t just pedagogical upgrades; they’re necessary shifts in a world where AI is changing how students learn and think. Banning the technology won’t shield students from its influence, and ignoring it won’t prepare them for the future.
If schools fail to adapt to this new tool, students will be the ones who lose out — not because they used AI, but because schools failed to teach them how to use it wisely.