
Thought Leadership
Productive Struggle: The Imperative of Friction in AI-Driven Learning
By Ricard Giner, Vice President, Higher Education, Kaplan International Pathways
Like many people in and out of the education world, I was pretty dismayed by the findings of a recent MIT Media Lab study on the impact of generative AI on student learning. It showed that college students assigned to write essays using ChatGPT, then later asked to write on the same topics without tech assistance, struggled compared to peers who wrote independently from the beginning. They remembered less of what they had previously written, performed more poorly, and were less cognitively engaged.
The implication, unsurprising to those who have been AI-cautious all along, is that too much reliance on generative AI thwarts development of critical thinking and memory. It would be easy to conclude, as the fiercest critics have, that AI simply has no place in education.
I get the worry and the frustration that leads people to this stark conclusion. I really do. And yet along with the much-discussed negative effects of AI, its immense capabilities actually have transformational potential for individuals and for society – such as scaling instant-access, high-quality, 24/7 free personalized tutoring for millions of students. To unlock its educational benefits, however, we need to embrace a paradox: Instead of using AI to make learning easier, we need to use it to create the productive struggle that lies at the heart of true learning.
Struggle is an inescapable part of the human condition. And in education, that’s not a bad thing. Whether or not you’re steeped in the latest details of learning science, if you’ve ever seen a child learning something it would be hard to miss the pleasure they take in overcoming a challenge. Far from preferring the smoothest possible path, they find joy in everything from walking up a playground slide to, as teenagers, mastering a tricky equation.
If we see navigating and surpassing obstacles as a feature, not a bug, of effective education, then we need to build friction into AI-driven education. The savviest thinkers about AI-driven pedagogy increasingly steer students toward using AI for everything from creating flashcards and quizzes to giving feedback on draft essays in the persona of a demanding expert reader. That's a night-and-day contrast to the laziest approach, which so many educators rightly fear: students simply using AI as a crutch to do their work for them.
After all, critical thinking is a core skill in the AI era. Yes, this phrase gets thrown around a lot and it needs to be carefully defined. But the MIT study shows how deeply concerned we should be when offloading cognitive tasks to AI undermines human engagement and judgment. This really goes back to the key tools of human reason that were codified during the Enlightenment. We need to evaluate evidence, make arguments, and remain open to new information in order to separate the true from the false, and the credible from the implausible.
Developing these skills in young people is more important than ever. When we're flooded with misinformation and disinformation, the importance of discerning credible information from the mutterings of a charlatan cannot be overstated. Because critical thinking is so much more powerful than passively receiving algorithmic recommendations, cultivating intellectual curiosity is a vital counterweight to AI dependency. Educators need to equip young people with the tools to actively assess and question the world around them. AI can play a role in both parts of that equation: it is among the tools of critical engagement, and its outputs are among the things to be questioned.
Approaching this task thoughtfully will mean rebuilding educational frameworks. For one thing, we’ll need to redesign classroom curricula to move even further in the direction of prioritizing critical analysis over simply acquiring and memorizing information. Crucially, we’ll also have to teach students to interact with AI as a tool that their own judgment can guide and evaluate. It can never be a substitute for thinking.
With this educational mission in mind - recognizing that learners have to work to make AI truly empowering - there's even a paradox within the paradox to consider. We often hail the ability of AI and other technologies to tailor how students are taught and how information is delivered to each individual. But over-personalization carries its own significant risks: it can undermine human resilience and adaptability. The implication: university degrees earned through exquisitely personalized pathways may become less valued. Conversely, a graduate who has learned to apply their own careful thinking to evaluate AI-driven analysis may be more valuable than ever.
As most of us eventually learn through experience, the real world is remarkably indifferent to exactly who you are and what personal needs you have. It simply presents you, on a daily basis, with challenges you'll need to overcome without regard for your personal perspective or limitations. That's why the tools of critical thinking are so empowering - they work for everybody, whatever your personal circumstances.
In the end, it’s entirely possible to accept that AI can help us think and learn while also recognizing that educators need to give students both challenges and safeguards to use it well. Learners need to seize on the capabilities of AI as an opportunity to adapt and improve, not melt into passivity and resignation.
It’s worth noting that Wharton School professor Ethan Mollick, an AI expert who believes in its benefits, still writes and rewrites his own initial drafts before seeking AI feedback. Our best approach to keeping students motivated in a confusing AI-filled world is to challenge them - and to help them keep their own reasoning and judgment front and center.