Human Intelligence · 6 min read

Does AI Make Clever People Stupid?

If you use AI to do the work for you, you learn nothing. Research from Gerlich (2025) shows a significant negative correlation between frequent AI tool usage and critical thinking abilities — particularly in younger workers. Here's what to do about it.

Larry Maguire


1 December 2025


I delivered a half-day AI session with Bernie Goldbach to a group of supply chain students at the Kemmy Business School, UL, during the summer. I stressed to them that if they use AI to do the work for them, they learn nothing. Their time studying — if only to obtain a piece of paper, a means to an end — will, in all practicality, be pointless. I said: "If you use ChatGPT to write your assignments, you'll remain just as stupid as you were before you started."

Controversial language, I'll admit, but it was designed to be, and it's not untrue. More accurately: if you and I rely on AI to make decisions for us, to be the arbiter of truth and accuracy, we are dumbing down our own intellect and risking our jobs and our businesses. Eventually, our minds will atrophy, just as our limbs would if we stopped using them.

Some voices in education, learning and development are fighting hard against AI, and I think that's a mistake — even naive. The technology is here, the cat is out of the bag. Knowing students are already using AI, how can we assist them in using it in a way that enhances their personal knowledge and critical thinking? If we can't do that, we're driving the behaviour underground and pretending change hasn't happened.


The Value In Challenge and Difficulty

Technology has always served to simplify and augment human effort. From the abacus 4,000 to 5,000 years ago, to the plough, the printing press, electric motors, digital currency — we find ways to work better, more efficiently, and in doing so, we outsource mental and physical effort to mechanisms and software.

This outsourcing of human cognition has accelerated in recent times with the advent of the internet, and never at the speed we see now, as Generative AI enters every domain of human work. Soon, we are told, the machines will handle all our daily mental tasks and simplify all the challenges of life and work. Finally, the life of ease we've always been promised will be ours.

Utopia or dystopia — I can't decide which.

Surely the challenge and effort of learning something, or of working out an answer to a complex problem, is where real meaning and purpose lie. Achievement without effort seems like no achievement at all. Aristotle seemed to think so. In Nicomachean Ethics, he railed against the hedonistic life that sought ease and instant gratification, calling it a slavish pursuit. True happiness, he argued, is found in the expression of excellence and virtue — in the doing well of what is worth doing. Eudaimonia (human flourishing) is found in tasks that are inherently challenging yet worth doing in and of themselves.


What Is Cognitive Offloading?

Cognitive offloading refers to the delegation of human cognition to external artefacts and systems. In the workplace, it contributes to how we learn, make decisions and judgements, and maintain or develop new skills. Used selectively, it can save time and help people get things done efficiently. Used indiscriminately, it displaces effortful thinking and weakens learning and retention — negatively influencing our capacity to make informed decisions.

The extended mind thesis (Clark & Chalmers, 1998) treats external resources as integrated parts of cognition. Long before Generative AI, lists, diagrams, and calculators helped augment our memory and problem-solving. The brain's metabolic cost favours these strategies, freeing our finite attention for higher-order reasoning. But with Generative AI tools, we're farming out both System 1 and System 2 thinking — fast and slow — to machines.

Michael Gerlich on Cognitive Offloading

Prof. Dr. Michael Gerlich released a paper earlier this year investigating the relationship between AI tool usage and critical thinking. The study surveyed 666 participants across diverse age groups and educational backgrounds. It found a significant negative correlation between frequent AI tool usage and critical thinking abilities, with younger people exhibiting higher dependence on AI tools and lower critical thinking scores compared to older people. Higher educational attainment was associated with better critical thinking skills regardless of whether AI tools were used or not.
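For readers unfamiliar with what a "significant negative correlation" means in practice, the sketch below computes a Pearson correlation coefficient on entirely hypothetical survey data (not Gerlich's dataset): as self-reported AI usage rises, critical thinking scores fall, and the coefficient lands close to -1.

```python
# Illustrative sketch only: the numbers below are invented, not drawn
# from Gerlich (2025). The point is to show what a strong negative
# correlation between two survey measures looks like numerically.
from statistics import mean, stdev

# Hypothetical responses on 0-10 scales
ai_usage = [2, 3, 5, 6, 7, 8, 9, 9]           # frequency of AI tool use
critical_thinking = [9, 8, 7, 7, 5, 4, 3, 2]  # critical thinking score

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

r = pearson_r(ai_usage, critical_thinking)
print(f"r = {r:.2f}")  # a value near -1 indicates a strong negative correlation
```

Correlation, of course, says nothing about direction of cause — heavy AI users may offload because thinking is effortful, or weaker thinkers may simply lean on the tools more — which is why Gerlich's interpretation rests on the offloading mechanism rather than the coefficient alone.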

Gerlich recognises that while cognitive offloading can improve efficiency, extensive reliance on AI may reduce the need for deep cognitive involvement. Sparrow et al. (2011) demonstrated this through the "Google effect" — people's tendency to remember where to find information rather than the information itself. When AI tools provide quick solutions and ready-made information, they discourage the cognitive processes essential for critical thinking.

Recommendations Based on Gerlich 2025

Assess Reasoning Rather Than Results. The study shows cognitive offloading produces correct outputs while eroding the underlying thinking skills. Evaluation should capture how learners arrived at conclusions, not whether AI helped them get there.

Put Effort First. Workers who engage cognitively before accessing an AI are likely to retain more and develop a foundation for critical analysis. Design learning experiences where participants attempt to solve problems independently before employing AI.

Use AI for Practice and Idea Generation. AI can create scenarios, case studies, and provide feedback on learner responses. But the explanatory and sense-making work should remain with the learner to build critical thinking capacity.

Develop Metacognitive Skills. Learners need to recognise when AI use undermines their own abilities and development. Training should explicitly address metacognition — thinking about thinking — and when offloading is or is not appropriate.

Target Younger Cohorts. The study found younger participants and those with lower educational attainment showed higher AI dependence and lower critical thinking scores. These groups may need more tailored AI learning programmes at work.

Balance Efficiency Needs with Human Thinking Skills. AI tools offer genuine productivity benefits — I've seen this first hand. But the goal should not be replacement; it should be collaboration and augmentation. The AI tools you choose must complement rather than replace the cognitive work your organisation needs to navigate this time of rapid change successfully.

Tags: cognitive offloading, critical thinking, AI tools, learning, performance

Larry G. Maguire

Work & Business Psychologist | AI Trainer

MSc. Org Psych., BA Psych., M.Ps.S.I., M.A.C., R.Q.T.U

Larry G. Maguire is a Work & Business Psychologist and AI trainer who helps professionals and organisations develop the skills they need to integrate AI in the workplace effectively. Drawing on over two decades in electronic systems integration, business ownership and studies in human performance and organisational behaviour, he operates in the space where technology meets people. He is a lecturer in organisational psychology and a career & business coach with offices in Dublin 2.

GenAI Skills Academy

Achieve Productivity Gains With AI Today

Send me your details and let’s book a 15 min no-obligation call to discuss your needs and concerns around AI.