AI Doesn't Make You Dumber. You're Just Measuring the Wrong Thing.
The studies keep coming. “AI is making us dumber.” “Students who use ChatGPT perform worse on tests.” “Dependency on AI erodes critical thinking.”
The latest is MIT’s “cognitive debt” study - ChatGPT users showed lower brain engagement during essay writing. The headlines were predictable. But even the researchers pushed back. “Please do not use words like ‘stupid’, ‘dumb’, ‘brain rot’,” they wrote. “We did not use this vocabulary in the paper.”
Short answer: No, AI doesn’t make you dumber. But it changes what you need to be good at.
I’ve used AI tools daily since GPT-3 dropped - three years across design and code. What I’ve found is that these studies aren’t wrong. They’re measuring something that matters less than it used to.
The Studies Are Real - But Limited
What the studies actually measure
Most AI-and-cognition research tests the same things:
- Can you remember the answer without help?
- Can you complete the task solo?
- Do you retain information after the AI is gone?
By these metrics, yes - AI users often perform worse. We’re more “dependent.” We recall less. We struggle more when the tools are taken away.
But these metrics assume a world where the tools get taken away. A world where knowing the syntax matters more than understanding the system. A world where doing the work yourself is the point.
That world is disappearing.
What they miss
Three years ago, I’d completed a few beginner JavaScript courses. That was it. React? No idea. The syntax tripped me up - the order of things, where the brackets go, why this works but that doesn’t. I’d hit a wall, get frustrated, and go back to design tools where I knew the rules.
Now I can read JavaScript. I can read React. I understand the fundamentals - functions, objects, methods. Not because I sat down and memorised the syntax. Because I stopped needing to.
I know what needs to happen. I can articulate it. And AI handles the friction that used to stop me cold.
Is my code the best in the world? No. But the more I read, learn, and apply it, the better I get. The syntax isn’t the blocker anymore. Understanding is. And understanding comes from doing the work.
I’m a designer who can read React. That used to require a career change. Now it requires curiosity and a few hundred conversations.
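To make “functions, objects, methods” concrete, here’s the kind of small sketch I can now read and reason about. None of it is from a real project - the names and markup are purely illustrative:

```javascript
// A plain object holding some data
const card = { title: "Pricing", clicks: 0 };

// A function: data in, markup out - the same shape a React component follows
function renderCard(props) {
  return `<div class="card"><h2>${props.title}</h2></div>`;
}

// A chain of method calls - the kind of line that used to stop me cold
const titles = [card, { title: "FAQ", clicks: 3 }]
  .filter((c) => c.clicks >= 0)
  .map((c) => c.title);

console.log(renderCard(card)); // <div class="card"><h2>Pricing</h2></div>
console.log(titles);           // [ 'Pricing', 'FAQ' ]
```

Three years ago every line of that was friction. Now it reads like a sentence.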
Here’s what that looks like day to day:
I hit a problem. I describe it. The AI gives me something. It’s usually wrong in some way, or right but not quite fitted to my context. So I have to understand what it gave me well enough to fix it.
That’s not offloading my thinking. That’s externalising it. Offloading means you stop thinking. Externalising means you think somewhere else.
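Here’s a hypothetical example of “right but not quite fitted to my context”. Suppose I ask for a price formatter; the assistant hands back something correct but generic, and adapting it is where the thinking happens:

```javascript
// Hypothetical first answer from the AI - works, but hard-coded to US dollars
function formatPriceGeneric(amount) {
  return "$" + amount.toFixed(2);
}

// My context needed GBP and whole-pound rounding for a pricing page.
// Making that change meant understanding toFixed, Intl.NumberFormat,
// and why the generic version was wrong for me - that's the thinking.
function formatPrice(amount) {
  return new Intl.NumberFormat("en-GB", {
    style: "currency",
    currency: "GBP",
    maximumFractionDigits: 0,
  }).format(amount);
}

console.log(formatPriceGeneric(12.5)); // $12.50
console.log(formatPrice(12.5));        // £13
```

The fix is trivial once you understand it. Getting to the understanding is the work.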
More designers are going to skip the traditional tools entirely. Go straight into code - or rather, straight into natural language. Describe what you want. Iterate on what you get. Ship.
What AI Does to Thinking
The articulation effect
Something strange happens when you explain a problem to an AI. You have to be precise. Vague prompts get vague answers. So you sharpen your question. You add context. You notice the gap between what you think you want and what you need.
This is thinking. It’s just thinking out loud.
I know my craft. But talking through a problem - even to a machine - surfaces knowledge I didn’t know I had. The act of prompting forces articulation. And articulation is understanding made visible.
It’s rubber duck debugging, except the duck talks back. And sometimes hallucinates.
Chiselling marble
Here’s how I think about it.
Each prompt is a chisel strike. Each iteration refines the shape. Each conversation removes material you didn’t need.
You’re not asking an oracle for answers. You’re sculpting understanding through dialogue. The marble was always there - your ideas, your instincts, your half-formed sense of what you’re trying to build. The AI helps you find the form inside it.
Some conversations take dozens of exchanges before the shape emerges. That’s the process working. You’re thinking through the problem out loud, with a collaborator that never gets bored.
The Real Skill Shift
What changed
The entry point dropped.
The minimum viable knowledge to participate has collapsed. You can work with code without memorising syntax. You can reason about systems without years of prerequisite study. You can start doing the interesting work sooner.
The higher-order skills - understanding systems, integrating ideas, knowing good output from bad - were always the valuable ones. AI didn’t change that. It just made them more accessible.
The question isn’t “can you do it without help?” The question is “can you get the right result?” And increasingly, the answer depends on how well you collaborate with tools that think differently from you.
The dependency argument
“But what if the AI goes away?”
What if the internet goes away? What if spreadsheets go away? We’ve been augmenting cognition with tools for as long as we’ve been human. The printing press let us stop memorising entire books - and that didn’t make us dumber.
Every tool creates dependency. Dependency isn’t diminishment. The question is whether the dependency lets you operate at a level you couldn’t reach alone.
What it costs
This isn’t free.
When AI solves something on the first try, you learn less than when you wrestle with it yourself. The easy wins are shallower than hard-won understanding.
But the calculus changes when you’re building something real. I’ll take broader reach with selective depth over narrow expertise I can’t apply.
The Digital Renaissance
Here’s where this gets bigger than productivity tips.
We’re entering a renaissance. Not despite AI, but because of it.
Think about what a renaissance is. It’s not just an artistic movement. It’s what happens when new tools collapse the barriers between disciplines - when ideas trapped in silos suddenly cross-pollinate.
The printing press didn’t just spread existing knowledge. It let thinkers in different fields discover each other. The output compounded because collaboration became possible at scale.
AI is doing something similar. It’s becoming the glue between specialisms.
When a designer can reason about code, when a developer can iterate on copy, when a strategist can prototype without waiting on a build queue - the handoffs compress. The specialists still matter. But now they can collaborate at the speed of thought instead of the speed of scheduling.
The most ambitious work has always required experts from multiple disciplines pulling together. What changes is how many of those projects become possible. How many ideas ship instead of dying in the gap between concept and capacity.
Look at Shopify’s Winter 2026 Editions site. They called it “RenAIssance.” It’s a 3D, immersive, award-winning experience - parallax, motion, layered storytelling. The kind of ambitious work that requires designers, developers, 3D artists, and motion specialists all pulling in the same direction. I’d bet good money AI acted as the glue - not replacing the experts, but compressing the handoffs between them. That’s what the new renaissance looks like: specialists collaborating at a pace that wasn’t possible before.
What this means
The studies will keep coming. They’ll keep measuring recall and solo performance and short-term retention. And they’ll keep finding that AI users depend on AI.
But the metrics we use shape what we see.
I ship more. I understand more. I ask better questions than I did three years ago.
The skill isn’t knowing. It’s knowing what to ask.
And that’s something you can only learn by using the tools.