AI promised to make us smarter. Instead, it might be rewiring the very thing that makes us human — our ability to think.
Inside the Mind Machines
Walk into the MIT Media Lab and the future hums quietly around you.
A robot sorts trash with algorithmic precision. An AI-sculpted tea set — built from imagined body parts — sits like a fever dream of innovation.
And somewhere between the glass walls and blinking sensors, research scientist Nataliya Kosmyna is studying how brains respond to all this progress. She’s part neuroscientist, part tech philosopher — and she’s worried.
Her latest project, a wearable brain-computer interface, was supposed to help people who can’t speak communicate through neural signals. But then came the emails — hundreds of them — from everyday people who said using ChatGPT had changed their brains.
They couldn’t focus. They forgot what they’d just written.
Could a chatbot be quietly rewiring human cognition?
The Experiment That Sparked Panic
Kosmyna ran a small study in which 54 students wrote essays while electrodes tracked their brain activity: some wrote with no help, some used Google, and some used ChatGPT. The results were stark and unsettling.
The more digital assistance people used, the less active their brains became.
Those who wrote with ChatGPT showed significantly lower connectivity in networks tied to creativity, attention, and problem-solving.
And when Kosmyna asked participants to recall what they’d written just minutes earlier, most couldn’t.
“Barely anyone in the ChatGPT group could quote a single sentence,” she says. “That was concerning, because you just wrote it and don’t remember anything.”
Her conclusion: our brains love shortcuts — but they learn through friction. When everything becomes frictionless, we stop growing.
The Frictionless Trap
Every major tech innovation promises to make life easier. That’s the sell.
But what if “easy” is the problem?
We no longer memorize phone numbers or directions. We don’t calculate in our heads. We outsource thinking to apps, creativity to algorithms, and memory to the cloud.
Kosmyna calls it “the cognitive outsourcing epidemic.” Others have dubbed it a “stupidogenic society” — a culture that makes it easier to become mentally lazy, just as an obesogenic society makes it easier to gain weight.
As the world grows more “frictionless,” real-world thinking starts to feel uncomfortable. Why call someone when you can text? Why read a book when TikTok can summarize it? Why write when AI can do it better, faster, and more confidently?
We’ve built a world optimized for convenience — and stripped of challenge.
Data Doesn’t Lie: The Cognitive Decline
This isn’t just anecdotal. Across developed nations, average IQ scores and student test results are falling.
The OECD’s PISA results, the gold standard for comparing student ability across countries, have dipped since 2012.
For much of the 20th century, every generation got smarter. Better nutrition, more education, and access to knowledge lifted global IQs. Now, for the first time in modern history, that curve is bending downward.
Correlation isn’t causation — but the pattern is haunting. As tech grows smarter, humans seem to be thinking less.
Kosmyna puts it bluntly: “It’s only software developers and drug dealers who call people ‘users.’”
Continuous Partial Attention: The Modern Disease
In the 1990s, tech consultant Linda Stone coined the term continuous partial attention to describe how we live now — constantly connected, rarely focused.
It’s the modern mental fog: replying to emails while half-watching Netflix, doomscrolling while trying to relax, multitasking until your mind forgets how to single-task.
Stone’s research found that roughly 80% of people experience “screen apnea”: they hold their breath or breathe shallowly while checking email and notifications. Our nervous systems stay on high alert, pumping stress hormones through the body every time a new ping arrives.
Digital multitasking feels productive. But it’s a cognitive illusion — a sense of being “on top of things” while never getting to the bottom of anything.
It’s no wonder “brain rot” became Oxford’s 2024 Word of the Year.
Outsourcing Thinking
Generative AI didn’t invent this trend — it just industrialized it.
Until recently, we offloaded memory and data processing to machines. Now we’re offloading thought itself.
AI tools can now draft your emails, summarize your meetings, even brainstorm your creative ideas. But every time we let an algorithm think for us, our brain’s own critical circuits get a little quieter.
Researcher Michael Gerlich at SBS Swiss Business School found that heavy AI users scored lower in critical thinking. “It’s not that AI makes you stupid,” he says. “But it anchors your thinking. It narrows the paths your mind is willing to explore.”
His metaphor is brilliant:
“AI can make the world’s best candle — bright, efficient, beautiful — but it will never invent the lightbulb. That takes human chaos and curiosity.”
Without critical thinkers, the future risks being a factory of candles.
The Classroom Experiment
The decline isn’t limited to adults. In classrooms worldwide, AI has quietly rewritten the rules of learning.
A recent UK survey found that 92% of university students use AI, and one in five admits to using it for entire assignments. Teachers say their students write longer essays that sound smarter but reflect little understanding of the topic.
“It’s not learning, it’s output,” says Matt Miles, a high school teacher in Virginia. “They can Google anything — but ask them to reason through an idea, and they’re lost.”
His colleague Joe Clement adds: “Being able to find an answer isn’t knowledge. Knowledge is what lets you know when something’s wrong. That’s how you resist misinformation.”
Without that, he says, “we’re creating a generation of people who can produce content — but can’t think.”
The EdTech Illusion
Despite mounting evidence, the education system keeps doubling down on screens.
After COVID-19, digital learning platforms became the new normal. Tools like Google Classroom and Kahoot! promised “personalized learning” and “teacher efficiency.” But independent research tells another story.
The OECD has found that students who use technology more frequently in school tend to perform worse academically. Wayne Holmes of University College London calls it what it is:
“We’re experimenting on children with untested digital drugs.”
He’s not being metaphorical. Unlike medicine, educational technology isn’t required to prove safety or efficacy before being rolled out globally.
Brain Rot as a Business Model
If you’re wondering why the internet feels dumber, it’s not an accident.
Attention — not intelligence — is the currency of the modern web. Every app, algorithm, and notification is designed to keep you scrolling, not thinking.
Netflix builds shows for “casual viewing,” Spotify fills playlists with AI-generated filler, and social feeds reward outrage over insight. The system doesn’t need you to be informed — it just needs you to keep watching.
As a result, the population becomes perpetually distracted, vaguely anxious, and cognitively malnourished.
The Paradox of Progress
To be clear: AI can make us smarter. It already helps detect cancer, design materials, and accelerate scientific discovery. Humans working with intelligent machines can achieve incredible things.
But there’s a paradox at the core of progress:
If we let machines think for us instead of with us, we don’t evolve — we atrophy.
Maybe the “Golden Age of Stupidity” isn’t a joke at all.
Maybe it’s the natural consequence of a species that built machines to think, and then forgot how to.
The Question That Should Haunt Us
Socrates once warned that writing would make people forgetful — that it would give them “the conceit of wisdom” instead of real understanding.
He was wrong about writing. But what if he’s right about AI?
When your next idea is auto-completed, when your next essay is co-written by a bot, when your next decision is shaped by a recommendation algorithm — how much of you is left in the process?
We might not be getting dumber in the biological sense.
But in this frictionless, machine-assisted, endlessly optimized age — the very Golden Age of Stupidity — we might just be forgetting what it feels like to think.