# we're outsourcing our brains to AI and pretending it's fine
I haven't memorized a phone number since I got a phone (excluding my parents' numbers, since memorizing those was a requirement for getting the phone in the first place). Like, why would I bother? My phone has everyone's number. It knows their address, their birthday, their email, their social media handles. My brain doesn't need to store any of that anymore.
This seemed totally fine at first. Like, obviously it's more efficient, right? But then I started noticing other things. I've also stopped memorizing directions. Facts. Math. Spelling. And honestly? I'm starting to forget how to think through certain types of problems entirely.
We're not just using AI as a tool anymore. We're using it as a replacement for actually thinking.
# The Progression of Outsourcing
1990s: We outsourced calculation
Calculators became standard. Mental math skills went down the drain. Teachers complained constantly. But society basically shrugged and said calculators are just tools, so whatever.
2000s: We outsourced memory
Smartphones showed up, and suddenly nobody had to memorize anything. Phone numbers? Nope. Addresses? Why bother. Directions? Absolutely not. That information is always right there in your pocket, so why waste brain space?
2010s: We outsourced navigation
GPS completely killed map reading. I literally cannot navigate without Google Maps. Like, if my phone dies in an unfamiliar city, I'm genuinely screwed. Not "I'll figure it out" lost. I mean actually, completely lost. I have no idea how to read a physical map. Never learned.
2020s: We're outsourcing thinking
Now ChatGPT writes essays. Copilot writes code. AI summarizes articles, analyzes data, makes recommendations. We're not just using tools anymore. We're outsourcing the actual process of reasoning itself.
At what point does "tool" become "dependency"?
# The Difference Between Tools and Replacements
A hammer is a tool. It makes hitting nails easier. But if you lose the hammer, you can still hit a nail with a rock. The skill isn't lost. You just use a different method.
GPS isn't a tool though. It's a replacement. I never learned how to read a map properly because GPS existed when I started driving. The skill was never developed in the first place. If GPS disappeared tomorrow, I'd be completely screwed. The ability is just... gone. Never existed.
And honestly? I'm kind of pissed about it. I wish I'd learned map reading before GPS became the default. The irony is I'm actually good at orienteering. I've won a bunch of orienteering competitions through Scouts. But that's using a compass and a map in the woods, not navigating city streets. Different skill entirely. And I can't do the city navigation without my phone.
AI is doing the same thing. It's becoming a replacement, not a tool.
Students aren't using ChatGPT to check their essays. They're using it to write the essays, then maybe tweaking a sentence or two. The skill of actually constructing an argument from scratch? Never developed. They don't know how to do it.
Developers aren't using Copilot to speed up coding. They're using it to generate code they don't fully understand, then spending hours debugging when it breaks. The skill of thinking through logic from first principles? It's atrophying (can I get extra credit for using a big word, Ms. Rode?).
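To make it concrete, here's a made-up example of the kind of bug I mean. This isn't real Copilot output, just the shape of the problem: code that reads clean, skims fine, and hides a classic Python gotcha.

```python
# Looks reasonable. The bug: a mutable default argument. Python creates
# the default list ONCE, so every call without an explicit history
# shares the same list, and scores leak between unrelated calls.
def record_score(score, history=[]):
    history.append(score)
    return history

print(record_score(90))  # [90]
print(record_score(75))  # [90, 75]  <- surprise: the first call leaked in
```

If you'd written that yourself, you'd probably catch it while testing. If you accepted it from autocomplete because it looked fine, you're debugging it at 2am.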
I'm guilty of this too, by the way. I use Copilot constantly. But I also take AP Physics, and that forces me to think through problems manually. Physics is fun but super hard, and it's actually rewarding to learn it because you can't just ask AI to solve it for you. You have to understand the concepts. That kind of thinking is what we're losing.
And everyone acts like this is progress. Like we're evolving or something. I'm not so sure.
# We're Getting Cognitively Lazy (And It's Rational)
Here's the uncomfortable truth: outsourcing thinking makes total sense in the moment. Like, why wouldn't you?
Why people use AI instead of thinking:
- It's way faster (ChatGPT writes in seconds what takes humans minutes)
- It's often better (AI grammar and structure beat most people's first drafts, let's be honest)
- It's always available (no mental fatigue, no motivation issues, no "I don't feel like it" moments)
- It's free cognitive capacity (save your brain for harder problems, supposedly)
This is completely logical behavior. If a machine does something better and faster, why wouldn't you use it? I use it. Everyone I know uses it. It's just... smart.
But here's what we're trading away:
- The ability to think deeply when AI isn't available (what happens when servers go down?)
- Pattern recognition from doing things the hard way (you learn different things when you struggle)
- Understanding that comes from struggle (the "aha" moments that actually stick)
- Intellectual independence (being able to figure things out on your own)
- The satisfaction of actually creating something yourself (that feeling when you solve a problem without help)
We're making a deal: give up cognitive development in exchange for immediate efficiency.
Short term, it's a great trade. Long term? We're probably screwed. But who thinks about long term when the short term is so convenient?
And the worst part? You can't opt out. If everyone else uses AI and you don't, you fall behind. It's a race to cognitive dependence, and abstaining means losing.
# The "Surely We'll Still Learn the Basics" Cope
People say: "Don't worry, students will still learn fundamentals before using AI."
Will they?
Nobody learns map reading before using GPS.
When I got my license, driver's ed didn't teach map reading. Why would it? GPS exists. Teaching obsolete skills is apparently a waste of time. So I never learned. Now I'm dependent on a phone app to get anywhere.
Nobody learns mental math before using calculators.
Sure, elementary school teaches arithmetic. But higher math assumes calculator access. Nobody's doing integrals by hand when computers exist. Why would you? It's slower and you're more likely to make mistakes.
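And to be clear, "computers do integrals" isn't hyperbole. Here's what it looks like with sympy, a real Python library (this snippet is just an illustration):

```python
# Symbolic integration in a few lines with sympy.
from sympy import symbols, integrate, sin

x = symbols("x")
print(integrate(x**2, x))        # x**3/3
print(integrate(x * sin(x), x))  # -x*cos(x) + sin(x)
```

That second one is integration by parts, done in milliseconds, with no idea in your head of why the answer is what it is.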
Why would anyone learn writing before using ChatGPT?
If AI can write better than most humans, why spend years teaching essay structure? Just teach "how to prompt AI effectively" and move on. That's probably what's going to happen, honestly.
This sounds dystopian. It's also the logical endpoint of educational efficiency.
Why teach skills that machines do better?
# The "AI Is Just a Tool" Cope
Every time this comes up, someone says: "AI is just a tool, like a calculator. We still need to know what to do with it."
This is technically true and practically misleading.
Calculators required understanding:
- You need to know which calculation to perform
- You need to interpret the result
- You need to verify it makes sense
- The calculator doesn't think for you
ChatGPT requires way less understanding:
- Just describe what you want in plain English
- AI figures out the approach
- AI generates the output
- You just accept or reject it
The cognitive load is massively different.
Using a calculator: "I need to divide 847 by 23. Calculator says 36.8. That seems right." You still had to know what operation to do and whether the answer makes sense.
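And that "seems right" is doing real work, by the way: 23 × 37 = 851, which is just above 847, so an answer a little under 37 checks out. Two seconds of math, but your brain actually did it.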
Using ChatGPT: "Write me an analysis of the impacts of social media." AI outputs five paragraphs that sound smart. "Sure, that works." You didn't have to think about structure, arguments, or even what points to make. Just read it and decide if it sounds good enough.
One requires understanding. The other requires reading comprehension. Big difference.
# We're Creating a Two-Tier Society
There's a split emerging:
Tier 1: People who understand AI
- Know how it works (training data, weights, transformers)
- Know its limitations (hallucinations, biases, context windows)
- Use AI as an extension of their thinking
- Could function without AI if necessary
Tier 2: People who depend on AI
- Know it "just works" somehow
- Don't understand limitations or failure modes
- Use AI as a replacement for thinking
- Would be helpless without AI
The scary part? Tier 2 will include most people. Including people in important positions.
Future scenario (probably happening already):
- CEO asks AI to analyze market trends
- AI generates report with confident but flawed reasoning
- CEO doesn't understand the domain well enough to spot the error (because they've been using AI for analysis for years)
- Company makes multimillion-dollar decision based on AI hallucination
- Everyone's confused when it goes wrong
We're replacing human expertise with AI confidence. Those aren't the same thing, but we're treating them like they are.
# What Happens When AI Goes Down?
Our society has critical dependencies:
Power grid: If it fails, we have generators and backup systems
Internet: If it fails, we have offline backups and manual processes
AI: If it fails, we... have no plan because we assume it won't fail
But AI can fail:
- Servers go down (happens regularly)
- Models get worse (people were convinced GPT-4 degraded over time)
- Companies shut down services (RIP Google products)
- Cyberattacks or sabotage
What happens if ChatGPT goes offline for a week?
- Students can't finish assignments (because they don't know how to write without AI)
- Developers can't meet deadlines (because they can't code without Copilot)
- Customer service collapses (because chatbots handle 80% of queries now; looking at you, Capital One...)
- Productivity craters across industries
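And almost nobody builds for that failure. A fallback doesn't have to be fancy. Here's a minimal sketch in Python, with a hypothetical ask_ai() standing in for whatever API a workflow depends on; the only point is that "the AI is down" should be a handled case, not a silent crash.

```python
import time

def ask_ai(prompt):
    # Hypothetical stand-in for whatever AI API your workflow depends on.
    raise TimeoutError("model endpoint unreachable")  # simulating an outage

def answer(prompt, retries=2):
    for attempt in range(retries):
        try:
            return ask_ai(prompt)
        except TimeoutError:
            time.sleep(2 ** attempt)  # brief backoff before retrying
    # The part almost nobody writes: a non-AI path.
    return "AI unavailable: ticket routed to a human queue"

print(answer("summarize this support ticket"))
```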
The scary thing? This isn't even a stretch. We're maybe 2-3 years from this level of dependency. Maybe less. I know people who already can't work without AI.
# The Cognitive Atrophy Problem
"Use it or lose it" isn't just a saying. It's neuroscience.
Skills our generation has lost to technology:
- Spelling (spell check catches everything)
- Mental math (calculators are always available)
- Remembering facts (Google knows everything)
- Navigation (GPS handles it)
- Handwriting, specifically cursive (everyone types now)
For people who grew up with technology, these skills never developed properly in the first place.
What happens when the next generation loses:
- The ability to write without AI assistance (they never learned)
- The ability to code without AI autocomplete (they started with Copilot)
- The ability to analyze problems without AI prompts (they don't know how to think through it manually)
- The ability to think deeply without external tools (the muscle never developed)
Society becomes cognitively dependent. Like someone who skips leg day for years, then can't walk without assistance. Except this time it's our brains, and we can't just start going to the gym to fix it.
# The Counterargument: "Every Generation Says This"
Socrates complained that writing would destroy memory. People said calculators would make us dumb. The internet was supposed to rot our brains.
We adapted. We're fine.
Why this time might be different:
Writing: Outsourced information storage, not thinking
Calculators: Outsourced computation, not reasoning
Internet: Outsourced information retrieval, not understanding
AI: Outsources thinking itself
Each step moved higher up the cognitive stack, closer to the core of what we'd actually call thinking.
Memory → calculation → information access → reasoning
We're running out of cognitive layers to outsource. What's left after we automate thinking? I don't know. Consuming? Is that it?
# Are We Okay With This?
Honest question: do we want a world where humans don't need to think?
The Silicon Valley pitch:
- Focus on creativity and innovation instead of routine, boring work
- More free time for relationships and experiences
- Access to expertise for everyone
- Faster progress on hard problems
The actual reality:
- Loss of intellectual independence
- Society-wide cognitive decline
- Extreme vulnerability to technology failure
- Concentration of power in who controls the AI
- Humans reduced to clicking "regenerate" until AI outputs something acceptable
This isn't like the transition from physical labor to knowledge work. That gave us new jobs. This is different.
Machines took over manual tasks, humans moved to cognitive tasks. Now AI takes over cognitive tasks, humans move to... what exactly?
Creative work? AI generates art and music better than most people. Midjourney makes better images than I could ever draw.
Strategic thinking? AI does analysis and planning faster than we do. It can process way more data.
Emotional labor? AI chatbots provide therapy and companionship without judgment. Some people already prefer talking to AI.
What's left? Consuming? Is that the endpoint? Humans as passive consumers of AI-generated everything? Just clicking "regenerate" until we like what we see?
That's bleak. And nobody's really talking about it. Or if they are, I'm not hearing it.
# The Resistance Is Futile (But Necessary)
Some people still try to think without AI. I try to sometimes, though I fail a lot.
They'll write first drafts without ChatGPT. Code functions without Copilot. Solve problems manually before asking AI.
Not because it's faster. It's definitely not. Not because it's better. It usually isn't.
Because they don't want to become helpless when the machine is unavailable. Because they want to keep the skill, even if it's slower.
Is this the equivalent of insisting on using a typewriter in 2025?
Probably. But at least typewriters don't require internet access or a subscription.
There's something fundamentally wrong with outsourcing your brain to a corporation's servers. We're trading independence for convenience and acting like it's an upgrade. It's not an upgrade. It's a trade-off, and we're not being honest about what we're losing.
This is a bad deal. But it's also the only deal available.
The people who resist get left behind. The people who adopt it lose their cognitive independence. There's no winning move here, just choosing which way you want to lose. And most people don't even realize they're choosing.
# The Uncomfortable Endpoint
We're moving toward a world where AI does most cognitive work and humans... oversee it? Enjoy the output? Become obsolete?
The optimistic view: humans are freed from tedious thinking to focus on meaning, creativity, and connection.
The pessimistic view: humans become intellectually dependent, cognitively atrophied, and unable to function without AI assistance.
The realistic view: the pessimistic scenario is already happening. We're not heading toward cognitive dependence. We're already there. And it's accelerating faster than anyone expected.
Some people will use AI as a tool to amplify their thinking. Most people will use AI as a replacement for thinking. The gap between those groups will become a chasm, and the people who control the AI will control everyone else. That's already starting to happen.
This isn't progress. It's a gamble.
Society is betting that AI companies will act in our best interests, that the technology won't be weaponized, that we'll somehow adapt before we lose the ability to think independently. That we'll figure it out.
That's a bad bet. But we're making it anyway, because what else are we supposed to do?
Update: wrote most of this at 2am after procrastinating on physics homework. Probably should've been studying but this felt more important. My French teacher would be disappointed.
Joke of the day: Why did the AI break up with the calculator? Because it found someone with better algorithms. (I know, it's bad. But it's 2am.)