# we're in a digital renaissance and nobody notices

History classes teach the Renaissance like it was obvious: of course it was important, look at all the art and science and cultural change! It's so clear in hindsight.

But people living through it didn't wake up thinking "wow, I'm experiencing a Renaissance today." They were just... living. Dealing with new printing presses, weird paintings that looked too realistic, and ideas that challenged everything they'd been taught. It probably felt chaotic and confusing.

Sound familiar? Because that's exactly what AI feels like right now.

# The Renaissance Wasn't About Art

Yeah, we remember the Renaissance for Michelangelo and da Vinci. But the real revolution was technology.

The printing press (invented around 1440) did to information what AI is doing to knowledge work. It's actually a pretty good parallel:

  • Before: Knowledge controlled by elites (monks copying manuscripts by hand, took forever)
  • After: Information became democratized and spread rapidly (anyone could print books)
  • Result: Protestant Reformation, Scientific Revolution, total restructuring of society

One technology changed who had access to information, which changed who had power, which changed everything. That's what's happening with AI right now.

What AI is doing right now:

  • Before: Specialized skills required years of training (art, writing, coding, analysis)
  • After: Anyone can generate content, code, or insights instantly (with a good prompt)
  • Result: We're about to find out, but it's going to be massive. Like, really massive.

The printing press didn't just make books cheaper. It fundamentally altered the relationship between knowledge and power. Who could read, who could write, who could spread ideas.

AI isn't just making tasks faster. It's changing what it means to create, think, and work. The fundamentals are shifting.

# We're Creating Our Own Dependencies

Renaissance Europe became dependent on printed materials within a generation. Try going back to hand-copied manuscripts after you've seen a printed book. Not happening. Once you've tasted the convenience, there's no going back.

How fast we've become dependent on AI:

  • 2022: "ChatGPT is neat, I guess" (tried it once, thought it was overhyped)
  • 2023: "Okay this is actually useful for some things" (started using it for emails)
  • 2024: "Wait, how did I write emails before this?" (using it for everything)
  • 2025: "I literally can't code without Copilot anymore" (this is me, I'm the problem)

It took less than three years. That's insane. We went from "this is cool" to "I can't function without this" in less time than it takes to finish high school.

Things people can't do efficiently without AI anymore (or won't, because it's slower):

  • Write documentation (why format it manually when AI does it instantly?)
  • Debug obscure errors (AI reads stack traces faster than humans, and I'm lazy)
  • Generate boilerplate code (developers don't write CRUD endpoints manually anymore, why would you?)
  • Draft professional emails (just paste your thoughts and AI makes them sound professional)
  • Research topics quickly (AI summaries beat reading full papers, let's be honest)

People aren't saying they couldn't do these things manually. They're saying they won't, because the AI version is faster and often better. And honestly? They're right.

That's not a tool anymore. That's a dependency.

# The "AI Can't Really Think" Argument Is Missing the Point

Renaissance skeptics said printed books would never replace illuminated manuscripts. And they were right about one thing: printed books were objectively lower quality. They looked worse, felt cheaper, had no custom artwork.

Hand-copied manuscripts had:

  • Beautiful calligraphy (actual art)
  • Custom artwork (unique to each copy)
  • Attention to detail (took forever)
  • Spiritual value from human effort (you could feel the work)

Printed books had:

  • Worse typography (early printing was rough)
  • No custom artwork (mass production)
  • Mass-produced feel (cheap)
  • No "soul" (just ink on paper)

Guess which one won? The cheaper, faster one. Always.

Modern AI skeptics say:

  • "AI doesn't truly understand, it just predicts tokens" (true)
  • "It can't reason like humans do" (also true)
  • "There's no actual intelligence, just pattern matching" (yep)
  • "It makes mistakes and hallucinates" (definitely true)

All technically true. Also irrelevant, because that's not why people use it.

Printed books weren't "better" than manuscripts. They were faster, cheaper, and scalable. That's what mattered. Quality didn't matter as much as accessibility.

AI isn't "smarter" than humans. It's instantly available, infinitely patient, and constantly improving. That's what matters. The philosophical questions about "real" intelligence don't matter when it's 3am and you need code that works.

# The Speed of Change Is Accelerating

The printing press took decades to reshape Europe. The steam engine took generations to industrialize the world. The internet took years to become essential.

AI timelines (this is wild):

  • GPT-3 (2020): Impressive but limited (most people didn't even know it existed)
  • ChatGPT (2022): Went viral in days (everyone was talking about it)
  • GPT-4 (2023): Enterprise adoption explodes (companies started using it everywhere)
  • 2024: Every major company has an AI strategy (you're behind if you don't)
  • 2025: Students can't remember writing essays without AI (this is me and my friends)

We're compressing centuries of societal adaptation into years. Maybe months. The speed is actually terrifying if you think about it.

The Renaissance happened slowly enough that society could adjust. What happens when the revolution is faster than humans can adapt? We might be about to find out.

# What Makes This Actually Like the Renaissance

It's not just "new technology changes things." Every era has that.

What makes this a Renaissance-level event:

1. Democratization of elite skills

  • Renaissance: Literacy and education spread beyond nobility
  • Now: Creative and analytical skills accessible to everyone

2. Power structures shifting

  • Renaissance: Church loses information monopoly
  • Now: Credentialed experts lose knowledge monopoly

3. New forms of creation

  • Renaissance: Perspective in art, scientific method, new literature
  • Now: AI-assisted art, code, writing, research, analysis

4. Fundamental questions about human value

  • Renaissance: "What makes humans special if not divine creation?"
  • Now: "What makes humans special if AI can do our jobs?"

5. Economic restructuring

  • Renaissance: Rise of merchant class, decline of feudalism (big shift)
  • Now: Rise of... something. We don't know yet. That's the scary part. What jobs will exist? Who will have power? We're making it up as we go.

# The Part Nobody Wants to Talk About

The Renaissance was great if you were a wealthy merchant in Florence. Less great if you were a monk whose job was copying manuscripts. Your entire career just became obsolete.

Jobs AI is already replacing (or making way easier, which means fewer people needed):

  • Customer service (chatbots are good enough now, and they're getting better)
  • Basic copywriting (GPT writes product descriptions that are fine)
  • Simple coding tasks (Copilot handles boilerplate, so why hire junior devs?)
  • Entry-level data analysis (AI does preliminary analysis faster)
  • Language translation (good enough for common languages, and way cheaper than humans)

Jobs AI will probably replace soon:

  • Most administrative work
  • Junior developer positions
  • Content moderation
  • Basic tutoring and education
  • Preliminary legal research
  • Medical diagnosis assistance

"But humans will always be needed for creativity and judgment!"

Renaissance scribes said the same thing. Turns out most people didn't need beautiful calligraphy, they just needed information. The beautiful stuff was nice, but not necessary.

Most companies don't need perfect human judgment. They need fast, cheap, good-enough decisions at scale. And AI does that better. Or at least cheaper.

# Are We Okay With This?

Here's the uncomfortable question: should we be building technology that makes us dependent on it?

The printing press was probably good for humanity overall. But it also enabled propaganda, misinformation, and weapons of mass persuasion.

I'm not convinced AI will be good for humanity overall. It's enabling:

  • Surveillance at unprecedented scale (we're being watched constantly)
  • Misinformation that's impossible to detect (deepfakes, AI-generated content)
  • Job displacement faster than we can retrain people (what do you do when your job disappears?)
  • Concentration of power in whoever controls the AI (a few companies control everything)
  • A race to the bottom where everyone must use AI or become obsolete (no choice)

The Renaissance gave us science and art. It also gave us industrialized warfare and colonialism. Good and bad, always mixed together.

The difference? We're moving faster this time, with less time to correct course. And the people building AI have profit motives, not humanity's best interests. That's concerning.

# What This Means for People My Age

I'm 17. My entire career will happen in a world where AI is normal. That's wild to think about. I'll never know what it was like before.

Right now I'm juggling being SPL of my Scout troop, the mountain bike team, AP Physics (super hard, but genuinely rewarding), and French class, all while trying to figure out what I want to do with my life in a world where AI is changing everything. It's a lot.

Things I'm certain about:

  • "AI literacy" will be as basic as computer literacy (you'll need it)
  • Jobs will require human+AI collaboration, not one or the other (the combination beats either alone)
  • The ability to use AI effectively will be the new divide (like the digital divide, but worse)

Things I'm uncertain about (and honestly worried about):

  • Which jobs will still exist in 10 years (will programming even be a job?)
  • Whether "AI-proof" skills are real or just hopium (probably hopium)
  • If universal basic income becomes necessary (maybe? probably?)
  • What happens when AI is better than humans at everything (then what?)

What concerns me most (the real worries):

  • Most people don't understand how AI works, just that it "works" (including me, honestly)
  • We're building dependencies before understanding the risks (putting the cart before the horse)
  • The tech industry controls the narrative about AI being "helpful" (they have a vested interest)
  • Skills are being lost faster than we're adapting (we're forgetting how to do things)
  • Nobody's asking if we should do this, just how fast we can (no one's hitting pause)

# The Renaissance Metaphor Breaks Down

One key difference: the printing press was a tool. It didn't get better on its own. Once you had a printing press, it stayed the same. The technology was stable.

AI keeps improving. GPT-4 is more capable than GPT-3, and GPT-5 will be more capable than GPT-4. There's no ceiling in sight. It just keeps getting better, faster.

The Renaissance eventually stabilized. Society adjusted. New norms emerged. Things settled into a new normal.

AI might never stabilize. We might be in permanent revolution, where every few years a new capability emerges and reshapes everything again. How do you plan for that? How do you build a career when the rules keep changing?

  • Renaissance: 1400-1600 (200 years of change)
  • Industrial Revolution: 1760-1840 (80 years of change)
  • Digital Revolution: 1950-2000 (50 years of change)
  • AI Revolution: 2020-??? (ongoing, accelerating)

What happens when change is faster than human adaptation? We might be about to find out.


Reading "The Structure of Scientific Revolutions" for AP Physics (yes, really) and Kuhn talks about how paradigm shifts happen when the old way of thinking can't explain new observations. Feels relevant here. We're definitely in a paradigm shift, whether we like it or not.

Also, my Scout troop is planning a camping trip and I'm supposed to be organizing it as SPL but I'm writing this instead. Priorities, I guess.