
Do Electric Sheep Dream of JIRA

December 30, 2025
5 min read

The more I use AI tools for coding, the more one question keeps surfacing. What happens when writing code stops being the job?

Not “what happens when AI takes our jobs” (the tired refrain). Something more specific. What happens when the act of writing code becomes a smaller and smaller slice of what it means to be a software developer?

Gutenberg Has Entered the Chat

Before the printing press, books were copied by hand. Scribes spent years mastering the craft: precise penmanship, accuracy across hundreds of pages. It was skilled, essential work.

Then Gutenberg came along, and the bottleneck moved.

The world still needs books; the demand even grew. But the scribes’ craft was no longer the bottleneck. The industry moved on: some became printers, some found niches in decorative calligraphy; most found other work entirely.

The scribes who struggled most were likely the ones who thought the job was penmanship. One monk, Johannes Trithemius, wrote an entire treatise arguing that printed books would “never be the equivalent of handwritten codices.” It was distributed via the printing press.

The job was never penmanship. The job was books: sharing the finished product. Penmanship was just the mechanism. Artisanal calligraphy still exists, though it doesn’t scale. And scale is where the work went.

Which is why, in the back of my mind, I keep wondering: will coding go the way of penmanship?

Was Typing the Hard Part?

The role on the other side of this shift looks familiar: understanding how pieces fit together, breaking work into chunks that can be executed (by AI or otherwise), reviewing output, catching the subtle bugs that slip through.

Architecture and design. The parts that happen before and after fingers hit keyboard.

The baseline expectation shifts. Less time typing, more time knowing what to type. You need to hold the whole system in your head (or at least know which parts you don’t understand and why that matters).

The junior developer role, as it exists today, gets awkward. If the entry-level task is “write this function” or “debug this crash” and an AI can do that, what’s the on-ramp? Perhaps something like “review what the AI wrote” or “design the system the AI will build.” Something like a universal reset. The rules of the game are being rewritten, and nobody knows what they’ll look like yet. We can only try to play along as it unfolds.

For the Love of Code

Some people got into software because they love writing code. The puzzle of it. The satisfaction of a clean implementation. The flow state of building something line by line.

That’s valid. There’s an identity question lurking here. If you didn’t write the code, are you still a developer? The output works. It ships. Does it matter whose fingers typed it?

We’ve been fortunate that loving the craft and getting paid for it were so closely aligned; as generative AI becomes more prevalent and capable, that alignment may start to drift.

Companies aren’t paying for code. They’re paying for products, features, problems solved. Code was the mechanism. For decades, it was the only mechanism. If you wanted software, someone had to write it. That’s changing.

The craft won’t disappear, though the ratio shifts: fewer people writing code, more software getting shipped. In niches where the training data is light (esoteric languages, novel problem spaces, highly specialized domains), the old way persists longer; the tools just aren’t as useful there yet. Progress marches on, and it may only be a question of when, not if.

Here be Dragons

There’s a type of engineer who thrives when dropped into an unfamiliar codebase. Where some see dragons, these folks see a puzzle. Questionable documentation. Interesting architecture choices. Original authors long gone. They start digging. They form hypotheses. They make it better without rewriting everything from scratch.

These engineers will feel right at home as Gen AI adoption spreads.

Working with AI-generated code requires the same skill set. You didn’t write it. You don’t have the context the “author” had (because there wasn’t really an author). You need to understand it anyway, spot the problems, make targeted fixes (either making tweaks or instructing an AI to do so).

The mindset transfers directly: treat the code as something to understand and improve, not something to own and control.

The scribes who adapted to the printing press probably shared something similar. The book wasn’t theirs in the same way anymore. But they could still make it better.

Strangers in a Strange Codebase

On the other side are engineers who prefer the familiar. Their instinct when facing a messy codebase: “this needs to be rewritten.” Not because they can’t work with it, but because starting fresh feels safer. They’re more comfortable in code they’ve built themselves.

These tend to be the same folks who are slower to adopt AI tools.

Same underlying preference, different context. If your comfort comes from knowing exactly how the code works because you wrote it, AI-generated code feels unsettling. You can understand code you didn’t write. It just takes more effort, and the instinct is to avoid that friction.

Codebases grow. Teams change. The skill was always “work effectively with code you didn’t write.” AI just makes that unavoidable, all at once. The universal reset.

What Now?

I don’t have answers here. The rules are still being written, and I’m figuring them out same as everyone else.

If you love the craft of coding, there’s likely still a place for that. Artisanal software will have its niche. But banking your career on it feels risky.

There’s no Voight-Kampff test for this. No clean way to distinguish “real developer” from “prompts effectively.” Maybe that distinction was never the point. The scribes who adapted to the printing press didn’t keep proving they could write by hand. They let go of that identity and found where they were still needed.

Summary (The Safer Bet)

Get comfortable with the parts that aren’t going away: system design, debugging someone else’s work, knowing what questions to ask, recognizing when the AI-generated solution is subtly wrong.

Become the person who can work with any code, regardless of who (or what) wrote it.

The printing press didn’t kill literacy. It demanded a different kind of literacy. The same thing is happening here.