
A Pragmatic Developer’s Take on ChatGPT

I came across this post the other day that perfectly captured something I’ve felt for years but could never quite articulate:

Coding is like taking a lump of clay and slowly working it into the thing you want it to become. It is this process, and your intimacy with the medium and the materials you’re shaping, that teaches you about what you’re making – its qualities, tolerances, and limits – even as you make it. You know the least about what you’re making the moment before you actually start making it. That’s when you think you know what you want to make. The process, which is an iterative one, is what leads you towards understanding what you actually want to make, whether you were aware of it or not at the beginning. Design is not merely about solving problems; it’s about discovering what the right problem to solve is and then solving it. Too often we fail not because we didn’t solve a problem well but because we solved the wrong problem.

When you skip the process of creation you trade the thing you could have learned to make for the simulacrum of the thing you thought you wanted to make. Being handed a baked and glazed artefact [sic] that approximates what you thought you wanted to make removes the very human element of discovery and learning that’s at the heart of any authentic practice of creation. Where you know everything about the thing you shaped into being from when it was just a lump of clay, you know nothing about the image of the thing you received for your penny from the vending machine.

Aral Balkan (@aral@mastodon.ar.al)

As someone who’s been writing code for more than 25 years, this resonated deeply. The act of creating software—and discovering the right approach through iteration—is central to how I work. You learn by shaping. You gain clarity through the process.

It also got me thinking about the growing divide around AI tools in software development. Some people love how quickly they can ship by feeding instructions into Cursor to build entire features. Others take a hard stance against using AI at all.

I fall somewhere in the middle.

I’ve found that taking a “never” stance on tools—especially ones changing how we work—is usually shortsighted. Yes, there are real problems with AI: ethical, environmental, economic. And yes, it’s going to be harder to train new engineers if companies stop hiring juniors.

But I also live in the real world. Software is not just what I love doing—it’s how I provide for my family. It would be foolish to ignore how my industry is evolving.

So, how do I use AI tools like ChatGPT?

I’m Dictating Something to You, Help Organize My Thoughts

Looping is my side project. Between a full-time job and family life, my time to work on it is limited. So I try to make every hour count.

At the start of each new feature set, I’ll often open a conversation with ChatGPT—not to write the code, but to plan. I’ll talk through rough ideas, sketch out requirements, and explore different ways to structure the work.

Sometimes, I literally dictate my thoughts. While folding laundry or driving, I’ll start voicing what I’m thinking about building. What the goals are. What trade-offs I see. What questions I’m still wrestling with.

The key here is that I’m not asking ChatGPT to be clever. I’m just asking it to organize my thoughts. Not summarize them. Not invent anything. Just help me take a mental pile of ideas and turn it into a structured outline.

Then when I do get to my desk, I can pick up right where I left off, already in motion.

You’re a Principal Engineer at Looping

Once I have some structure, I treat ChatGPT as a stand-in for a very senior peer. I brain dump everything I’m planning to implement—architecture, naming, constraints—and ask it questions.

The code it generates is mostly throwaway. I treat it like pseudocode. Even when I give it examples of my own patterns, it rarely uses them correctly. That’s fine. I’m not asking it to write production-ready code. I’m using it to think through design.

What I do rely on is the iterative back-and-forth: proposing a direction, challenging it, refining it. It helps me spot dead ends quickly. It helps me recognize when something feels too complex or drifts from my core principles.

Once I’ve landed on a direction, I’ll ask it to generate a GitHub issue description. By that point, I’ve usually settled on the requirements, approach, naming, and structure—so this helps me re-establish context quickly the next time I sit down to work, even if it’s days later.

For side projects, keeping ramp-up time low is essential. This has been one of the most useful ways ChatGPT helps me stay productive.

You’re a Peer Providing a Code Review

After I’ve written the code myself, I’ll often return to the same ChatGPT thread. If something didn’t turn out the way I expected, I’ll explain what happened, what I learned, and how I changed my mind.

Sometimes I’ll paste in the actual implementation to compare against the earlier plan. This helps when I’m building adjacent features or want to reflect on decisions. I’ll also ask for comments or suggestions on the code—not because I’ll blindly follow them, but because it’s useful to get another perspective, even if it’s synthetic.

ChatGPT becomes a second set of eyes—available 24/7, non-judgmental, and surprisingly helpful at spotting inconsistencies or asking questions I hadn’t considered.

You’re a Helpful Assistant

At the end of the day, I’m the one making the decisions.

ChatGPT helps me sharpen my approach, reduce waste, speed up the transition from idea to execution, and lower the overhead of context switching. But I’m never handing over the clay. I’m still the one shaping it.

And that, I think, is the key.

You don’t have to give up your creativity, design, or exploration to benefit from AI tools. You just have to be thoughtful about how you use them—and what role you want them to play.

Using AI doesn’t mean outsourcing the hard parts. It means bringing a thoughtful assistant into the room—one that helps you work faster, but still lets you do the thinking.