Discussion about this post

Niko
Oct 15 (edited)

Warhammer 40k coming early. We'll all be tech-priests soon 🙏.

Thanks for another insightful article! I wonder how you think senior engineers should prepare for the near- to medium-term future.

I have ~12 years of professional experience, ranging from data engineering to web development. I personally haven't experienced much of an increase in productivity from AI coding tools. Outside of small scripts and starting points for things you shouldn't be implementing yourself anyway (e.g. JSON float parsing), most of my experiments with AI coding tools end in frustration. When I'm in a mature code base that I'm familiar with, I already have a couple of solutions in my head. Typing them out is not the problem. On the contrary, the process itself forces me to interact with the code base and shows me which of the existing patterns make this new implementation difficult. I can then choose a different approach or refactor first. As a side effect I become intimately familiar with that code base, and that familiarity is where the real productivity gains come from. This is something that most managers (that I've met) under-appreciate, causing them to under-value retention and team stability. You are at your most useless when you inherit a legacy code base. With AI-written code, it always feels that way.

During the early stages of your engineering career, you are mostly trying to get the thing to work at all. In that stage, AI can help, but I'm worried that it also robs you of the opportunity to think through hard problems yourself. However, as a senior you should be choosing from several working solutions and considering the impact on the rest of the system and the team. AI is bad at that. I was hoping that it could speed up prototyping of ideas, but that doesn't seem to work very well in larger code bases where a feature touches many different places. It just creates a huge mess. It never simplifies, always complicates, breaks stuff, and then capitulates. (Or, my personal favorite: it claims to fix a bug by placing a comment that says "// Here we fix the bug that ...")

My personal take-away is to focus on strengthening my fundamentals, so that future me is in a better position to deal with the inevitable problems that the AI is unable to fix, or that the AI itself caused. To the extent that AI tools – mostly the chat kind – can help with that, I will use them. However, I don't feel that I have much of an edge over less experienced developers "driving" the AI coding agents, so I'm trying to avoid it as much as possible. Unfortunately, that is not a stance that I can comfortably publicize in my organization. It feels a bit like I'm undercover in a cult.

You probably know the Principal Skinner meme: "Am I so out of touch? No, it's the children who are wrong!" Well, nobody wants to be that guy, so I intend to check on the latest tools every few months. Would you say that I'm missing the boat by not committing to AI coding? How would you strike a good balance? Also, should I recommend that juniors rely less on AI tools, even if that means delivering less? That might hurt their performance reviews in the current climate.

Ken Granville

I strongly agree with you, Denis. I believe there is a way to address this problem: treat AI as a tool and assess it on its real merits. If business leaders do that, it should become clear to them that there is a better, more human-aligned path.

