LLMs can easily output overwhelming quantities of code. Junior devs couldn't really do that, not consistently.

Scale/quantity matter.

This industry is not mature enough for 1000x the bad code we have now. It was barely hanging on with 1x bad code.

Yeah. Due diligence is far more important with something like Claude because it is so fast. Get lazy for a few hours and you've easily added 20K LOC worth of technical debt to your code base, and short of reverting the commits and starting over, it won't be easy to get it to fix the problems after the fact.

It's still pretty fast even considering all the coaxing needed, but holy crap will it rapidly deteriorate the quality of a code base if you just let it make changes as it pleases.

It very much feels like how the most vexing enemy of The Flash is just some random-ass banana peel on the road. Raw speed isn't always an asset.

The cost of reverting the commits and starting over is not so high though. I find it is really good for prototyping ideas that you might not have tried previously.

It's cheap only if this happens shortly after the bad design mistakes, and there aren't other changes on top of them. Bad design decisions ossify fairly quickly in larger projects with multiple contributors outputting large volumes of code. Claude Code's own "game engine" rendering pipeline[1] is a good example of an almost comically inappropriate design that's likely to be some work to undo now that it's set.

[1] https://spader.zone/engine/
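For what it's worth, the "revert and start over" escape hatch discussed above is cheap precisely when the bad commits are still at the tip of the branch. A minimal sketch (repo, file names, and commit messages are all made up for illustration):

```shell
set -e
# Throwaway repo standing in for a real project
cd "$(mktemp -d)"
git init -q .
git config user.email "dev@example.com"
git config user.name "Dev"
git commit -q --allow-empty -m "good baseline"

# Simulate an LLM session that landed three low-quality commits
for i in 1 2 3; do
  echo "generated change $i" > "gen_$i.txt"
  git add "gen_$i.txt"
  git commit -q -m "LLM change $i"
done

# Undo the whole session as one new commit, without rewriting history
git revert --no-commit HEAD~3..HEAD
git commit -q -m "Revert LLM session"
```

Because `git revert` adds a new commit rather than rewriting history, this stays safe on a shared branch; the cost only climbs once other people's changes are layered on top of the bad ones, as the comment above points out.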
