Hacker News
As a principal engineer I feel completely let down. I've spent decades building up and accumulating expert knowledge and now that has been massively devalued. Any idiot can now prompt their way to the same software. I feel depressed and very unmotivated and expect to retire soon. Talk about a rug pull!

My experience is that people who weren't very good at writing software are the ones now "most excited" to "create" with a LLM.

Nah man. I understand the frustration, but this is a glass is half empty view.

You have decades of expert knowledge, which you can use to drive the LLMs in an expert way. That's where the value is. The industry or narrative might not have figured that out yet, but it's inevitable.

Garbage in, garbage out still very much applies in this new world.

And just to add, the key metric of good software hasn't changed, and won't change. It's not even about writing the code, the language, the style, the clever tricks. What really matters is how well the code performs 1 month after it goes live, 6 months, 5 years. This game is a long game. And not just how well the computer runs the code, but how well the humans can work with the code.

Use your experience to generate the value from the LLMs, because they aren't going to generate anything by themselves.

> What really matters is how well the code performs 1 month after it goes live, 6 months, 5 years.

After 40 years in this industry—I started at 10 and hit 50 this year—I’ve developed a low tolerance for architectural decay.

Last night, I used Claude to spin up a website editor. My baseline for this project was a minimal JavaScript UI I’ve been running that clocks in at a lean 2.7KB (https://ponder.joeldare.com). It’s fast, it’s stable, and I understand every line. But for this session, I opted for Node and neglected to include my usual "zero-framework" constraint in the prompt.

The result is a functional, working piece of software that is also a total disaster. It’s a 48KB bundle with 5 direct dependencies—which exploded into 89 total dependencies. In a world where we prioritize "velocity" over maintenance, this is the status quo. For me, it’s unacceptable.

If a simple editor requires 89 third-party packages to exist, it won't survive the 5-year test. I'm going back to basics.
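The dependency explosion is easy to audit yourself. A minimal sketch, assuming an npm v2/v3 `package-lock.json` (whose `packages` map holds one entry per installed package plus a `""` key for the root project); the lockfile content below is synthetic, not from the actual editor project:

```python
import json

def count_locked_packages(lock_text: str) -> int:
    """Count unique installed packages recorded in an npm v2/v3
    package-lock.json. Keys in the "packages" map look like
    "node_modules/<name>"; the empty-string key is the root project."""
    lock = json.loads(lock_text)
    return sum(1 for key in lock.get("packages", {}) if key)

# Tiny synthetic lockfile: one direct dependency pulling in two transitive ones.
example = json.dumps({
    "name": "editor",
    "packages": {
        "": {"name": "editor"},
        "node_modules/express": {"version": "4.18.2"},
        "node_modules/body-parser": {"version": "1.20.1"},
        "node_modules/qs": {"version": "6.11.0"},
    },
})
print(count_locked_packages(example))  # → 3
```

Running the same count against a real lockfile is how a "5 direct dependencies" install turns out to be 89 packages in total.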

I'll try again but we NEED to expertly drive these tools, at least right now.

I always tell Claude, choose your own stack but no node_modules.
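One way to make that instruction stick is a project instructions file the agent reads at the start of every session. A hypothetical CLAUDE.md-style fragment (the wording and structure here are an assumption, not a prescribed format):

```markdown
# Project constraints

- Choose your own stack, but no node_modules: zero third-party
  dependencies and no build step.
- Vanilla JavaScript, HTML, and CSS only; target evergreen browsers.
- Keep the shipped bundle small (a few KB); prefer deleting code
  over adding it.
```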

What's missing is another LLM dialog between you and Claude. One that figures out your priorities, your non-functional requirements, and instructs Claude appropriately.

We'll get there.
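That intermediate step can be sketched mechanically: interview the user once about their priorities, then compile the answers into a standing preamble for the coding agent. A toy sketch, where the question keys and rule wording are hypothetical, not any real tool's API:

```python
def compile_preamble(answers: dict[str, str]) -> str:
    """Turn interview answers about priorities and non-functional
    requirements into an instruction preamble for a coding agent."""
    rules = []
    if answers.get("dependencies") == "minimal":
        rules.append("Use no third-party dependencies unless explicitly asked.")
    if "runtime" in answers:
        rules.append(f"Target runtime: {answers['runtime']}.")
    if "horizon" in answers:
        rules.append(f"Optimize for maintainability over a {answers['horizon']} lifetime.")
    return "\n".join(rules)


preamble = compile_preamble({
    "dependencies": "minimal",
    "runtime": "vanilla browser JavaScript, no build step",
    "horizon": "5-year",
})
print(preamble)
```

The point is that the non-functional requirements get captured once, explicitly, instead of being silently dropped from each prompt.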

> Any idiot can now prompt their way to the same software.

I must say I find this idea, and this wording, elitist in a negative way.

I don't see any fundamental problem with democratization of abilities and removal of gatekeeping.

Chances are, you were able to accumulate your expert knowledge only because:

- book writing and authorship was democratized away from the church and academia

- web content publication and production were democratized away from academia and corporations

- OSes/software/software libraries were all democratized away from corporations through open-source projects

- computer hardware was democratized away from corporations and universities

Each of the above must have cost some gatekeepers some revenue and opportunities. You were not really an idiot just because you benefited from any of them. Analogously, when someone else benefits at some cost to you, that doesn't make them an idiot either.

This is technically true in a lot of ways, but it's also intellectualizing rather than identifying with what the comment was expressing. It's legitimately very frustrating to have something you enjoy democratized and to feel like things are changing.

It would be like if you put in all this time to get fit and skilled on mountain bikes and there was a whole community of people, quiet nature, yada yada, and then suddenly they just changed the rules and anyone with a dirt bike could go on the same trails.

It's double damage for anyone who isn't close to retirement and built their career and invested time (i.e. opportunity cost) into something that might become a lot less valuable and then they are fearful for future economic issues.

I enjoy using LLMs and have stopped writing code, but I also don't pretend that change isn't painful.

> I don't see any fundamental problem with democratization of abilities and removal of gatekeeping.

It was very democratized before, almost anyone could pick up a book or learn these skills on the internet.

Opportunity was democratized for a very long time, all that was needed was the desire to put in the work.

OP sounds frustrated, but at the same time the societal promise that worked for the longest time (spend personal time specializing and be rewarded) has been broken, so I can understand that frustration.

I'm mad about Ozempic. For years I toiled, eating healthy foods while other people stuffed their faces with pizza and cheese burgers. Everybody had the opportunity to be thin like me, but they didn't take that and earn it like me. So now instead of being happy about their new good fortune and salvaged health, I'm bitter and think society has somehow betrayed me and canceled promises.

/s, obviously I would hope except I've actually seen this sentiment expressed seriously.

> My experience is that people who weren't very good at writing software are the ones now "most excited" to "create" with a LLM.

I've been a tech lead for years and have written business critical code many times. I don't ever want to go back to writing code. I am feeling supremely empowered to go 100x faster. My contribution is still judgement, taste, architecture, etc. And the models will keep getting better. And as a result, I'll want to (and be able to) do even more.

I also absolutely LOVE that non-programmers have access to this stuff now too. I am always in favor of tools that democratize abilities.

Any "idiot" can build their own software tailored to how their brains think, without having to assemble gobs of money to hire expensive software people. Most of them were never going to hire a programmer anyway. Those ideas would've died in their heads.

> I also absolutely LOVE that non-programmers have access to this stuff now too. I am always in favor of tools that democratize abilities.

Programming was already “democratized” in the sense that anyone could learn to program for free, using only open-source software. Making everyone reliant on a few evil megacorporations is the opposite of democratization.

You know what they mean by that term, it's about building things without needing to put in the learning effort. I have bosses building small POCs via vibe coding, something they would not have done via learning to code and typing it manually.

It's the same sort of argument artists use when it comes to AI-generated media. There obviously is a qualitative difference between people now being able to generate whatever they want and needing to draw something by hand, so saying "they could've just learned to draw themselves" is not very convincing. People don't want to do that yet still get an output, and I see nothing wrong with that; if you do, it's just another sort of gatekeeping: the idea that the "proper" way is to learn it by hand.

Lastly, many, many open weight models exist.

What you bring to the table might be fine, but how long do you think you'll find employers willing to still pay for this?

One thing is for sure: LLMs will bring down the cost of software per some unit and increase the volume.

But cost = revenue. What is a cost to one party is revenue to another party. The revenue is what pays salaries.

So when software costs go down the revenues will go down too. When revenues go down lay offs will happen, salary cuts will happen.

This is not fictional. Markets already reacted to this and many software service companies took a hit.

If AI completely erases the profession of software developer, I'll find something else to do. Like I can't in good faith ever oppose a technology just because it's going to make my job redundant, that would be insane.
I don't have an answer for this, and won't pretend to.

But my take on this is that accountability will still be a purely human factor. It still is. I recently let go of a contractor who was hired to run our projects as a Scrum/PM, and his tickets were so bad (there were tickets with 3 words in them; one ticket in the current sprint was blocked by a ticket deep in the backlog; basic stuff). When I confronted him about them, he said the AI generated them.

So I told him that:

1. That's not an excuse; his job is to verify what the AI generated and ensure it's still good.

2. That actually makes it look WORSE: not only did he do nearly zero work, he didn't even check the most basic outputs. And I'm not anti-AI; I expressly said that we should absolutely use AI tools to accelerate our work. But that's not what happened here.

So you won't get to say (at least I think for another few years) "my AI was at fault" – you are ultimately responsible, not your tools. So people will still want to delegate those things down the chain. But ultimately they'll have to delegate to fewer people.

> What you bring to the table might be fine, but how long do you think you'll find employers willing to still pay for this?

I'm assuming that the software factory of the future is going to need Millwrights https://en.wikipedia.org/wiki/Millwright

But builders are builders. These tools turn ideas into things, a builder's dream.

Just sold a house and moved out after being laid off in mid-January from a govt IT contractor (there for 8 great years, mostly remote). I started my UX research, design, and front-end web development career in 2009, but now I think it's almost a stupid, go-nowhere, vanishing career, thanks to AI.

I think, much like you, that AI is and will just continue to destroy the economy! At least I got to sell a house and make a profit to stash away for when the big AI market crash happens (hopefully not a 2030 great depression, though), since in a down market buying stocks, bitcoin, and houses is always cheaper.

Any given system will still need people around to steer the AI and ensure the thing gets built and maintained responsibly. I'm working on a small team of in-house devs at a financial company, and not worried about my future at all. As an IC I'm providing more value than ever, and the backlog of potential projects is still basically endless; why would anyone want to fire me?
"One thing is for sure: LLMs will bring down the cost of software per some unit and increase the volume.

But cost = revenue."

That is Karl Marx's labor theory of value, which has been completely disproven.

You don't charge what it costs to build something, you charge the maximum the customer is willing to pay.

> I also absolutely LOVE that non-programmers have access to this stuff now too. I am always in favor of tools that democratize abilities.

Here's the other edge of that sword. A couple of back-end devs in my department vibe-coded a standard AI-Tailwind front end of their vision for revamping our entire platform at once, which is completely at odds with the modular approach most of the team wants to take, and would involve building out a whole system based around one concrete app and 4 vaporware future maybe-apps.

And of course the higher-ups are like “But this is halfway done! With AI we can build things in 2 weeks that used to take six months! Let’s just build everything now!” Never mind that we don’t even have the requirements yet, and nailing those down is the hardest part of the whole project. But the higher-ups never live through that grind.

This scenario is not new with AI at all, though. 14 years ago I watched a group of 3 front-end devs spin up a proof of concept in ember.js (flashy front end, all fake data) and demo it to execs. They wowed the execs, and every time the execs asked "how long would it take to fix (blank) to actually show (blank)?" the devs hit F12, inspected the element, typed in what was asked for, and said "already done!".

It was missing years of backend work and had maybe 1/20th feature parity with what we already had, and it would have, in hindsight, been literally impossible to implement some of the things we would need in the future if we had gone down that path. But they were amazed by this flashy new thing that devs made in a weekend, which looked great but was actually a disaster.

I fail to see how this is any different from what people are complaining about with vibe-coded LLM stuff a decade and a half later. This was always being done and will continue to be done; it's not a new problem.

It re-emphasizes the question of what actually matters. Would a user accept their data needing an AI implementation of a ("manual") migration, and their flow completely changing? Does reliability for existing users even matter in the company's plans?

If it isn't a product that needs to solve problems reliably over time, then it was kind of silly to use a DBA who cost twice as much as a backend engineer and only handled the data niche. We progressed from there, or regressed from there, depending on why we are developing software.

The models will not keep getting better. We have passed "peak LLM" already, by my estimate. Some of the parlour tricks that are wrapped around the models will make some incremental improvements, but the underlying models are done. More data, more parameters, are no longer going to do anything.

AI will have to take a different direction.

This is really interesting to me; I have the opposite belief.

My worry is that any idiot can prompt themselves to _bad_ software, and the differentiator is in having the right experience to prompt to _good_ software (which I believe is also possible!). As a very seasoned engineer, I don't feel personally rugpulled by LLM generated code in any way; I feel that it's a huge force multiplier for me.

Where my concern about LLM generated software comes in is much more existential: how do we train people who know the difference between bad software and good software in the future? What I've seen is a pattern where experienced engineers are excellent at steering AI to make themselves multiples more effective, and junior engineers are replacing their previous sloppy output with ten times their previous sloppy output.

For short-sighted management this is all desirable, since the sloppy output looks nice in the short term; many organizations think doing this points them in the right strategic direction and are happy to downsize while blaming "AI." And for places where this never really mattered (like "make my small business landing page"), this is a complete upheaval, without a doubt.

My concern is basically: what will we do long term to get people from one end to another without the organic learning process that comes from having sloppy output curated and improved with a human touch by more senior engineers, and without an economic structure which allows "junior" engineers to subsidize themselves with low-end work while they learn? I worry greatly that in 5-10 years many organizations will end up with 10x larger balls of "legacy" garbage and 10x fewer knowledgeable people to fix it. For an experienced engineer I actually think this is a great career outlook and I can't understand the rug pull take at all; I think that today's strong and experienced engineer will command a high amount of money and prestige in five years as the bottom drops out of software. From a "global outcomes" perspective this seems terrible, though, and I'm not quite sure what the solution is.

I’m with you here.

I grew up without a mentor, and my understanding of software stalled at certain points. When I couldn't get a particular OS API to work, Google and Stack Overflow didn't exist, and I had no one around me to ask. I wrote programs for years by just working around it.

After decades writing software I have done my best to be a mentor to those new to the field. My specialty is the ability to help people understand the technology they're using: I've helped juniors understand and fix linker errors, engineers understand ARP poisoning, high school kids debug their robots. I've really enjoyed giving back.

But today, pretty much anyone, down to a middle schooler, can type their problems into ChatGPT and get a more direct answer than I would be able to give. No one particularly needs mentorship as long as they know how to use an LLM correctly.

As a Principal SWE who has done his fair share of big stuff:

I'm excited to work with AI. Why? Because it magnifies the thing I do well: make technical decisions. Coding is ONE place I do that, but architecture, debugging, etc. all use that same skill: making good technical decisions.

And if you can make good choices, AI is a MEGA force multiplier. You just have to be willing to let go of the reins a hair.

I consider myself very good at writing software. I built and shipped many projects. I built systems from zero. Embedded, distributed, SaaS: you name it.

I'm having a lot of fun with AI. Any idiot can't prompt their way to the same software I can write. Not yet anyways.

IMHO any idiot can create a piece of crap. It takes experience to create good software. Use your experience, Luke! Now you have a team of programmers to create whatever you fancy! It's been great for me, but I have only been programming C++ for 36 years.
Same here, although hopefully won't be retiring soon.

What's missing from this is that iconic phrase that all the AI fans love to use: "I'm just having fun!"

This AI craze reminds me of a friend. He was always artistic, but because of the way life goes he never really had the opportunity to actively pursue art and drawing skills. When AI first came out, and specifically MidJourney, he was super excited about it and used it a lot to make tons and tons of pictures for everything his mind could think of. However, after a while this excitement waned and he realized that he hadn't actually learned anything at all. At that point he decided to find some time to practice drawing, to be able to make things with his own skills rather than by some chip on the other side of the world, and he has greatly improved in the past couple of years.

So, AI can certainly help create all the "fun!!!" projects for people who just want to see the end result, but in the end would they actually learn anything?

> As a principal engineer I feel completely let down. I've spent decades building up and accumulating expert knowledge and now that has been massively devalued. Any idiot can now prompt their way to the same software. I feel depressed and very unmotivated and expect to retire soon. Talk about a rug pull!

Really?

The vibe coders are running into a dark forest full of lobsters (OpenClaw), getting lost and confused in their own tech debt, and you're saying they can prompt their way to the same software?

Someone just wiped their entire production database with Claude, and you believe your experience counts for nothing at companies that need stable infrastructure and predictability?

Cognitive debt is a real thing, and being unable to read and fix broken code is going to be an increasing problem, one that experienced engineers can solve.

Do not fall for the AI agent hype.

I urge you to actually try these tools. You will very quickly realize you have nothing to worry about.

In the hands of a knowledgeable engineer these tools can save a lot of drudge work because you have the experience to spot when they’re going off the rails.

Now imagine someone who doesn’t have the experience, and is not able to correct where necessary. Do you really think that’s going to end well?

No offense, but you sound more like a "principal coder", not a principal engineer. At least in many domains and orgs, most principal engineers already spend most of their time not coding. But engineering still takes up much or most of their time.

I felt what you describe feeling. But it lasted like a week in December. Otherwise there’s still tons of stuff to build and my teams need me to design the systems and review their designs. And their prompt machine is not replacing my good sense. There’s plenty of engineering to do, even if the coding writes itself.

Really? I love LLMs because I can't stand the process of taking the model in my brain and putting it in a file. Flow State is so hard for me to hit these days.

So now I spec it out, feed it to an LLM, and monitor it while having a cup of tea. If it goes off the rails (it usually does) I redirect it. Way better than banging it out by hand.

What I keep hearing is that the people who weren't very good at writing software are the ones reluctant to embrace LLMs because they are too emotionally attached to "coding" as a discipline rather than design and architecture, which are where the interesting and actually difficult work is done.
On the bright side, working in tech between 2006 and 2026 means you should be extremely wealthy and able to retire comfortably.