LLMs can build anything. The real question is what is worth building, and how it's delivered. That is what is still human. LLMs, by nature of not being human, cannot understand humans as well as other humans can (see every attempt at using an LLM as a therapist).

In short: LLMs will eventually be able to architect software. But it's still just a tool.

> LLMs can build anything.

This is only possibly true if one of two things is true:

1. All new software can be made up of preexisting patterns of software that can be composed. I.e., there is no such thing as "novel" software; it's all just composition of existing software.

2. LLMs are capable of emergent intelligence, allowing them to express patterns that they were not trained on.

I am extremely skeptical that either of these is true.

What is the use of a software engineer/architect at that point? It's a tool, but one that product or C levels can use directly as I see it?
Yes, for building something.

But for building the right thing? Doubtful.

Most of a great engineer’s work isn’t writing code, but interrogating what people think their problems are, to find what the actual problems are.

In short: problem solving, not writing code.

A software engineer will be a person who inspects the AI's work, same as a building inspector today. A software architect will co-sign on someone's printed-up AI plans, same as a building architect today. Some will be in-house, some will do contract work, and some will be artists trying to create something special, same as today. The brute labor is automated away, and the creativity (and liability) is captured by humans.
> It's a tool, but one that product or C levels can use directly as I see it?

Wait, I thought product and C level people are so busy all the time that they can't fart without a calendar invite, but now you say they have time to completely replace a whole org of engineers?

FWIW I find LLMs to be excellent therapists.

The commercial solutions probably don't work because they don't use the best SOTA models and/or sully the context with all kinds of guardrails and role-playing nonsense. But if you just open a new chat window in your LLM of choice (set to the highest-thinking paid-tier model), it gives you truly excellent therapist advice.

In fact, in many ways the LLM therapist is actually better than the human, because e.g. you can dump a huge, detailed rant in the chat and it will actually listen to (read) every word you said.

Please, please, please don’t make this mistake. It is not a therapist. At best, it might be a facsimile of a life coach, but it does not have your best interests in mind.

It is easy to convince and trivial to make obsequious.

That is not what a therapist does. There’s a reason they spend thousands of hours in training; that is not an exaggeration.

Humans are complex. An LLM cannot parse that level of complexity.
