Sorry I don’t mean to be snarky :) I think there is a happy middle ground between

“AI can’t do anything, it sucks!”

and

“AI is AGI and can do everything and your career is done for”

I teeter along that spectrum, and use LLMs with caution when learning new topics where I lack expertise.

But I’ve been very surprised by LLMs in some areas (UI design - something I struggle with - where I’ve had awesome results!)

My most impressive use case for an LLM so far (Claude 3.5 Sonnet) has been iterating on a pseudo-3D ASCII renderer in the terminal, written in C with ncurses. With its help I was able to prototype a renderer that draws an ASCII “forest” of 32 highly detailed ASCII trees (based on a single ASCII tree template), with lighting and three-axis tree placement, where you can move the “camera” towards and away from the forest.

As you move closer, trees scale up and move towards you, and overlapping trees don’t “blend” into one ASCII object - we came up with a clever boundary-highlighting solution.

Probably my favourite thing I’ve ever coded; I’ll post it to HN soon.

I absolutely agree that it's a spectrum. I don't deny the usefulness of some LLMs (and I have used Claude 3.5 lately with great success -- it helped me iterate on some very annoying code with a very niche library that would have taken me days to properly decipher myself). I got annoyed by the grandiose claims though, so I likely got a bit worked up.

And indeed:

> “AI is AGI and can do everything and your career is done for”

...this is the thing I want people to stop even implying is happening. An LLM helped you? Cool, it helped me as well. Stop claiming that programming is done for. It's VERY far from that.