It used to be that you had to have a strong understanding of the underlying machine in order to create software that actually worked.

Things like cycle times of instructions, pipeline behavior, registers and so on. You had to, because compilers weren't good enough. Then they caught up.

You used to manage every byte of memory and utilize every piece of the underlying machinery, like the different chips, DMA transfers and so on, because that's what you had to do. Now it's all abstracted away.

These fundamentals are still there, but 99.9% of developers neither care nor bother with them. They don't have to, unless they are writing a compiler or kernel, or just because it's fun.

I think what you're describing is also going to go away in the future. Still there, but most developers are going to move up one level of abstraction.

You can't "move up one level of abstraction" from computational complexity.
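A minimal Python sketch of why: an interface can hide the data structure, but not its asymptotic cost. Here the same `in` operator is backed by a list (linear scan) and a set (hash lookup), and the difference shows up no matter how abstract the syntax is. The sizes and timing harness are just illustrative choices.

```python
import timeit

# Same abstraction ("x in container"), different underlying complexity.
n = 100_000
as_list = list(range(n))
as_set = set(as_list)

needle = n - 1  # worst case for the list: a full linear scan

# O(n) per lookup for the list, O(1) average for the set.
list_time = timeit.timeit(lambda: needle in as_list, number=100)
set_time = timeit.timeit(lambda: needle in as_set, number=100)

print(f"list: {list_time:.4f}s  set: {set_time:.6f}s")
```

The gap only widens as n grows; no layer of abstraction above this code can make the linear scan stop being linear.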