AI being capable of doing anything doesn’t necessarily mean there will be no role for humans.

One thing that isn’t clear is how much agency AGI will have (or how much we’ll want it to have). We humans have our agency biologically programmed in—go forth and multiply and all that.

But the fact that an AI can theoretically do any task doesn’t mean it’s actually going to do it, or do anything at all for that matter, without some human telling it in detail what to do. The bull case for humans is that many jobs just transition seamlessly to a human driving an AI to accomplish similar goals with a much higher level of productivity.

Self-chosen goals and impetus for AGIs is a fascinating area. I'm sure people have already been working on and trying things in that direction for a few years, but I'm not familiar with any publications in that area. It's certainly not politically correct.

And worrisome, because school propaganda, for example, teaches that "saving the planet" is the only ethical goal for anyone. If AGIs latch onto that, if it becomes their religion, humans are in trouble. But for now, what goals AGIs would choose for themselves is anyone's guess (with some cool ideas in sci-fi).