While I'm definitely concerned that AI is a massive driver of centralization of power, expanding what we can do within the space of things physics permits is, at least in theory, massively wealth-enhancing. That is literally how we got from the pre-industrial world to today.
Controversially, I'd argue there is likely an optimal and stable level of technological advancement which we would be wise not to cross. That said, we are human, so we will; I'd just rather it happened in a couple hundred years than in a decade or two.

For example, it's hard to imagine an AI that gives us the capability to cure cancer but doesn't also give us the capability to create targeted superviruses.

Nick Bostrom's Vulnerable World Hypothesis more or less describes my own concerns: https://nickbostrom.com/papers/vulnerable.pdf

At some point we should probably resist the urge to keep pulling balls out of the urn, since we may eventually pull out one we don't want.
