I have no idea what to specialize in, what skills I should master, or where I should be spending my time to build a successful career.
Seems like we’re headed toward a world where you either automate someone else’s job or get automated yourself.
It's not encouraging from the point of view of studying hard, but the evolution of work over the past 40 years suggests that your field probably won't be quite your field in just a few years. Not because your field will have been made irrelevant, but because you will have moved on. Most likely that will be fine: you will learn more as you go, hopefully moving from one relevant job to the next very different but still relevant job. Or, straight out of school, you will work in very multi-disciplinary jobs anyway, where it will seem that not much of what you studied matters (it will, but not in obvious ways).
Certainly, if you were headed into a very specific job that seems obviously automatable right now (as opposed to one where the tools will be useful), don't do THAT. Don't train as a typist as the core of your job in the middle of the personal computer revolution, and don't specialize in hand-drawing IC layouts in the middle of the CAD revolution, unless you have a very specific plan (court reporting? DRAM?).
The technical act of solving well-defined problems has traditionally been considered the easy part. The real role of a technical expert has always been to ask the right questions and figure out the exact problem you want to solve.
As long as AI just solves problems, there is room for experts with the right combination of technical and domain skills. If we ever reach the point where AI takes the initiative and makes human experts obsolete, you will have far bigger problems than your career.
One thing that isn’t clear is how much agency AGI will have (or how much we’ll want it to have). We humans have our agency biologically programmed in—go forth and multiply and all that.
But the fact that an AI can theoretically do any task doesn’t mean it’s actually going to do it, or do anything at all for that matter, without some human telling it in detail what to do. The bull case for humans is that many jobs just transition seamlessly to a human driving an AI to accomplish similar goals with a much higher level of productivity.
And worrisome, because school propaganda, for example, teaches that "saving the planet" is the only ethical goal for anyone. If AGIs latch onto that, if it becomes their religion, humans are in trouble. But for now, what goals an AGI would choose for itself is anyone's guess (with some cool ideas in sci-fi).
I argue that CAD was a general solution - and it still demanded people who knew what they wanted and what they were doing. You can screw around with excellent tools for a long time if you don't know what you are doing. The tool will give you a solution - to the problem that you misstated.
I argue that globalisation was a general solution. And it still demanded people who knew what they were doing to direct their minions in far-flung countries.
I argue that the purpose of an education is not to learn a specific programming language (for example). It's to gain some understanding of what's going on (in computing), (in engineering), (in business), (in politics). This understanding is portable and durable.
You can do THAT - gain some understanding - and that is portable. I don't contest that, if broad AGI is achieved cheaply and soon, the changes will be larger than those from globalisation. If the AGIs prioritize heading to Mars, let them (see Accelerando) - they are no longer relevant to you. Or trade between them and the humans. Use your beginning of an understanding of the world (gained through this education) to find something else to do. Same as if you started work 2 years ago and wanted to switch jobs. Some jobs WILL have disappeared (typing-pool typist). Others will use the AGIs as tools, because the AGIs don't care or are too clueless about THAT field. I have no idea which fields will end up with clueless AGIs. There is no lack of cluelessness in the world; plenty to go around even with AGIs. A self-respecting AGI will have priorities.
It doesn't matter if you are bad at using the tool if the AGI can just use it effectively for you.
From there, it's a simple leap to the AGI deciding to eliminate this human distraction (inefficient, etc.).
Yet GPT doesn’t even get past step 1: doing something unprompted in the first place. I’ll become worried when it does something as simple as deciding to start a small business and actually doing the work.
also https://mashable.com/article/chatgpt-messaging-users-first-o...
Of course it's also yet another case where the AI takes over the creative part and leaves us with the mundane part...
Yes, a new tool is coming out, and it will improve exponentially.
Yes, the nature of work will be different in 20 years.
But don’t you still need to understand the underlying concepts to make valid connections between the systems you’re using and drive the field (or your company) forward?
Or from another view, don’t we (humanity) need people who are willing to do this? Shouldn’t there be a valid way for them to be successful in that pursuit?