I was looking at the Bureau of Labor Statistics' Employment Projections, and the World Economic Forum's Future of Jobs Report, both of which were updated/released last year. Both of them had software developers on their lists of the fastest-growing jobs. The WEF predicted that Software and Applications Developers would see Global Net Growth of 57% between 2025 and 2030, while the BLS predicted that Software Developers would grow by 15.8% between 2024 and 2034.
It's easy to look at the number of layoff notices that have rocked the technology industry in the United States and decide, on that basis, that bureaucrats don't know anything. But of course they couldn't have known what choices people were actually going to make. One can fill out a survey or answer a questionnaire, and then have other factors come into play that result in different decisions being made. And, whether we like those decisions (or their impacts on our lives) or not, people are remunerated quite handsomely to make them.
And that's what came to mind when I saw this chart, in the Future of Jobs Report. It predicts that the share of work done by people, without recourse to automation or some sort of automated enhancement, will drop from 47% in 2025 to 33% in 2030, while the share of work done solely by automation grows from 22% to 34%.

And it's with these numbers in mind, I suspect, that people proclaim dire warnings of what will happen to people who don't pivot into the jobs of the future (many of which pay less than the jobs of today). But this decline is no more a given than the increase in software development jobs was. This, too, is something that's going to be driven by the choices that people make. And maybe what's needed is for more people to be involved in those choices.
Now, Dario Amodei may be correct, and what he terms “powerful AI” may indeed create a “country of geniuses in a datacenter” that's just better at everything we do than we are. But until that comes about (and, given human history, likely even when it does), we have choices as to what we value. There's no reason to presume that it's impossible to direct where the future is going to go by adding some intentional design to the mix. I've said before that a question that bears answering is what new demand for human labor generative automation is going to create. But that buys into a hostile framing, one that posits that valuable work for humans will be relegated to the leftovers that automation, even if otherwise ubiquitous, can't do. Maybe, as people, we'd all be better off if there was an active effort to find or create, and then nurture, roles that lie outside the capacity of machines, and to start moving towards them now. (Normally, I go out of my way to avoid using the word "we," since it tends to be something of a weasel word, but here, maybe, enough of humanity is in the same boat that "we" makes sense.)
Because if we don't want the World Economic Forum's prediction that, out of every 100 workers, some "11 would be unlikely to receive the reskilling or upskilling needed, leaving their employment prospects increasingly at risk," to turn out to be true, perhaps the onus is on us to come up with something that those 11 can do that makes good use of the skills they already have.
Passively accepting the idea that automation is a bear coming for the job market, and so people's primary goal should be running faster than enough other people that the beast is satiated before it gets to them, is a recipe for disaster. The people the bear seeks to eat are unlikely to go down without a fight, and the conflict could wind up doing much more injury to the collective than the bear ever could. Here in the highly individualistic United States, this may be something of a heresy, but perhaps it's time that people decide to hang together before technology, and the incentive structures behind it, hang everyone separately.