The approaching mass AI layoffs

Who’s up for a bit of paranoia today? I agree with the trends the writer describes, but there’s too much drama in it. The link to the opinion piece:

So, let’s begin. Indeed, plenty of people in the office work on performance assessments, billable hours, net margin, etc. Heck, these roles are properly classified as overhead.

How many roles essentially consist of processing information and then presenting it to someone to make a decision? Now not only will the processing and reporting be automated, but perhaps the decision as well. This will result in the great disemboweling of white-collar jobs.

What worries me isn’t whether it’s true, but that there’s a real risk the “stock market” will reward AI adoption measured in fired employees. The balance sheet will look great for a few quarters. Then there might be a correction, but one or two years (or more) without a job is a high price to pay for corporate trial and error.

This automation wave will kick millions of white-collar workers to the curb in the next 12 - 18 months. As one company starts to streamline, all of their competitors will follow suit. It will become a competition because the stock market will reward you if you cut headcount and punish you if you don’t. As one investor put it, “Sell anything that consists of people sitting at a desk looking at a computer.”

I like the name; it seems appropriate:

I’ve started to call this displacement wave the Fuckening because that feels more visceral.

I don’t think the 20-50% decrease in employment the writer mentions will materialize. But unemployment going up 2% is disruption enough.

Mid-career office workers will be fired in droves. Right now, there are about 70 million white-collar workers in the United States. Expect that number to be reduced substantially, by 20 – 50% in the next several years. Even a reduction of several million would be tectonic, and I fully expect it to go well beyond that level.

The opinion piece is centered on the US. There are things that might be a couple of sneezes in the US but turn into pneumonia in Switzerland. People trust the system here. I spent some years at university, and the Swiss were a minority there. Why? Because graduate education is a serious hit to your career income. Why spend years at university earning less than an immigrant plastering gypsum on a wall when you can earn the average or above with your business-related EFZ/CFC? I guess this is the biggest hazard from AI for Switzerland: people who think they deserve better because they have always done what they were told to do in life. And it only takes a minor increase in unemployment.

The social contract of ‘study hard, go to school, get a good job, live a decent life’ is about to be vaporized to smithereens…People are not going to take it well. Particularly educated people who think that they deserve better. That’s an ingredient for revolt.

Hopefully, the discontent will just be an increase in bitter middle-aged readers who believe they’re not valued enough on insideparadeplatz.ch or reddit/Switzerland. Honestly, I guess it will be greater distrust of whatever looks foreign or immigrant. So, the Fuckening is a good name for this.

I think this is a question we’ll have to tackle. The one thing that might help is inertia: it takes companies a few years to adopt and integrate a technology, and a few years for AI to get better, so there’s some transition time.

This gave me the shivers a week or so back…

For years, AI had been improving steadily. Big jumps here and there, but each big jump was spaced out enough that you could absorb them as they came. Then in 2025, new techniques for building these models unlocked a much faster pace of progress. And then it got even faster. And then faster again. Each new model wasn’t just better than the last… it was better by a wider margin, and the time between new model releases was shorter. I was using AI more and more, going back and forth with it less and less, watching it handle things I used to think required my expertise.

Then, on February 5th, two major AI labs released new models on the same day: GPT-5.3 Codex from OpenAI, and Opus 4.6 from Anthropic (the makers of Claude, one of the main competitors to ChatGPT). And something clicked. Not like a light switch… more like the moment you realize the water has been rising around you and is now at your chest.

I am no longer needed for the actual technical work of my job. I describe what I want built, in plain English, and it just… appears. Not a rough draft I need to fix. The finished thing. I tell the AI what I want, walk away from my computer for four hours, and come back to find the work done. Done well, done better than I would have done it myself, with no corrections needed. A couple of months ago, I was going back and forth with the AI, guiding it, making edits. Now I just describe the outcome and leave.
Let me give you an example so you can understand what this actually looks like in practice. I’ll tell the AI: “I want to build this app. Here’s what it should do, here’s roughly what it should look like. Figure out the user flow, the design, all of it.” And it does. It writes tens of thousands of lines of code. Then, and this is the part that would have been unthinkable a year ago, it opens the app itself. It clicks through the buttons. It tests the features. It uses the app the way a person would. If it doesn’t like how something looks or feels, it goes back and changes it, on its own. It iterates, like a developer would, fixing and refining until it’s satisfied. Only once it has decided the app meets its own standards does it come back to me and say: “It’s ready for you to test.” And when I test it, it’s usually perfect.

I have a similar experience. Many evenings now I finish by leaving a job for the AI before I go to bed. In the morning I check the results, and usually the job is done.

Thanks for the thread split. I was derailing the other one 🙂

Wonder if all of that was written by AI.

Haha! Very meta…

What no one seems to consider - at least those who can do something about it - is that people put out of work will still need money to be able to buy things. If AI is going to be doing all these jobs, then the companies running those AIs should put a percentage of their profits into a fund that pays those now-redundant workers an effective wage. Or else you bring in a universal salary for everyone, funded by said AI profits.

When so many are unemployed, who will be left to buy the products or services those companies offer?
Those who remain employed will likely hoard their money for fear of joining their unemployed neighbours.

[quote="MedeaFleecestealer, post:8, topic:143447, full:true"]
What no one seems to consider - at least those who can do something about it - is that people put out of work will still need money to be able to buy things.
[/quote]

This is the question that troubles me. I don’t think there is a clear answer. But we’ve seen that companies will push for profits, so maybe it ends up as a giant social experiment.

We can’t even get tech companies to stop avoiding tax and pay what they owe, let alone get them to donate money. Trump is blocking global tax-avoidance measures and the digital taxes that aim to tax Big Tech, and threatening retaliation against them.

Plumbers and electricians

I had the thought over a decade ago that the software industry is a big scam. Products that don’t really bring any new value, but why not create another clone of X and pitch how much better it is… a silly model that kept a large share of the working population employed. Now, with AI, we’re heading toward the end of this “useless jobs / nonsense software evolution” industry. Perhaps we’ll see even more competition in the market for everything, but each of these competing products will be made by a handful of people. Of course, not all software out there falls into this useless-competition category; there are genuinely well-made products without any serious competition, and those will survive and continue to employ good developers.

Summarization, data transformation and visualization are indeed another threat. Again, I don’t believe in the long-term quality of the summaries, but instead of ten people you’d have five, or two, using AI.

I really wonder what will fill the void of so many jobs disappearing so quickly; otherwise our “developed Western civilization” will collapse. People can’t be left without hope and opportunities, especially the young. Crime will grow faster than AI stock prices.

Good point. Excel and Word are basically the same as 20 years ago: a ZIP file containing XMLs.
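
For anyone who wants to check this for themselves: the current .docx/.xlsx formats really are plain ZIP archives full of XML parts, and have been since roughly Office 2007. A minimal sketch using Python’s standard library, where “example.docx” is just a placeholder for whatever Office file you have lying around:

```python
# Inspect a .docx for what it actually is: a ZIP archive of XML parts.
# "example.docx" is a placeholder; point it at any .docx (or .xlsx) you have.
import zipfile

with zipfile.ZipFile("example.docx") as archive:
    # List the parts, e.g. word/document.xml, docProps/core.xml, ...
    for name in archive.namelist():
        print(name)

    # The document body itself is plain XML sitting inside the archive.
    with archive.open("word/document.xml") as part:
        print(part.read(500).decode("utf-8", errors="replace"))
```

(The `word/document.xml` path is specific to Word files; a spreadsheet keeps its main parts under `xl/` instead.)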