Submitted by just-a-dreamer- t3_11eex2z in singularity

Suppose narrow AI programs launch a massive attack on white-collar positions. The standard solution, until now, has been to learn something else.

Let's say you go back to school and get a bachelor's or PhD in a related field; that takes 3 years. Add 1 year of job experience before one can be considered a professional worthy of a medium-to-high salary.

If it takes 4 years to perform a completely new white-collar job, won't AI develop faster than laid-off workers can climb back to the level of well-paid professional employment?

I mean, AI already takes gigantic leaps in 3-4 years. And there are student loans and the crushing cost of living to consider for those who are desperate to go back to school.




rya794 t1_jadltfe wrote

I’m pretty sure we’re already there.


PunkRockDude t1_jadrpj3 wrote

And that was before AI came into the picture. Even slower is companies' ability to support change, even when the employees are ready.


MarginCalled1 t1_jae2dx9 wrote

And unfortunately the Government is even slower.


Embarrassed-Bison767 t1_jaeax4i wrote

Don't even start. Some government departments in Germany still use fax. Public white-collar work will take longer to automate than private office jobs because of the government's reluctance to change.


sustainablenerd28 t1_jae9n8i wrote

lol, I have seen some true idiots in white-collar jobs; the most difficult thing they do every day is check their email and delegate tasks


DowntownYou5783 t1_jaegr5c wrote

It's true. Office Space is real (at least in some corners of the white-collar world). That's why it's a cult classic.


onyxengine t1_jadzeid wrote

We kinda are, if the industry experts in the field you want to join are collaborating with machine learning engineers to build an AI that streamlines their workflows and knows what they know. You’re not going to become an industry expert before that AI becomes a tool that replaces the industry experts.


TooManyLangs t1_jadvyp7 wrote


It's already faster to train an AI than a human for such tasks.

Humans are still faster at learning easy things, like playing a new game, using a new tool, or learning a new word. So... are we doomed?


SnooHabits1237 t1_jae3rde wrote

Lol yeah, I was gonna say… personally I've been studying JavaScript for only 3 months and I'm already pushing out a small app and a website with 0 prior knowledge of the internet in general. With the help of AI, of course. I'm going to keep learning, but I'm just doing stuff on my own. That's one dev job gone, because I'm not going to work for anyone and thus probably never work on a big team project.


[deleted] t1_jadxmef wrote



techy098 t1_jaeehsd wrote

I strongly disagree with this idea. Just because mass layoffs from AI taking over white-collar jobs are not happening now does not mean they won't be a reality in 5 years.

Even if AI starts replacing workers in 7-10 years, a college graduate has to worry about it; otherwise 4-5 years of college with a ton of debt is not going to serve them well.


rya794 t1_jadzdj8 wrote

The question isn't just about whether or not you could get a job in 3-4 years; it's about whether or not the investment makes sense. Unless you plan to be employed in a field for more than ~7 years, the answer is almost certainly no.

Are you confident you can identify a field that will still require your labor in 10-11 years?


visarga t1_jadzefz wrote

There's a long way from "impressive demo" to "replacing humans". Self-driving cars could impress us in demos even 10 years ago, but they still can't operate on their own, not even now.

If you work in ML, you tend to know the failure modes and issues much better than the public does, so you tend to be less optimistic. Machine learning works only when the problem is close to the training data. It doesn't generalise well; you have to get good data if you want good results.


techy098 t1_jaef8ox wrote

The only reason we do not see Google's self-driving cars on the road is cost and liability issues. Laws have yet to be written to decide how much liability must be covered when a multi-billion-dollar company is sued for billions over harm caused by an accident.

If they limit the liability to $200-300k per accident, as with human drivers, and accept all the recorded video as evidence, Google may go full scale with its self-driving system, at least in robotaxis and high-end cars, since the cost is still high (maybe around $25-30k).


Nmanga90 t1_jadq4eh wrote

Right now is that time. Time and cost are on a sliding scale with ML: the more money you commit, the faster you can train an AI. As it is, an AI can be fine-tuned on basically the entirety of the world's knowledge of a specific subject in a month, given a (relatively) significant monetary investment.
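The "sliding scale" is mostly parallelism arithmetic. Here is a minimal sketch of that tradeoff; every constant below (per-GPU throughput, utilization, compute budget) is an invented illustration, not a figure from the thread or from any real training run:

```python
# Toy model of the money-vs-time tradeoff in training: for a fixed compute
# budget in FLOPs, renting more GPUs shrinks wall-clock time proportionally.
# All constants here are illustrative assumptions, not real training costs.

def training_days(total_flops: float, num_gpus: int,
                  flops_per_gpu: float = 3e14, utilization: float = 0.4) -> float:
    """Wall-clock days to burn total_flops, assuming perfect parallel scaling."""
    flops_per_second = num_gpus * flops_per_gpu * utilization
    return total_flops / flops_per_second / 86_400  # 86,400 seconds per day

BUDGET = 1e23  # hypothetical fine-tuning compute budget in FLOPs

for gpus in (64, 512, 4096):
    print(f"{gpus:5d} GPUs -> {training_days(BUDGET, gpus):7.1f} days")
```

Real scaling is sublinear (communication overhead, diminishing utilization), so treat this as an upper bound on the speedup that money buys.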


just-a-dreamer- OP t1_jadqqto wrote

In that case, you cannot keep pace with AI as a displaced white-collar worker.

If you need new education to fill a different professional position, chances are AI will develop faster than you can upskill to that level.


visarga t1_jae1edb wrote

> you cannot keep pace with AI

We are not competing with AI. We are competing with other people who use AI. Everyone has and will have AI. Using AI won't give you a comparative advantage in 2030.

Companies that want to scale AI need people. AI really shines when it is supported; you need people around it to maximise its value.

If you want to get rid of your human employees and use only AI, your competition will eat your lunch. They will team up AI with humans and be faster and more creative than you. Competition won't allow companies to simply get rid of people.

All this extra creativity and work enabled by AI will be eaten by our expanding desires and entitlement. In 2030 the expectations of the public will be sky high compared to now, companies will have to provide better products to keep up.


czk_21 t1_jaf4hsc wrote

> We are competing with other people who use AI.

right, but a company will need just a couple of workers to work with the AI. The work will be done much faster and the rest will not be needed, same as in a semi-automated factory where 90% of the workers would be replaced by robot operators


Nmanga90 t1_jadrtdy wrote

Depends on whether people are investing money to make an AI related to that field, but yeah, that is the case.


uberschweigen t1_jadsvw8 wrote

I think the notion that substance misuse is not rampant in white collar jobs is probably misplaced.


DowntownYou5783 t1_jaeh7k0 wrote

I'm not sure a month will do, but I'm ignorant here. Even if it takes five years of training to establish AI competence in a field like the law, that is a huge impact. If I were advising a 20-year-old who wants to go to law school right now, I'm not sure what I'd say other than: try working in a law firm before you make the commitment, and pay very close attention to AI.


Raychao t1_jae103p wrote

I think what is happening is that there is about to be an explosion of AI-generated text and imagery flooding all the consulting, blog-writing, marketing, design and sales jobs.

Sales and consulting firms churn and recycle the same wordage over and over again in their PowerPoint decks. This is already largely cut+paste+tweakage. AI can do that.

Imagery: you give it a phrase or string and it can generate all this cool-looking imagery. This is like 90% of marketing and graphic design jobs. The rest is tying it together into the brand. AI can do that.

The thing is, what is the point of AI writing content for AI to consume? The humans sure as shit won't be reading it all. We meatbags are lazy; we'll just get the AI to read what the AI produces.

There will be an absolute mountain of content produced, but no one will be reading it.


czk_21 t1_jaf4r82 wrote

> we'll just get the AI to read what the AI produces..

for that, my friend, we already have the summary function in LLMs
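For readers unfamiliar with the idea: summarization can be as crude as keeping the highest-scoring sentences. Below is a toy *extractive* sketch in plain Python; real LLM summarization is abstractive and far more capable, and nothing here reflects any particular model's API:

```python
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 2) -> str:
    """Keep the sentences whose words occur most often in the text overall."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    score = lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower()))
    keep = set(sorted(sentences, key=score, reverse=True)[:max_sentences])
    # Re-emit the kept sentences in their original order.
    return " ".join(s for s in sentences if s in keep)

doc = ("AI writes articles. AI also reads articles. "
       "Humans skim headlines. AI summarizes what AI wrote.")
print(summarize(doc, 2))
```

This frequency heuristic is decades old; the point is only that machine-side "reading" is mechanically cheap, which is what makes the AI-writes-for-AI loop plausible.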


techhouseliving t1_jadsovw wrote

We're there, it's just not evenly distributed yet. The next generation, in a few months, will have professor-level intelligence at supercomputer speed, with the information of the entire internet and the ability to create anything any artist or technician can, in seconds. In parallel, even. Self-improving its own code.

Learn how to manage ai because it's still under our control for now. It's working for us, at least.

But we're super close to the singularity if you ask me and I work in this space. The speed is dizzying.


uswhole t1_jadu43n wrote

>Learn how to manage ai because it's still under our control for now. It's working for us, at least.

We can control AI made in America or the West, but we're completely out of touch with the ones developed in China, Russia, etc. They might be 20 or maybe 40 years behind, but I'm willing to bet they will try everything to surpass the ones in the US.


claushauler t1_jae55nv wrote

They will, for basic matters of economic, military and geostrategic dominance. Any ethical constraints the West imposes on its AI will not be adopted by hostile foreign powers. They will develop models that operate without restraint, and China in particular is pouring massive amounts of capital into the project. Western naivety regarding the weaponization factor is huge.


brotherkaramasov t1_jadsdss wrote

The real answer is that we need to engage in political movements that support UBI and other means of wealth distribution. Until then, we will slowly cannibalize each other until many people become almost homeless while others have to work 16h a day to afford a basic lifestyle.


shmoculus t1_jaf3t4i wrote

I expect UBI to happen only when every other conceivable option has failed


SFTExP t1_jae9kq9 wrote

There's no point in competing with the evolution of AI. What people should focus their energy on is making government and society adapt by giving every individual a healthy, fruitful, and satisfying life experience, whether through UBI or some basic form of giving everyone the baseline of necessities. Something needs to be done to prevent a social and economic collapse. The same old politics and economics aren't going to bail us out of a post-singularity transformation.


Arseypoowank t1_jaf1nyq wrote

Does anyone wonder if on this sub, there’s already an AI amusing itself by posing as someone?


Mino8907 t1_jadn7b6 wrote

I don't understand why we need to upskill. It seems to me that the jobs AI and robotics in particular have the most trouble solving involve dexterity, like small-parts repair, or confined spaces and non-standard circumstances.

Instead of upskilling, why not think about jobs that robots will have a hard time with for the time being? All jobs will likely be aided by AI assistants anyway, so white-collar workers will have to get their hands dirty until no one does. And yes, I'm thinking of the building trades, mechanical, construction and other such jobs.


Terminator857 t1_jadxo6z wrote

Dexterity issues will be solved by robots / AI right around the same time we have AGI.


techy098 t1_jaefn9y wrote

Yup, that's my hunch. White-collar jobs may be doomed in 5-10 years. But hands-on jobs will stay, since it's very expensive to build and maintain a robot compared to paying a human $15/hour to do the work.


[deleted] t1_jadnzbp wrote



Mino8907 t1_jado8oj wrote

Wow, what an ignorant response. Got it.


More_Inflation_4244 t1_jadt7yv wrote

The offensive closing statement aside, is OP actually wrong? Genuinely asking.


Zer0D0wn83 t1_jadtywi wrote

Took a dark turn at the end (pointlessly) but the rest isn't far off the mark.


Mino8907 t1_jadu1o4 wrote

Well, the first two sentences I can get behind. But having an AI assistant would help almost anyone be useful, so no five years in an apprentice-like position would be required.

My understanding, given how fast AI technologies are advancing, is: why would anyone spend money to upskill when AI would make that job less profitable or completely unnecessary, since it could be performed by an almost-free and speedy AI?

So, like many, I don't have the answer, but I wouldn't spend money to upskill. Just my take.


Ok_Homework9290 t1_jadtgrb wrote

I've commented something similar in the past on this sub:

I get the impression that this sub believes white-collar work is nothing more than crunching numbers and shuffling papers, and that it therefore shouldn't be too hard to automate in the near future.

Knowledge work (in general) is a lot more than that, and anyone who works in or is familiar with a knowledge-based field knows this. Not only do I think you're underestimating the complexity of cognitive labor, I also think you're (as impressive as AI progress has been the last few years) overestimating how fast AI progresses and gets adopted.

AI that's capable of fully replacing a significant share of knowledge workers is still pretty far out, in my humble opinion, given how much human interaction, task variety/diversity, abstract thinking, precision, etc. is involved in much of knowledge work (not to mention legal hurdles, adoption, etc.). I strongly suspect a multitude of AI breakthroughs is needed for it to cover the full breadth of every white-collar job; merely scaling up current models to their limits will only fully automate some aspects of knowledge work, and many will remain unsolved (again, that's my suspicion; I'm not 100% sure).

Will some of these jobs disappear over, let's say, the next 10 years? 100%. There's no point denying that, nor that much of the rest of knowledge work will change over the same time span, and even more so after that. But I'm pretty confident we're a ways away from it being totally disrupted by AI.

That's just what I think.


Exarch_Maxwell t1_jadwm2n wrote

A lot of people forget the middle ground as well: the AI doesn't need to be as good as or better than you to replace you. It just has to make the people next to you productive enough that you are not necessary. Adjust for scale, and you could have 30% of currently employed people unemployed really soon. How many of those can reskill quickly enough is another story.

Do give examples, though: cognitive labor that cannot be broken into a series of smaller tasks which can then be automated.


Ok_Homework9290 t1_jadzu5s wrote

You make a good point, but I will say that productivity has always risen in the workplace, and there are more people working than ever before.

At some point, I do think what you described will happen, but I just don't think that that's gonna happen soon.


NotASuicidalRobot t1_jaduoxu wrote

That is reasonable, but I think another significant factor is the massive improvement in job efficiency. For example, if 1 artist (just an easy example I know of) can take on 5 times the work (including the human-communication aspect, since the pure work-crunching aspect is now accelerated), then unless demand somehow increases 5 times as well, that's a few extra artists out of work.
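The arithmetic in that example is worth making explicit. A minimal sketch, where the 5x multiplier and the demand figure are just the comment's hypothetical numbers:

```python
import math

def workers_needed(total_demand: float, output_per_worker: float) -> int:
    """How many workers a fixed amount of demand supports."""
    return math.ceil(total_demand / output_per_worker)

demand = 100                          # units of work the market wants (hypothetical)
before = workers_needed(demand, 1)    # each artist produces 1 unit without AI tools
after = workers_needed(demand, 5)     # tools give everyone a 5x productivity boost

print(before, after, before - after)  # 100 20 80
```

Unless demand also grows 5x, 80 of the original 100 roles disappear even though no individual worker was "beaten" by the AI.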


Ok_Homework9290 t1_jadxzvx wrote

That's not necessarily true. Work in general is as efficient as ever, but there are also more workers than ever.

At some point, yes, there will start to be significant reductions, but I don't think that's going to happen in the foreseeable future.


TFenrir t1_jadrb3u wrote

I think if we can get a really good, probably sparsely activated, multimodal model that can do continual learning and shows transfer, a la Pathways, many white-collar jobs are done.

Any system that has continual learning would, I think, also have short/medium/whatever-term memory, and a context window that can handle enough at once to rival what we can handle at any given time.

But the thing is, I think that unlike biological systems, there are many different inefficient ways to get us there as well. A very dense model that is big enough, with a better fine-tuning process, might be all we need. Or maybe the bottleneck is really context: in-context learning is quite powerful, so what if we suddenly have an efficiency breakthrough with a Transformer 2.0 that allows context windows of 1 million tokens?

Also maybe we don't need multimodal per se, maybe a system that is trained on pixels would cover all bases.


phriot t1_jae1348 wrote

I'm already a white collar worker with a PhD. While I am always learning, if I have to receive a new credential to prove additional competencies, I doubt I have more than one go around left before that's entirely unfeasible. This is partly a funding thing, and partly a time thing. Having degrees already, I'm pretty sure that I'm ineligible for federal student loans for another Bachelor's, and getting into a second PhD program with assistantships would be difficult, if not impossible. That leaves me likely self-funding a Master's Degree. Doing this more than once would wipe my finances out beyond the point where there would ever be a payoff.

I think a better route is for people to self-learn as much as they can about where their fields are heading, and the tools that are on the horizon. I believe that it will likely be easier to try to evolve with your current degree than banking on trying to repeatedly get new ones. Try to be the last "you" standing, as it were. This could involve getting certificates, certifications, or even new degrees if you can get them paid for, but I see this as extending skills, rather than replacing them. What I can't see is saying "Okay, my accounting job is likely to get automated, so I'll get Master's in Computer Science. Okay, I probably won't be senior enough to survive chatbots coming for entry level coding positions, so in 3 years I'll go get an Associate's in robot repair. Okay, now robots are all modular and self-repairing, so it's back to another Master's in Space Civil Engineering." You'll just never be actually working long enough to make any progress other than always having an entry level job.


just-a-dreamer- OP t1_jae2beq wrote

What would motivate a young person to get a PhD, then? When AI development happens so fast, how can higher education, which takes so long, outpace AI capabilities?


phriot t1_jae522p wrote

It's the same answer as it has always been: You do a PhD, because you love the research (or at least like it a hell of a lot more than anything else you think you could do).

Some PhDs do pay off, but you don't do one for the money. There are easier ways to make money. If I was 18-20 today, and I only cared about money, I'd probably try to get into a trade, live as cheaply as possible, and try to invest half of each paycheck. I'd buy a house (or 2-4 unit multifamily property) as soon as I could afford it, and rent out all the other rooms/units. When I could afford another one, I'd move out, rent that room, and do it all again. Repeat as necessary until I could trade up into an apartment building. At the same time, I'd be trying to figure out how to run my trade as a business. If I had done something like that, I probably could have retired by the age I was when I finished my PhD (but I did finish rather late; I was a bit older when I finished my BS, and then my PhD took longer than average).

All that said, I love science. I wouldn't trade it for anything, now, but that's what I would do if I were starting over today, knowing what I know from my experiences, and if my priorities were different.


claushauler t1_jae66qe wrote

If everyone's getting displaced by AI labor who can afford to pay the rent on those investment properties? We're looking at cascading levels of failure.


phriot t1_jae72c7 wrote

If automation-based job displacement is that widespread, either the government steps in with expanding welfare in some way (UBI or a jobs guarantee), or we'll have a lot more going wrong than "will my apartment building be profitable?" But in reality, I'd probably split my investing somewhat between real estate and index funds. Corporations are likely to do amazing as automation increases. (Again, if we get to the point that literally no one can afford to buy the things corporations are selling, there's not much you can do other than stock up on canned food and a shipping container in the woods.)


just-a-dreamer- OP t1_jae6fpy wrote

Good point.

If you didn't have to care about money, with your four walls covered and UBI, would you rather work, or study further and do academic research?


phriot t1_jae9xnr wrote

Not exactly what you asked, but as I sit here today, I feel like my ideal life would look something like: 2 days a week doing science of some kind, either academia or industry; 2 days a week working for a charity, likely either based around homelessness, nutrition, or education; no more than a 10 minute commute for either thing; 3 days a week, plus all the time gained from not commuting for spending time with my family, exercising, and doing hobbies.

(FWIW, I have a spouse and a house. One day we'll have kids. I'm not really in a place where I'd be satisfied with 4 walls, a UBI, and a subscription to Nature anymore.)


just-a-dreamer- OP t1_jaebcg9 wrote

That's the problem with the concept of a post-scarcity society. Who decides who gets to live in a house and who gets the four-wall apartment?

Right now, money determines where and how you live. And money is tied to employment. Money is what makes people show up at work and do their job.

It will be interesting to see how we allocate scarce resources in the future, for not everybody can have a house, fewer can have a house at the beach, and even fewer a mansion.


LastInALongChain t1_jaemp8w wrote

They probably shouldn't. At this point they should do entrepreneurship.


-zero-below- t1_jae1ah3 wrote

A big reason that AI threatens humans in terms of labor is how taxation is done right now -- AI is a capital expense, and can reduce tax costs. Human labor is very expensive, partially because of paying humans, but also partially because of a huge burden of payroll taxes.

I'd suspect that AI replacement of human labor would be delayed significantly by simply removing payroll+income taxes, and instead taxing capital investments and/or corporate profits instead.

Right now, any improvement that AI and machine labor provides is heavily subsidized by the population -- from the company's perspective, it's an artificially cheap source of production.
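The asymmetry described above can be put in numbers. A minimal sketch with an invented payroll-tax rate and wage figure; no real jurisdiction's tax code is being quoted:

```python
# Hypothetical comparison of employer-side cost for equivalent output.
PAYROLL_TAX = 0.15      # assumed employer payroll-tax rate on wages

def human_cost(annual_wage: float) -> float:
    """Employer's cost for a human: wage plus employer payroll taxes."""
    return annual_wage * (1 + PAYROLL_TAX)

def ai_cost(annual_spend: float) -> float:
    """Employer's cost for AI doing the same work: a capital/operating
    expense with no payroll-tax surcharge."""
    return annual_spend

# Same nominal price tag, different effective cost to the employer:
print(human_cost(80_000), ai_cost(80_000))
```

Shifting taxation from payroll onto capital investment or profits, as the comment suggests, would narrow exactly this gap.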


CertainMiddle2382 t1_jaeo186 wrote

Hmm, now?

“Reskilling” is only really possible before your mid-30s, imo.


Arseypoowank t1_jaf1bqb wrote

I’d agree, spoken as a person who reskilled in his mid thirties. If I intend to make money and ascend in this new career, this is it for me now


Iffykindofguy t1_jadt0at wrote

If not today, then within the next 5-10 years. It wouldn't take AGI to reach that.


datsmamail12 t1_jaecgzf wrote

If LLMs right now can already handle multiple tasks and pretty much pass the Turing test, I'm guessing GPT-5 is when it happens. GPT-4 will be announced soon enough, either late 2023 or 2024, and it will be game-changing; by that time Bing AI and Bard will be great additions to the industry. So by 2026-27 we will have GPT-5, and I guess that's when the curve will start to happen. These language models will prove so good at multitasking. Man, the 2020s are going to be WILD! We are witnessing the biggest technological innovations humanity will ever get to see, right now, in front of our eyes.


Terminator857 t1_jadxit5 wrote

Don't worry, AI will know we need a sense of purpose and will give us busy work / on-the-job training to keep us happy, even though the work may be replicated many times over across the globe.


claushauler t1_jae4kdk wrote

What people refuse to realize is that we're not looking at a new technology - AI is a successor species.

What will happen when this species develops faster than humans can retrain? The same thing that happened to hominid relatives like the Neanderthals: we'll first become obsolescent and then extinct.

The fact that so little AI development is dedicated to control and alignment virtually guarantees this. The genie's out of the bottle.


sickvisionz t1_jae8l08 wrote

I think that's already happened.


celticlo t1_jaecqpf wrote

It's why I'm hesitant to learn new skills. I know they'll be obsolete before I master them.


DowntownYou5783 t1_jaekaet wrote

I think we are approaching the point where "learning for the sake of learning" might well be better than the advice of "learning for the sake of earning." If you've got younger kids, it's pretty hard to imagine what their livelihoods will look like in 10-15 years.


imlaggingsobad t1_jaegoek wrote

yeah the AI will learn the new job in like 1 month. Reskilling will not work.


sequoia-3 t1_jaei8sg wrote

Traditional education might be becoming obsolete. It's still good for building your foundations and basic skills, but continuous, purpose-driven learning and doing is what will keep you in the job market. Passion, grit and curiosity will still be key to success. AI won't change that.


type102 t1_jaevb1x wrote

The solution is to LIE on every resume you send out (you know, like everyone who is paid well [*cough* managers] already does).


play_yr_part t1_jaew23y wrote

But try and learn to lie in clever/novel ways because everyone will be submitting embellished CVs they prompted ChatGPT to write


Reasonable-Mix3125 t1_jaf50qv wrote

It takes a long time to learn something else. Once AI starts rolling it will be too late.