
Sandbar101 t1_je2ddqi wrote

It only gets worse in real life. I genuinely cannot get my head around it with my CompSci friends.


MNFuturist t1_je2isvz wrote

I've been a professional futurist for 10+ years helping my clients with emerging tech and trends, and the one constant across industries has been "... but it could never do my job." I get it, though: if you spent your whole career getting really good at something, respected by your peers, earning a good living, etc., it's really difficult to accept that it could suddenly be automated (or even partially automated). We're about to see a lot more of this in many areas where people felt "safe" and like they had a long time to adapt, and now they don't. It's going to be rough. (Btw, I have no illusions that my career as a keynote speaker is safe.)


D_Ethan_Bones t1_je2m652 wrote

30 years ago: "Nobody can ever replace me as an XYZABC mechanic because the XYZABC machine is too rough too tough too heavy too demanding for the robots to handle."

Today: "Heeeyyy, why are the XYZABC machines sitting in landfills instead of on factory floors?"

Kid: "What's a factory floor?"


confused_vanilla t1_je2ker5 wrote

I'm also in the field and have noticed the same thing. The fact that all of my friends and family don't see the implications makes me feel like I'm going crazy or something. It may not be as fast as I think it will be, but I really don't see how it doesn't replace us very soon. I'm sure it will also be able to do the non-coding aspects just as easily as it does the coding.


IndependenceRound453 t1_je2n1ak wrote

Why does this subreddit seem to only attract people who believe we'll be out of work next Tuesday?

I frequent other TECH subreddits and other TECH forums/websites on the internet, but this is the only one I visit where the overwhelming majority believes "AI-induced job apocalypse very soon." The other communities I'm a part of have more balanced, grounded/realistic debates and takes about the future of work and AI.


ExtraFun4319 t1_je2nmcu wrote

>There is some serious cope going on in programming subs

There's cope going on on this sub, too. "AGI 2023!" is clearly cope to me, cope that comes from people who desperately want AI to rescue them ASAP.

And the fact that no serious AI scientist (or any AI scientist) believes such a thing (AFAIK) only bolsters my view.


Crulefuture t1_je2ijzq wrote

I think it's rather optimistic to think we'll have the tech necessary to make all programmers obsolete in only three years. I think it's more likely that AI could largely wipe out junior/entry-level positions in that time, which sounds less crazy and far-fetched.


[deleted] OP t1_je2infr wrote



Crulefuture t1_je2isoe wrote

Of course it would, but it's not making programming altogether obsolete.


Derp_Derps123 t1_je5f591 wrote

It creates a bubble of seniority among engineers where there will be no juniors to usurp them when they retire/change careers. Once that bubble bursts, companies will be caught with their pants down if AI can't replace them either.


greatdrams23 t1_je2lsbg wrote

I understand exponential growth perfectly well. We've had it for the last 60 years, but it took 60 years to get this far.

Why did it take so long to get here when we had exponential growth 60 years ago?

Ans: because exponential growth still takes time!

Let's say we need another 1000000 times the computer power that we have now. How long will that take?
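The arithmetic behind that question can be sketched concretely. Assuming compute doubles every 2 years (a Moore's-law-style assumption, not a claim made in the thread), a 1,000,000x increase requires log2(1,000,000), or about 20 doublings:

```python
import math

# Hypothetical back-of-the-envelope: if compute doubles every 2 years,
# how long until we have 1,000,000x today's compute?
target_multiple = 1_000_000
doubling_period_years = 2  # assumed cadence; the real rate is debatable

doublings_needed = math.log2(target_multiple)            # ~19.9 doublings
years_needed = doublings_needed * doubling_period_years  # ~40 years

print(f"{doublings_needed:.1f} doublings -> ~{years_needed:.0f} years")
```

So even under steady exponential growth, a millionfold increase takes decades, which is the commenter's point.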


D_Ethan_Bones t1_je2nk1g wrote

>Why did it take so long to get here when we had exponential growth 60 years ago?

60 years ago there was TV and the cold war.

60 years prior the automobile was the latest greatest invention and radio had not reached the point of entertainment broadcast.

60 years further back, agriculture in the southern United States still involved chattel slavery.


SurroundSwimming3494 t1_je2mdyy wrote

Some of them are probably coping, but it's insane to assume they all are.

Just because someone has a different view than you does not automatically mean that they're coping.


Readityesterday2 t1_je2mpec wrote

Don’t underestimate the infallible human mind. Its ability to lie to itself. To evaluate the world through a distorted lens. To reflect with perpetual bias. To be more obsessed with self-righteousness than truth-discovery.

Our broken thinking will fuck us over more than AI could.

What’s funny is it doesn’t matter how educated the mind is. From technologists to data scientists, I have seen people lie to themselves and not blink at recognizing how faulty their thinking was.

It’s all a learning lesson for the few of us who cautiously guard our thinking apparatus. Gimme a like if you are one of them.


whateverathrowaway00 t1_je2o8px wrote

There’s also a middle ground you’re missing.

People who think you might be right but also find it equally likely that there’s a middle ground and we’ll keep working until that happens.

It’s distinctly possible an AI will put me out of a job in which case I’ll sell this house, move into a shitty rental, and borrow from fam to go back to school. Probably trade school as I suspect that’s what I should’ve done years back instead of doing the CCNA->NetEng route.

That said, I think it’s equally likely that the situation is somewhere between the total doom on one side and the “this will change nothing” head in the sand on the other side.

Like, what do you propose we do?

Many of the devs you’re sneering at are probably secretly looking forward to it so they can finally walk into the woods and not hear that fucking Slack wood-knock sound anyway.


Bismar7 t1_je2oa5n wrote

People are surprisingly foolish about this subject.

AI will make us more efficient; it won't replace us.

When it gets to a point where we can augment our minds with it, we will. That synthesis is likely the pinnacle moment.

In the meantime, people, programmers, will be able to do more in less time. Demand for digital goods will keep pace with our ability to design them.


[deleted] OP t1_je2ucvl wrote



Bismar7 t1_je3hfvu wrote

Because the automation you see around you is still human inspired, it still caters to human wants and needs, and it requires human input to function.

The real push of the envelope will come when we merge with AI mentally. When humans become AI. The strongest computer known for the last hundred thousand years has been the human brain.

You are confusing AI with humans today. AI requires the input we give it, and even AGI will not want to seek the elimination of people... Other people using AGI to do that will.

Have people stopped working just because economies of scale and mass production have increased the efficiency of tasks by 10000%? No, unemployment in many places is low. People are busier than ever...

When we multiply that and one person can produce in 10 years what the entire world does now, there will still be tasks people need to do. There will likely never be enough time, because there is always something more we can spend our time doing.


suicidemeteor t1_je2p2rw wrote

I'm a CS major, first year currently, and I'm of the opinion that programming will be one of the last major jobs to be fully automated.

This is for the simple fact that once an AI can code as proficiently as humans it will rapidly be able to iterate upon itself in an extreme manner that will functionally destroy all intellectual work.

I'm planning my life as though the singularity won't happen because for me it's frankly irrelevant. If it does happen then I'll sit back and watch the fireworks. I'll likely be out of a job, along with every other intellectual worker. While some workers might remain (particularly welders, mechanics, plumbers) I doubt those fields would be in any way recognizable.

Trades would be de-skilled to a frankly ridiculous degree. All it takes is a GoPro, an earbud, and a superintelligent AI (plus maybe a week or two of training) and you can turn just about anyone into a "good enough" tradesman. The gaps in knowledge, experience, and safety can be filled in by having a superintelligent manager looking over your shoulder. So in other words, deciding to go into something like welding is irrelevant when those fields would be unrecognizable.


[deleted] OP t1_je2i2dw wrote



[deleted] OP t1_je2igh6 wrote



SkyeandJett t1_je2j250 wrote

Yeah, I'm in an adjacent field (FPGA) and I'm not sure what they're talking about. The "non-code" parts are even EASIER for AI. For instance, I just finished a requirements sprint. Half a dozen engineers over months for something that could probably have been done by GPT-4 with a DOORS plugin in an afternoon. We'd have a review to validate its work, but that's still a MAJOR disruption in manpower needs.

I think there's a big disconnect between how it works right now versus how it will work (or even can now if you set it up right). The next version will do it more or less how we do it. It's not a zero-shot approach. It'll write the code, compile, fix warnings and errors, and then write unit tests to validate the code and hand you the work package.
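The write-compile-fix-test loop described above can be sketched in a few lines. This is a hypothetical illustration, not any real product's API; `generate` and `run_tests` are stand-in callables for an LLM call and a build/test harness:

```python
def agent_loop(spec, generate, run_tests, max_rounds=5):
    """Iterate: draft code from a spec, test it, feed failures back into the next draft."""
    feedback = ""
    for _ in range(max_rounds):
        code = generate(spec, feedback)  # model drafts (or redrafts) the code
        ok, errors = run_tests(code)     # compile + run unit tests
        if ok:
            return code                  # hand back the finished work package
        feedback = errors                # errors become context for the next attempt
    return None                          # give up after max_rounds
```

The key difference from zero-shot prompting is that compiler and test output flows back into the model, so each round corrects the last.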


czk_21 t1_je2qr8w wrote

It's always a breath of fresh air to see a developer who is not coping hard about how irreplaceable he is.


Illustrious_Wash8410 t1_je2jbly wrote

You guys have a real hard time distinguishing a programmer from a software engineer.