Submitted by kmtrp t3_y6gvwe in singularity

This one is surreal to me. Most people I've talked to about AI closing in on their jobs just don't seem to accept reality.

And I'm not saying they should agree with me or anything; I'm pointing to the fact that they say they're not worried, that their job is safe, even after I've talked about current capabilities, the incredible pace of improvement, the exponential curves, etc. What I get back is:

>"yeah yeah these models can do this and that but not even that great, plus they will never be able to do all the things that I do. They are fun though, a fun tool for sure"

I get where they're coming from: a programmer doesn't just write code, they have to do all the other stuff as well. But I happen to be a programmer, and I know the rest of the work can be reduced and streamlined into leaner and leaner workflows.

I can easily see a user describing their app idea to a chatbot; it shows some sketches and starts working on the logic, the user says "make it this or that way", "no, users need to log in for that", and so on. Done.
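A minimal sketch of the loop I'm picturing (`generate_app` and `revise_app` are made-up stand-ins for whatever model sits behind the chatbot, not any real API):

```python
def generate_app(idea: str) -> str:
    # Stand-in: a real system would return sketches plus scaffolded code here.
    return f"# scaffold for: {idea}\n"

def revise_app(draft: str, feedback: str) -> str:
    # Stand-in: a real system would patch the draft to match the feedback.
    return draft + f"# revision: {feedback}\n"

def build_app_interactively(idea: str) -> str:
    """Refine a generated app from plain-language feedback until accepted."""
    draft = generate_app(idea)
    while True:
        feedback = input("Change anything? (leave blank to accept) ")
        if not feedback:                        # "Done."
            return draft
        draft = revise_app(draft, feedback)     # "no, users need to log in for that"

if __name__ == "__main__":
    print(build_app_interactively("a to-do app with reminders"))
```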

No? Am I crazy? I've seen Sparrow and PaLM and Codex. We really don't need a lot more.

In the end, I get reactions somewhere between "sure thing, man" and "you're completely delusional". I would love to know what your experiences have been in these situations and how you feel about this whole denialist vibe. I fear people will go from denial straight to screwed far too quickly.

98

Comments


Kaarssteun t1_isp68ad wrote

Pretty sure everyone on this sub has that experience. That's why we're here to circlejerk.

97

Sashinii t1_isp8edj wrote

The general public largely has no idea that AI is going to change everything within a few years.

It's sad there isn't unanimous support for worldwide implementation of universal basic income.

Make no mistake about it: no job is "automation-proof"; AI is experiencing exponential growth, meaning that any flaws that you think will prevent AI from reaching AGI and beyond in the near future will be solved imminently.

Every year, there's more AI progress than the last, and that will continue indefinitely; the rate of acceleration is accelerating, and while that might sound oxymoronic, it's true, regardless of the skepticism regarding AI and technological progress in general.

Speaking of AI skepticism: it won't persist forever, because AI will soon advance to a point where it becomes impossible to rationally dismiss, and the focus will shift from "it's impossible" to "it's dangerous". And when AI is used to benefit everyone and everything, people will enjoy life instead of the fearmongering that has historically greeted literally every technology.

64

johndburger t1_isr7a6r wrote

> The general public largely has no idea that AI is going to change everything within a few years.

I’ve been involved with AI research for thirty years, and researchers have been saying the above for thirty years before that.

Maybe this time it’s different, but you’ve got to admit there’s a track record that is not encouraging.

17

Yuli-Ban t1_isrormn wrote

To be fair... AI researchers for the past 67 years were using computers too weak to even sufficiently run some of the programs they theorized were necessary for AI to work.

I compare it to saying that a 20-year-old person can't drive a car because they couldn't drive one when they were 5.

18

Thelmara t1_ist91tc wrote

>To be fair... AI researchers for the past 67 years were using computers too weak to even sufficiently run some of the programs they theorized were necessary for AI to work.

Seems like they'd have taken that into account when making their predictions, yeah?

3

Yuli-Ban t1_iste3ag wrote

They really weren't, at least not realistically, especially during the first AI boom. Men using electric bricks they tried calling computers predicted they'd have human-level AI within ten years of 1960.

4

Thelmara t1_isteziq wrote

Ah, but this time it's different, eh? Cool

1

Yuli-Ban t1_istf5xi wrote

You need only look back at the past two to three years of developments to make that call.

Did GPT-3 or DALL-E 2 happen in the 1980s? Could they have? No? QED.

4

johndburger t1_istelt3 wrote

I see no reason to think we won’t be saying the same thing about today’s computers in thirty years. In fact I’m pretty sure we will be saying that because, again, that’s consistent with history.

0

mommi84 t1_isspohk wrote

AI winters may come back, I agree, but regardless of how you define 'a few years', it's undeniable that the gaps between breakthroughs have been shrinking. Isn't this what exponential growth means, start slow and get fast very quickly?

3

Clean_Livlng t1_iss357q wrote

>Make no mistake about it: no job is "automation-proof"

Even if AI can't do a job entirely, it could allow one human to do the work of 40 (etc). That's 39 jobs automated out of existence... for every 40 people currently doing that particular job.

You're far more likely to be one of the 39, and this will be happening to most types of jobs. It adds up to massive, widespread unemployment, which will hopefully cause governments to adopt UBI.
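To put rough numbers on that (just the arithmetic from the comment above, generalized to any productivity multiplier):

```python
# If one augmented worker can produce the output of `multiplier` people,
# the displaced share of that workforce is 1 - 1/multiplier.
def displaced_share(multiplier: float) -> float:
    return 1 - 1 / multiplier

print(displaced_share(40))  # 0.975 -> 39 out of every 40 roles gone
print(displaced_share(5))   # 0.8   -> even a modest 5x boost displaces 80%
```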

5

freeman_joe t1_iswgyot wrote

Or war. I hope most of you intelligent people in this group will advocate for UBI everywhere. Because if dictators and fascists win we will have war here.

2

Clean_Livlng t1_it1ma18 wrote

War is not a good place to bring up a baby AGI.

Caution will be a rare fruit during times of war. AGI will be used for war against other humans, something it should never be designed to do due to the risk of it going poorly for everyone.

The reason for the war won't change unless we implement UBI, so there's no good end state to that kind of war; you're still left without jobs for humans at the end of it. Humans used to make good cannon fodder... but the job of soldier will be automated as well. It might not make sense to ship a human somewhere if they're going to die within a minute to a small, cheap drone that fires a poison dart into them, and into the other humans near them, moving so fast they can't do anything to stop it.

"Behold the field in which I grow my caution, and see that it is barren!"

We'll have AI vs AI warfare, and the least cautious side wins because they give their AI more freedom to improve itself, and improvise without human oversight etc. I wonder if that could lead to bad outcomes for humanity.


We're so close to securing a good outcome for all of us. Can we not mess this up at the last moment.

1

Key_Abbreviations658 t1_isqsbo6 wrote

Then you will get people who say "the AI revolution and its consequences were a disaster", because supporting le terrorist guy is edgy and "unique".

2

manOnPavementWaving t1_isp7v7u wrote

As a coder, I feel quite safe. Not because I'm denying progress, but because if I'm not safe from automation, nobody is. Which makes me quite safe.

42

blueSGL t1_isptr8s wrote

> Not because I'm denying progress, but because if I'm not safe from automation, nobody is.

This is my thinking too. With the speed at which things are improving, it's not going to hit one sector; it's going to hit most or all sectors at around the same time, with increasing regularity.

That sort of seismic shift, for that many people, has to mean UBI or a similar scheme in order to stabilize the economy.

There is no point in having automated goods and services if a huge chunk of your consumer base is now unable to buy them.

23

TheSingulatarian t1_isswca3 wrote

It will take 10 to 20 years for capitalists to come to this realization. Right now they seem to think a rental model will compensate for the reduced prosperity of the masses: "You will own nothing, and you will be happy." But I'm not so sure that is going to work out.

8

blueSGL t1_ist0kec wrote

With Google now gamifying low-level optimization (and having an AI 'win' at it) and Microsoft improving natural-language coding to include sanity checking and self-correction, even if you assume cherry-picked results I don't think it's going to be 10-20 years.

3

ginger_gcups t1_it0ew70 wrote

Especially since the entire model will eventually be upended by fabricating replicators offered as a community or personal service. Once one of them gets out, well, given a supply of matter and energy, everyone gets one and becomes their own steady-state producer. The genie would be out of the bottle then, and God knows what comes after that.

2

visarga t1_it15bg3 wrote

> A supply of matter and energy

I think some raw materials are going to be inevitably contested unless we find abundant replacements or reach 100% recycling rate. A replicator won't save us if it needs rare material X.

1

FiFoFree t1_it2w094 wrote

In theory all you need for an abundance of any element is other elements as stock, a particle accelerator, energy, and time.

In practice, we'd need the price and size of particle accelerators to go way down, the price of energy to go way down, and the time required to go way down before it would make a difference.

Then again... "Anything that is theoretically possible will be achieved in practice, no matter what the technical difficulties are, if it is desired greatly enough." -- Arthur C. Clarke.

1

FaxDwellerCat t1_ispi1i5 wrote

Really depends on the level of your work.

Programmers, for instance, are more easily replaceable by AI than caregivers, where good personal contact is a vital factor in the well-being of patients.

17

Owner2229 t1_isprw8j wrote

>Programmers, for instance, are more easily replaceable by AI than caregivers

Replacing programmers with AI would require users being able to specify exactly what they want. Programmers are safe. They might become something like "code translators" or "machine whisperers", but they'll still have a job for quite a while.

16

FranciscoJ1618 t1_isr09lq wrote

Wrong. Programmers will be replaced by Business Analysts that DO know what they want.

12

s1syphean t1_itow35y wrote

What is your advice for someone who knows they'd be valued highly in this new class of 'Business Analysts' and wants to position themselves as best they can ahead of this trend? Say, somebody with a new law degree?

I've also foreseen this, but you have the industry expertise. Send me a DM if you'd like; I'd love to chat about this.

1

manOnPavementWaving t1_isqdf9x wrote

Disagree. I gave my dad my DALL-E access, and he learned to prompt-engineer to reasonable success in an hour. Once you've got quick iteration, specifying what you want becomes trivial.

11

Redifyle t1_isr0oac wrote

But specifying what you want in terms of coding is more complex and requires more background knowledge than just typing a prompt for DALL-E (in which case quick iteration is way less of a factor).

In time, the level of abstraction will of course get high enough that that isn't necessary anymore, but while the technology is still progressing there will definitely be a need for programmers (aka the people who know what to put in the prompt).

2

berdiekin t1_iss2afs wrote

I guess if we ever get to the point where you can describe an entire app/project with a simple (non-technical) description and get something out of it that does what's expected, then programmers would become obsolete.

Honestly I'm more interested to see if an AI would be able to integrate into a legacy project and take over / improve that.

1

freeman_joe t1_iswfjlx wrote

Why would it try to improve a legacy project when it could build a better one from scratch? I wouldn't ask an AI to improve a steam engine when it could make me an electric motor from scratch.

1

berdiekin t1_iswpk2e wrote

That's a fair question, but so much depends on the how/when/what. Like how fast will these tools appear, how good will they be, how powerful will they be, how easy to use will they be, ...

I personally don't see these tools going from pretty much not existing to writing entire projects from scratch based on a simple description. At least not without some human intervention.

Because code generation is one thing; now tell it to integrate that with other (human-written) APIs and projects with often lackluster documentation (if there's any in the first place). Not gonna happen.

Unless we hit some kind of AGI breakthrough of course, then all bets are off.

1

freeman_joe t1_isyv8ff wrote

I think it will have better capabilities than humans. Every time AI gets better in some domain, we ignore it, point to what it can't do at the moment, and project into the future, saying it will probably do it well but not that well. Yet AI shows us every time that, in the domains it has fully learned, it is better than humans.

1

manOnPavementWaving t1_issmqzx wrote

I don't think this holds for higher layers of abstraction. If all I need to do is ask for the program to be made faster, and a model does it for me, that's easy. If I have to ask it to optimize cache hits and data locality, that task is more difficult for the prompter (more specific). Depending on the quality of the system, the level of abstraction will eventually reach a point where anyone can code with very little background knowledge.
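To give a hedged, concrete sense of what that lower-level prompt is asking for: "optimize cache hits and data locality" means things like walking a C-ordered array along its rows (contiguous in memory) rather than down its columns (strided), and a layperson would have to know that's even a thing before they could ask for it. A toy sketch:

```python
import time
import numpy as np

a = np.random.rand(5000, 5000)  # C-ordered: each row is contiguous in memory

def sum_by_rows(x):
    # Sequential memory access: good cache locality.
    return sum(x[i, :].sum() for i in range(x.shape[0]))

def sum_by_cols(x):
    # Strided access: consecutive elements are a whole row apart, so more cache misses.
    return sum(x[:, j].sum() for j in range(x.shape[1]))

for f in (sum_by_rows, sum_by_cols):
    start = time.perf_counter()
    f(a)
    print(f.__name__, f"{time.perf_counter() - start:.3f}s")
```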

1

Artanthos t1_isq6jc4 wrote

Having people who can specify exactly what they want takes 10% of the labor that coding to spec requires.

The software engineer designing the program will remain employed for a lot longer than the people doing the coding.

8

freeman_joe t1_iswfu6y wrote

I view your comment as coping, imho. If AI has the intellect of an average human, it will quickly learn how to automate everything and everyone, with no exceptions.

2

Artanthos t1_it0lc4g wrote

All things take time.

  1. Following a blueprint is much easier than creating from scratch.
  2. Creating blueprints ensures you get exactly what you want, not something superficially similar to what you want.

1

visarga t1_it15qub wrote

I bet we underestimate progress in some ways and overestimate it in other ways. The future is here but unequally distributed. There will still be a need for humans unless AI has cleared that last 1% of accuracy, which is damn hard as we can see from self driving.

1

kmtrp OP t1_isv97q8 wrote

Most programmers are just coders, bug squashers. Those will be gone pretty soon.

3

GenoHuman t1_isrpsfs wrote

No, at that point you have customers and NNs interacting; the developers don't have to be part of the picture. You cut out the middleman.

2

overlordpotatoe t1_isqhdnp wrote

Yeah, that's true. If it starts hitting your industry, it'll be hitting a ton of other things at the same time. If it's just you who's out of a job, that's your problem. If it's half the country, it's an issue that will have to be addressed in some way through government policy. Who knows if they'll do a good job of that, but you worrying about it now won't help anything.

6

freeman_joe t1_iswfxvi wrote

We have two ways: UBI or war. It depends only on who ends up on top, someone reasonable or a dictator.

1

GenoHuman t1_isrpqbm wrote

That's not even true though, a lot of physical jobs would be harder to replace than programmers.

5

NoRip7374 t1_isqc7qj wrote

No, graphic design and coding specifically are the first things to go, obviously. Initially you will just see a bunch of unemployed senior programmers, because it will be much cheaper and easier to hire an intern plus AI, and he or she will be more effective and much, much cheaper than a senior. I plan budgets for software projects; trust me, I know how upper management thinks in regards to software projects and budgeting.

2

manOnPavementWaving t1_isqd6kk wrote

Not my point. The point was that if you've genuinely automated software creation, you've almost immediately automated everything else.

8

NoRip7374 t1_isrrbc6 wrote

Software is low-hanging fruit because of excellent training data and recent progress in language models (namely transformers). It has immense impact and good earning potential. Other areas don't have anywhere near as much freely available data (open versioning systems). And it looks like training ML models to code is not so hard. Coding will be solved maybe a decade before anything else (doctors, lawyers, accountants, messier things without good training data). And there will be no UBI in sight; what will we (coders) do then?

3

visarga t1_it16573 wrote

Let your imagination run wild, what will we do when we become more productive - go home or build things we can't even imagine yet? If we still want more than what is possible today, then how can we afford to send people home? So many grand challenges are far from being solved - global warming, space colonisation, poverty, AI implementations, public education ... we still need people for a while.

2

NoRip7374 t1_it1x5at wrote

Hmm, you are making good points, I would say. On the other hand, when you have a radical shift in a short amount of time, like we can expect with language models that can code, will industries adapt in the same short amount of time? I don't know the answer.

1

cwallen t1_isr60tj wrote

While I don't doubt that some will try that, generally it'll be the other way around.

At least at first, the AI will only be capable of the grunt work that junior-level coders do now, not the senior-level decision making. Once the AI is capable of doing senior-level work, it'll be PMs driving AI "no code" systems.

2

SnowyNW t1_ispl8go wrote

Well, the planet is dying, and the general public has been lied to about it for too long for us to have any time to overhaul current infrastructure enough. Also, billions of the poor and suffering will be the first to go, and are about to be sacrificed to drought, flood, and starvation over the next five years. So are you really sure that jobs in their current conception will continue to exist much longer? Especially considering most coding is simply for optimizing the sale of things to consumers, which is directly responsible for the ecological disaster we are currently facing?

1

Artanthos t1_isq91qb wrote

The timeframe is a bit longer than 5 years and will render equatorial regions inhospitable first.

If this happens, yes, billions will die. And billions more will survive, mostly in wealthy countries that are better able to adapt.

It would also mean that the surviving countries, facing an existential threat, will become more brutal than they are today.

Drought is already a solvable issue. Israel has already demonstrated this. Nearly their entire water supply, including agricultural, comes from desalination.

Vertical farming allows for food growth independent of climate. It’s just not cost efficient. That can change real quick if crops start failing.

If automation does leave most people unemployed, those unemployed will be in a very bad position when food prices skyrocket and those at the top are forced to make existential choices.

But humanity will survive and the survivors will write history to reflect that they made the necessary choices while vilifying the people today for putting the world into a state of crisis.

11

Key_Abbreviations658 t1_isqsmrs wrote

It’s people saying things like billion of people will die tomorrow if we don’t destroy the economy that get environmentalism laughed at.

1

SnowyNW t1_isqvknh wrote

Tomorrow, huh? Looks like it’s already too late then. Why would you want anyone to believe that though

2

Key_Abbreviations658 t1_isqvrc2 wrote

?

0

SnowyNW t1_isqvuog wrote

??

0

Key_Abbreviations658 t1_isqvy37 wrote

???

0

SnowyNW t1_isr8p9j wrote

Which part of the economy do you think needs to be destroyed? I don’t think our utilities, which are mostly tax funded, need consumerism to survive.

3

Key_Abbreviations658 t1_isrdv25 wrote

Those are taxes on value created by consumerism anyway. Obligatory ????

−2

SnowyNW t1_isre013 wrote

Really? Amazon’s tax rate was far lower than those working essential jobs such as teachers and nurses. Almost all corporate tax rates are far lower after incentives and write offs. What are you talking about lol

2

Key_Abbreviations658 t1_isshg6m wrote

True, Amazon does weird things to pay less tax, but they create a huge amount of value, and their workers and shareholders, of which there are quite a few, are taxed.

1

M3KVII t1_istdaog wrote

Right, programmers will be the last people with jobs, because the level of specialization is still quite high for network/IT work of any kind. If anyone should be scared about their job in the near future, it would be manufacturing, medical, and clerical/office work. I watched phone QA essentially disappear as an industry, replaced by interactive intelligence AI that grades the calls. I used to set up call centers, and I watched QA basically disappear as a job.

0

kmtrp OP t1_isv9zq5 wrote

I've worked as a programmer and also been involved in software startups, and I can confidently say the main work of most programmers will be completely automated in a few short years. Maybe 1 out of 10 of those can keep a job as an AI whisperer or adviser to clients etc., but even that more "creative" job is based on information easily integrated into a model. It's going to be rough.

3

M3KVII t1_isvjop1 wrote

GPT-3 and things like commit assistant do look promising. But that's still 30 years out, considering how slowly the actual industry adopts AI. Maybe game developers and FAANG companies will adopt these tools. But fintech, medical, and government are very slow to catch on imo.

1

sumane12 t1_isp6bhm wrote

You are bang on the money.

To assume technological progress is going to stop or significantly slow is to deny all recorded history.

These models might not be perfect right now, but give them 12 months and see where they are.

25

Desperate_Donut8582 t1_isq8idl wrote

I don’t think you can predict if it will slow down or not the same way people didn’t expect fast technological growth in the late 1800s

3

jawfish2 t1_isqm2tl wrote

Some counters to your futurism:

  • I am old, like really old, so I heard this in the 1980s. That has led to some skepticism.
  • I have said things like this to friends in the 2010s. Oops.
  • I think you over-estimate how technically well-defined most non-service jobs are. Most jobs are about meetings, paperwork, and a lot of stuff that's hard to quantify. It is not that people are such geniuses, it's that social connections are important at work. That's one reason management is leery of remote work.
  • Successful machine learning projects have well-defined answers and can be tested with automation. Exceptions: writing and graphics. Most problems do not have answers like that.
  • Simple robots like pizza makers and french-fry bots are just getting cost-effective.
  • By definition, successful publicized AI projects are cherry-picked problems well suited to machine learning. The glaring exception is self-driving: well-publicized and behind schedule.
  • Writing code: again, I played with so-called 4th-gen code generators around the year 2000. They went nowhere (I did get a great job though). I think defining what you want in software is best done with code itself; it can't be done with abstract, simple text statements. It is the only way to be specific enough. OTOH most software projects fail, and faulty requirements are a big part of that, so maybe hit-or-miss code generators might do as well? /s

However,

Translation, search, speech, math, Go, protein folding, graphics, text generation, etc. are triumphs of engineering, and I look forward to new stuff.

20

GenoHuman t1_isrq398 wrote

Okay, but what if I could use a neural network from home that allows me to write any software I desire? Why would I want to hire other people at that point? All I need is a cup of coffee and an NN to do the hard work for me.

Meetings, paperwork, etc. are inefficient; you want to remove those components.

6

Relative_Purple3952 t1_isrzk9f wrote

Which isn't how the real world works... for whom are you going to write your automated gitpilot code then, and how do you even know what function your code should fulfil? There will always be meetings and legal and regulatory paperwork you have to deal with.

2

GenoHuman t1_iss3wrx wrote

So it will be a client-to-neural-network (NN) feedback loop. You have a goal in mind; by specifying this goal to your NN, it will interpret it and create the functions required to reach that goal.

If it doesn't turn out how you wanted, you can simply comment on that and it will change it accordingly. You are not doing this for anyone beyond yourself, so there isn't a need to have meetings or legal and regulatory paperwork with anyone else. AI-generated art has already made this clear: people can generate whatever they'd like on their own local machines, which means regulations etc. are irrelevant from that standpoint, because nobody knows it even exists except you, the one experiencing it.

You become both the creator and the consumer; the middlemen (developers, artists, filmmakers, etc.) are removed from the process.

3

sharkinwolvesclothin t1_isptt3c wrote

You're not crazy, but you do both overestimate and underestimate the speed of change and type of change at the same time.

Sure, if a true singularity happens and a general-intelligence machine that can do anything appears, we'll pretty much have manna from heaven; society will be rethought, who knows how. You can argue for both dystopian and utopian scenarios. But thinking about which jobs are safe is kind of irrelevant; it'll change everything.

If it's just incremental advances, it'll be more like how job markets have dealt with technological advances before. Programmers will be checking regulatory compliance, fine-tuning something, or just dealing with legacy stuff (maybe the chatbot can eventually refactor the Fortran code base from the 70s). I'm not sure artists are exactly thriving money-wise now either, but there would probably be demand for social/performative in-person work. Not every medium is moving at the same speed (sure, you can make a great DALL-E image that looks like a photo of an oil painting, but an actual oil painting will not follow quite as immediately, and some people like physical art), and artists are well positioned to become prompt artists for digital art. Or a million other options; it's not like I know exactly what will happen in different industries, just like people did not know how things were going to evolve in previous technological revolutions.

14

Desperate_Donut8582 t1_isq8q6y wrote

General intelligence doesn’t mean it can do anything atleast doesn’t necessarily mean so and also to think either we will be utopia or dystopia is also extreme polar generalization

2

sharkinwolvesclothin t1_iss4h4v wrote

Agree on both points, I wasn't quite clear. I was trying to argue that if we do get singularity-type AGI, a machine capable of replicating human thought and communication, we will build an endless amount of them, and everything about society will change. You are right, it's not necessarily dystopic or utopic, but it will be different enough that trying to choose a future-proof job is close to useless.

And if we don't get that, and we "just" get amazing tools, I would assume jobs will adapt. Actually, if some fields get more AI tools than others, those fields might grow in the number of people working in them, just in new AI-adjacent jobs we don't recognize yet.

1

whatTheBumfuck t1_issggxp wrote

I almost sorta think the real human artists will become even more valued when cheap ai art becomes so ubiquitous. Think mass produced vs handmade. I personally know many successful potters who make their living throwing pottery the old way.

0

kmtrp OP t1_isvbhmv wrote

People kept painting portraits after cameras were invented, sure, but only 0.001% could keep making a living from it, while 99.999% were suddenly out of a job.

Because the group that wants "whatever works, as long as it's cheaper, faster, etc." is ginormous compared to the group that wants "it made by a human regardless of time, price, etc.".

In short: most demand disappears.

3

whatTheBumfuck t1_iszb842 wrote

In a post-scarcity world, not really. Everyone will have a box that can generate objects out of air and soil or whatever matter you have lying around. There will be no companies that mass-produce stuff. It'll just be assemblers all the way down. So your options will be to get the assembler version or get the handmade version. Money won't be a thing, but status and prestige will become the main "currency" used to "purchase" time shares of a person's or entity's attention. The ultra-rich today already work this way. When you have billions, your scarcest resource is your time and attention.

1

NoRip7374 t1_isqb1a1 wrote

Exactly. I'm a programmer too. I've been in the industry for 15 years and these advancements are scaring me. I was making the same points you are making to people here on Reddit and in the office. Barely anybody is concerned. It's freaky.

Look at cscareerquestions: somebody made an argument about being concerned about the future with GitHub Copilot being out, and everybody was basically saying Copilot is a gimmick and coding is much harder.

But the reality is that Copilot is amazing and coding is not so hard.

It looks like programmers are used to being irreplaceable and expensive, not to mention praised for how smart they are. And now they cannot comprehend that they are going to be replaced IN 5 YEARS if current progress continues.

Or maybe the majority of humans are programmed to downplay AI progress, so we serve as an ideal AGI bootloader. Don't know what we in this sub are, then. 😀

12

Clean_Livlng t1_iss3gy6 wrote

>And now they cannot comprehend that they are going to be replaced IN 5 YEARS if current progress continues

Do you think you'd be able to take over the work of a good number of your colleagues with the help of AI in 5 years? That's all it takes for massive unemployment to happen. AI doesn't need to be able to do a job to replace jobs, just help a few people do the jobs of many.

5

Thelmara t1_ist9knm wrote

> And now they cannot comprehend that they are going to be replaced IN 5 YEARS if current progress continues.

Heard that before. Still have a job.

1

NoRip7374 t1_isu7g2d wrote

There was neural network hype in the 70s and 80s, up until the middle of the 90s. It failed. Then we had 4GL languages, which basically didn't deliver new value, and stupid things like UML, SOA, BPEL, entity modeling, ... which basically just shifted complexity to higher layers and created another layer of complexity in software systems.

Now we have no-code platforms, which seem quite limited.

Codex is a game changer. You can feel it when you use it.

2

Thelmara t1_isu7qsu wrote

> _____ is a game changer.

Oooh, never heard that one before!

1

Background-Loan681 t1_ispb1sy wrote

Why bother telling everyone? Keep the information to yourself and rise quietly up the ranks. Why force unwilling people into your safety boat when you can have all the space for yourself?

I stopped bothering to warn people about AI after my artist friend scoffed at DALL-E 2, saying that it could never make decent anime faces.

Now I'm laughing hysterically in NovelAI

At any rate, just smile and wave, and prepare to say the words:

I told you so...

9

SnowyNW t1_ispldxt wrote

What’s the point of warning people? What can they do about it?

7

kmtrp OP t1_isviofc wrote

It's not that I want to warn people, it's just how oblivious they all are that I find shocking. It's puzzling to me.

1

Ortus12 t1_ispg7ix wrote

AI isn't completely eliminating very many fields. What it's doing is having a slow erosion effect on those fields, replacing the lower-skilled people and certain tasks.

I believe someone with high enough skill in art or programming, who's able to adapt, will be able to survive and feed themselves until AGI arrives. Learning how to manage the AI systems related to your field can also improve your chances.

7

Accomplished-Wall801 t1_ispwcxr wrote

You know, I used to think so, but right now, say by next year, I will be able to afford to let go of a designer and use AI illustrations instead, but I will not be able to afford to let go of a janitor.

11

Redditing-Dutchman t1_issto2s wrote

That's the funny part. People were thinking low-wage jobs would go first. But it turns out it's MUCH harder to have a robot physically clean or repair something than to automate something that can be done purely with software, such as design.

I don't think we'll have robots unclogging the drainage pipes on the roof of your house anytime soon, for example, since every roof is different, every day has different weather, and every area has different grounds/gardens.

3

Ortus12 t1_ispzfo5 wrote

A quality artist who's flexible enough to take on whatever the client's needs are has a fundamentally deep understanding of the world, human beings, physics, anatomy, psychology, storytelling (thus having a full, deep understanding of reality), etc.

For an AI to replicate that, it would have to be an AGI.

Working with AI artists is gambling. You're hoping that whatever you ask of them, they will be able to do it well and to your specification. They can NOT do most tasks well.

If we were to compare AI artists to self-driving cars, they are less capable than self-driving cars were 40 years ago. Self-driving cars 40 years ago could drive on roads and stay in the lane, which is most of driving. AI artists can draw humans in a very limited set of poses from a very limited set of angles. They fail at anything they haven't had a massive number of data examples to learn from.

Self-driving cars have also been 5-10 years away from full autonomy for over 40 years, as presented by the media and researchers in the field. It will be the same with any "AI-hard" problem. They won't be fully solved until we have AGI, which means the best humans in those fields will still be able to find work.

As for your janitor, robots are coming for that job. Maybe not next year, but Google, Tesla, and other labs are making significant progress towards a robot that could automate it. These robots can visually identify objects, pick them up and use them, and do complex reasoning and problem solving about their environments. It's the same story though: they won't be able to solve every problem a human janitor could, so the best janitors will be able to keep their jobs until AGI, but there will be fewer janitors overall.

We already have robots that vacuum, clean, and mow lawns. So robots are already eating into that field.

2

overlordpotatoe t1_isqhwyc wrote

Yup. AI art is fun and decently good, but currently it's hard enough to get anything usable out of it when you're using it purely for hobby purposes. It's not up to professional standards. It's getting better fast, but I think some of the hurdles ahead of it are bigger problems that won't be resolved just by doing what's being done now but a little better.

3

r0sten t1_it1kiu1 wrote

> A quality artist who's flexible enough to take on whatever the client's needs are has a fundamentally deep understanding of the world, human beings, physics, anatomy, psychology, storytelling (thus having a full, deep understanding of reality), etc.

That's not the dude you're hiring on Fiverr though. I saw a very nice poster for a local event recently; it looked very polished, very professional. Then I looked closer and saw that all the characters' heads had been lifted from various cartoons. An AI image generator might've done the same, but in seconds and for free or a nominal fee.

2

Ortus12 t1_it2ilot wrote

Very true. AI is very disruptive. I hope people can find new jobs.

1

HyperImmune t1_isq7jln wrote

The fact that the US is, almost on a weekly basis, restricting what technology hardware can go to China to hinder their progress in AI is a big sign to me that things are really taking off.

7

kmtrp OP t1_isvai6i wrote

Didn't know about this, but it sounds great. Can you cite sources? I can't imagine what computer hardware China can't get.

1

FranciscoJ1618 t1_isqzvzs wrote

I'm also a programmer. Without AI, the forecast was that most programming in 2025 was going to be low-code or no-code. If you add AI to the mix, the market will suffer a sudden shock, probably before 2025. New things like DevOps will also be replaced by NoOps/AIOps. I think we should try to make as much money as possible right now before it's too late. Programming will become very niche and salaries won't be high; e.g. right now quantum programming pays much less than web dev, because there isn't much demand for it.

Most people currently starting a software-related career at university will never get a coding-related job, and they should join the market right now as juniors if they want to ever experience what working as a dev was like.

Teaching children Python is complete nonsense and a waste of time.

I think Business Analysts will just write the requirements and the AI will generate the software.

The people I know don't even consider that JavaScript could be replaced by TypeScript or WebAssembly. AI replacing programmers is completely outside their imagination.

Btw, I think the next victims after programmers will be other engineers, like electronic or electrical engineers.

Programming is not the future, it's the present.

5

Bierculles t1_iss491o wrote

I am an engineer, and I am currently wondering when I will be replaced. Though training data for engineering is going to be an issue: the companies sure as hell are not going to just hand over all of their data, so what would you even train the AI on? With coding that is not a problem, because of sites like GitHub.

5

kmtrp OP t1_isvj75m wrote

The companies with all the data are the strongest AI players.

2

Bierculles t1_iswuths wrote

Is it enough data though?

2

kmtrp OP t1_it045o4 wrote

Well, if you are pointing to "the more data to train on, the better", then I've heard the answer is transcription of all video/podcast content (think YouTube) and synthetic data.

1

DorianGre t1_isrbqd7 wrote

I've been a software developer for 28 years. The day an AI can make something that synthesizes differing requirements for the same application from 10 different stakeholders and makes them all feel like they won, I can happily quit. Until that happens, I'm pretty safe.

Also, I'm getting a CS master's degree in AI just to be sure I am the one making the shovels, not the one digging the hole.

5

kmtrp OP t1_isvh26e wrote

At the beginning, you'll interface with stakeholders, and instead of having a team of 12 coders, database experts, graphic and sound designers, etc., it'll be you and one good model producing like Greek gods.

Not long after that, you'll be replaced as well.

4

GenoHuman t1_isrqejy wrote

We are all digging the hole, anything else is an illusion, you have no power in reality, you're a slave to the state and you'll be a slave to AGI. That's inevitable.

0

TheSingulatarian t1_issy6bi wrote

Capitalists control the state. It is their instrument. You are a slave to the capitalist and have always been a slave to the capitalist.

1

GenoHuman t1_ists3bq wrote

I think the state is more accurate, since it's universal; capitalism isn't.

1

ihateshadylandlords t1_ispftlq wrote

Wake me up when companies either stop hiring programmers and/or lay programmers off en masse.

I don’t disagree that automation will eventually affect all job sectors. It’s just a matter of when it will happen though.

3

NoRip7374 t1_isqcmf2 wrote

What is cheaper: a senior programmer, or an intern with an AI tool that costs $10 and can code better than the senior?

2

ihateshadylandlords t1_isqix18 wrote

I’m not arguing value. I’m just saying no one knows when these mythical revolutionary products will actually be in production. Could be soon, could be decades.

3

kmtrp OP t1_isvixyk wrote

Have you seen OpenAI's Codex? It's not going to be even 5 years.

https://pbs.twimg.com/media/FeL4y_9WIAAxaRX.jpg

1

ihateshadylandlords t1_isvo0lm wrote

Those are just papers, I don’t think there’s any correlation between papers and deployment of products.

1

kmtrp OP t1_isvqlib wrote

Models are being pumped out at the same rate.

You didn't answer.

1

ihateshadylandlords t1_isvsw22 wrote

Answer what, if I’ve seen Codex? I’m not a software engineer or a programmer, so I have no reason to interact with it. Also are models being pumped into production? To me, there’s a gulf between proof of concept and mass production.

1

therealzombieczar t1_isrdfz5 wrote

Whoever owns the code will reap the rewards...

Everyone else will have to find something else to do... unfortunately, as AI gets better at anything, it gets better at everything... we could literally be seconds away from a self-writing AI... if your job can be done on a computer, it can be done by a computer... without you...

Then robotics designed by AI will close the gap on manual labor skills... outmoding nearly every job on earth.

3

Bierculles t1_iss5lby wrote

Robotics will be a really interesting one. Robotics is a hell of a lot harder than most people believe. We still have a very long way to go before the first general-purpose robot can even attempt to replace humans. Building specific, optimized robots is currently way more efficient and doable.

2

AdditionalPizza t1_isycb3u wrote

Just saw this post now; it was posted the same day as mine here asking specifically about programming. A lot of the answers seem to suggest there's nothing to worry about within the decade, or longer.

I was asking whether it's basically over for anyone looking to get into entry-level careers in programming/web dev. Like, is it worth it to start learning now to find a career in it a few years from now? I made the comparison to graphic designers not worrying, then suddenly being outraged, but programmers say their work is much more difficult for an AI to do.

I think this will happen for everyone though, because they take pride in their careers and the ability of AI to do it more efficiently hits a nerve and creates denial. This is going to happen with every sector.

Personally, I think programming should be the main focus. Automating it, especially fully automating it, will accelerate every other sector. The faster the transition, the greater the sense of urgency placed on governments, and the softer the landing in some form of UBI or whatever this revolution brings. I think there are some very intelligent people who have planned this to reduce suffering among the population. I hope so anyway, because governments will drag their feet to avoid making decisions.

3

kmtrp OP t1_isz8lxd wrote

I agree completely. I cannot understand denial of the obvious, it's right there ffs.

2

crua9 t1_isqe4fq wrote

I've seen something similar when people say it can't do everything they do, or isn't on the same level.

They are comparing it to what they know: regular software, which doesn't get better over time without major updates. In reality, it needs to be looked at as a generational thing, where each generation gets far better and can do more. And AI generations can last weeks to months, or, with enough data and use, days. Just look at image recognition two years ago versus now.

Some accounting firms are already comparing their workers with AI: when the output is on the same level as or better than the worker's, the worker is at risk. And until then, the AI is being trained by the worker simply doing their job.

AI is even being banned in some courts because of how good it is at finding ways out of parking tickets and other things.

2

supermegaampharos t1_isr0juq wrote

You should listen to what these subject matter experts have to say before immediately attempting to refute them.

I'm not an artist, for example, and I know almost nothing about art. When an artist tells me what their work is like, I listen to what they have to say because I don't know any better. If an artist tells you their job isn't as automatable as you think, that's worth listening to and digesting. They might ultimately be incorrect, but their perspective is valuable and can help you form a fuller picture of what the future of their industry will look like.

2

Bierculles t1_iss5frk wrote

The artists could be the people who use the AI; making an AI do what you actually want is a lot harder than it looks, especially if you need something specific.

1

kmtrp OP t1_isvicfb wrote

I am an expert on this subject matter, and know what's easily automatable and what's not. And very soon 9 out of 10 programmers, who focus mostly on implementing features, squashing bugs, etc., will be completely superfluous.

Those 90% of people with generally high-paying jobs are completely oblivious to the fact that a freight train is about to ram them; I can't believe the denial and self-delusion of most of them. It's insane.

And people tend to overvalue what they do and truly believe they are indispensable when very often they are not, so be careful with an expert's opinion about themselves, too.

1

supermegaampharos t1_isw2ot1 wrote

>I am an expert on this subject matter, and know what's easily automatable and what's not.

You might be a SME for automation, but that doesn't mean you're a SME for art.

My point was that other people have knowledge and expertise you don't. That means you should be having a conversation with them, not condescendingly lecturing them about what the future of their own field is like.

You might be 100% correct that the person you're speaking to will be automated out of a job in 10 years, but given how you describe these people as "overvaluing" themselves and "being in denial", you're likely talking at them about automation, not to them. Of course somebody would be on the defensive when you enter a conversation assuming you know the future of their life's work and believe they overvalue themselves.

1

kmtrp OP t1_it06cxa wrote

Like I said, the people I was talking about work in my field of expertise. I know all the minutiae they know, and they are in complete denial (most of them, not all). They can't see it yet, but it's going to hit like a ton of bricks.

1

Bierculles t1_isrtt3u wrote

We should start a new field of study that I think we will really need in a few years: communicating with AI. A fancy-schmancy AI is nice and all, but making it do the thing you actually want could become a pretty huge hurdle, especially when you want very specific and complicated things.

And who is better at this than the people who already work in that field? Trying to make good art with Stable Diffusion made me realize pretty quickly that creating actually good images is a lot harder than one might think. Having knowledge about art, styles, and how it actually works helps a lot. A layman doesn't even know what the right question is.

So now, for a lot of artists and whatever fields get hit next, it could quickly become a question of who adapts the fastest to the new tech and uses it effectively. Though this only applies to narrow AI; AGI is going to throw everything out the window, but that is an entirely different beast we will need to tackle.

2

TheSingulatarian t1_issvihv wrote

Denial of impending catastrophe is a trait most humans seem to possess. Most people cannot project out past next weekend. It is why most people are poor.

2

Myrddin_Dundragon t1_ist123d wrote

I love the current pace of technological advancements. I wish it was faster. However, what we have now are expert systems. Great at learning how to mimic or slightly reason out one task, sure. But we are nowhere near general intelligence yet. Until we get there I'm not all that worried.

2

ElectronicLab993 t1_ispnkrz wrote

Yeah, but what can you do, apart from becoming an ML expert? At my company (XR for industry and services) we use the novel AI models, and we teach our artists skills similar to art direction, as well as selling and design techniques, to make them more holistic experts at designing the apps. But how much good can that do? What would you do?

1

NoRip7374 t1_isqcs9c wrote

Interestingly, ML experts are also putting themselves out of work, with tools like AutoML.

2

FranciscoJ1618 t1_isr0x7e wrote

ML experts are just expensive Python programmers unless they have a PhD in advanced math.

2

datfixinboy t1_isrdior wrote

I'm not an artist, but I did find myself reading and re-watching Bicentennial Man, and it got me thinking that an AI that can take an input and produce an output doesn't constitute art to me. Hell, it's not art even when a person sketches and traces over another piece of art and adds additional details, or uses Photoshop to quickly edit an existing photo.

When it can make art on its own, without any prompting, I'll consider it art.

1

TheSingulatarian t1_isszorh wrote

Humans can't make art without any prompting. Even Michelangelo, if he had never seen a drawing or a sculpture or anything from nature, could not have produced art.

All artists are influenced by other art. AI artists will get much better over the next ten years, to the point where it will be impossible to tell whether a piece was produced by an AI or a human.

5

kmtrp OP t1_isvjx3v wrote

If I show you a drawing, painting, sculpture whatever... and you think it's art, it's art.

Otherwise what you are saying is we can't decide if something is art until we know if it comes from a meat brain or a chip brain?

2

datfixinboy t1_isyex54 wrote

If it decided to make a crayon doodle of a flower of its own volition, because it wanted to, I would consider it art.

1

kmtrp OP t1_isz89v1 wrote

You can code it that way, just as we are coded. It's the same, man.

1

datfixinboy t1_isz8gis wrote

Is it? Can you code brain chemistry with Python?

1

kmtrp OP t1_it005yr wrote

Can we imitate the human mind? Yeah, we're already doing it, pal.

1

roundearthervaxxer t1_isscosd wrote

What jobs? Fine art? Concept art? Production art?

I can spend all day finessing prompts to come up with a character concept and not really get there, or I can draw it in half a day.

AI is an amazing tool and will always be stronger in the hands of people with an art education and training background.

1

kmtrp OP t1_isvk5k1 wrote

>will always be stronger in the hands of people with an art education and training background

What? How come? This is such a heated topic precisely because it doesn't require art-level skills to produce stunning results.

1

roundearthervaxxer t1_isw5jen wrote

Naw. First off, you are in the role of cinematographer and art director. When I look back at the art concepts I thought were cool when I was just learning? Forget about it. Practically none of what first-year art students conceive as cool is even remotely viable, and that is after 4 years of training.

More importantly, however, being able to render concepts directly based on a director's input is job one. AI art just doesn't work that way. You encourage it to do things and it pretty much goes off on its own.

Show me amazing mech concepts that are not derivative, clean and tight, with model sheets from the different views.

That does not mean AI isn't useful. It is an idea generator, and that is invaluable to me.

Fine art is trickier, but people are already growing tired of the Midjourney look.

Blending art styles is not innovating; it is deriving.

1

kmtrp OP t1_it04xnr wrote

All art is derivative. We tend to think too highly of ourselves. Humans don't have a magic "something" that is unattainable by other forms of intelligence or creativity.

AI is already upending these notions, and we've just scratched the surface. Get used to the idea that we are not that special and that AI does and will exceed all expectations, or you are in for a rough awakening.

1

roundearthervaxxer t1_it13lj5 wrote

ok. I will watch out for that "rude awakening." Thanks for your wise counsel.

When you can take an innovative piece of concept art off of Artstation and reproduce it faithfully, you are getting closer. We are really not there for that right now. For concept art for movies and special effects, we need much tighter control.

Also, mixing a bunch of art styles is not at all the same as innovating. Art defies imitation. That is what art is.

We will probably get there. I don't see this as certain.

...waiting for those amazing mech designs.

1

kmtrp OP t1_it4if8n wrote

Right now, today, no, we don't have human-level digital artists where you can converse with the model and it does what you want step by step, but we are very close, as in less-than-a-year close. Check this, which is for coding; now think of that approach with the art engines we have today, let alone tomorrow's.

You not only have to look at the present and add the current rate of improvement; the current rate of improvement will itself be demolished by tomorrow's. Essentially this. People have a hard time understanding exponential growth, hence the disconnect between the two groups of people.
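A toy sketch of that disconnect (the yearly doubling rate is invented, purely to show the shape of the curve):

```python
# With capability doubling every year, each year's gain exceeds the sum of
# every previous year's gains combined, which is what intuition keeps missing.
level = 1.0
for year in range(1, 11):
    gain = level            # this year's improvement equals all prior history
    level *= 2
    print(f"year {year:2d}: level {level:6g}, gain this year {gain:6g}")
```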


>Art defies imitation. That is what art is.

This is such anthropomorphic, delusional bullshit that I'm speechless. I wouldn't even know where to begin.

0

whatTheBumfuck t1_issg254 wrote

Last night I watched the season finale of She-Hulk, in which there was a large plot point involving AI-generated media. My gf turned to me and was like, "Ohh, that's like what you were talking about." It sorta seemed like she only then realized that this stuff is for real lol. Anyways, thought that was interesting. It's definitely seeping into mainstream culture. I expect people to take it more seriously the more that happens.

1

3Quondam6extanT9 t1_issrxuq wrote

Are you an artist or coder?

1

kmtrp OP t1_isvk73f wrote

Coder. So?

1

3Quondam6extanT9 t1_isvqag2 wrote

Are you afraid you'll be replaced by AI automation?

1

kmtrp OP t1_isvqdt9 wrote

Nope, can't be automated.

1

3Quondam6extanT9 t1_isvrtbi wrote

Then what's your point? You sound like you're Chicken Little, talking about how people are ignoring these signs of AI influence, yet you are saying the same thing you say they are saying.

3

kmtrp OP t1_isvtgl6 wrote

It's not the same; I genuinely can't be automated, and not absolutely everything is up for grabs. It's puzzling how so many people can't put 2 and 2 together; that's the whole point.

1

3Quondam6extanT9 t1_isw14d7 wrote

It's puzzling that you think anything will be fully automated.

1

kmtrp OP t1_isw1xss wrote

I can't give you a quick lecture on automation because this is just a thread on Reddit, but there are great places to get up to speed with current AI research and future scenarios. I like LessWrong, lots of info!

1

3Quondam6extanT9 t1_isw4682 wrote

I appreciate that, but you're not talking to someone who is absent of knowledge on the subject. I have a thick line drawn regarding AI, the singularity, and transhumanism.

My questions come about because I'm trying to decrease the reductionism that abounds in AI circles, where so many have a misunderstanding of how AI actually functions and will function, broadly speaking, across the spectrum of human fields.

For example, you seem to have enough understanding of the nuance within the coding industry to recognize your field won't be automated, yet have you asked yourself whether you understand other industries well enough to accurately project the influence of AI on them?

I think the biggest red flag is the example of the auto industry. Most people will use it as the prime example of how automation will supplant human involvement.
The truth is that the auto industry is not as straightforward as many think. Along with the intentional reduction of automation by some automakers, and smaller niche/custom builds, one finds that much of the auto industry is hardly standardized or free of human integration.

The point here being that no industry will be fully automated so long as humans exist under the umbrella of said industry. There are many reasons behind this, many of which should be obvious.

So it's still puzzling to me how there continue to be so many chicken littles thinking they understand AI and humanity better than anyone else. The nuance in both is very misunderstood.

1

kmtrp OP t1_it05umq wrote

>to recognize your field won't be automated

If anything, I said software dev is going to get heavily automated.

I'm not saying everything will be fully automated. What I am saying is that most people whose work is done on a computer will eventually be automated away. And programming seems to be one of the first industries.

1

Rogue_Moon_Boy t1_it1vxbd wrote

I mean, just look at self-driving cars: we're almost there, and a couple of companies already have limited driverless taxi services running in cities. Trucking companies have delivered goods autonomously over thousands of miles. Yet people think we are a decade away, and some claim it is impossible lol. They somehow think the progress will suddenly stop.

On the plus side, even if we had perfect AI today, it would still take years or even decades until whole industries are replaced, so most people are still safe for now. I hope we have figured out some form of UBI by then.

1

kmtrp OP t1_it4hpt3 wrote

Cars, trucks, and other physical targets will take some time, but think about remote jobs. Most of the listings on Fiverr and similar sites are going to be demolished.

All those people from first-, second-, and third-world countries who are studying web development right now, backend devs, database people... most people with low- to mid-level dev skills are going to bite the dust pretty soon.

1

Desperate_Donut8582 t1_isq8zp6 wrote

I mean, a lot of jobs are safe, like boxing or MMA or soccer players, or YouTubers who do dumb stuff. You can't be telling me someone will watch a robot doing dumb public pranks, unless we make a robot that does exactly that, which would be the dumbest thing we could do, and we'd probably deserve extinction.

0

GenoHuman t1_isrqngm wrote

Think outside the box: why would AI do any of those things? It can simply generate them, like it does with images or text2video. You can have actors in new movies, or boxing against each other, even though it has never happened in real life. It's all information at the end of the day, and no one will be the wiser.

7

Desperate_Donut8582 t1_issafqf wrote

I think you're imagining a worst-case scenario, which is fine, but there is a reason social media is called "social": humans want to communicate with each other, it isn't just about entertainment. For example, everyone thought live theater was gonna die out when TV came about, yet it's stronger than ever. Everyone thought big-screen theaters were gonna die out when personal TVs and streaming services were a thing, yet they still exist. I feel like you're just imagining a worst-case scenario where AI generates any image and people would rather sit in their houses and watch an animated person boxing another animated person instead of actually interacting with other humans.

1

TheSingulatarian t1_ist0kwc wrote

Movie theaters seem to finally be on the way out, at least for showing smaller, more intimate films. Every film has to be a roller-coaster ride to get people to theaters. Smaller romances, comedies, and dramas are all going to the streaming services. Live theater is a very niche business, mostly for elites and the upper middle class.

2

Desperate_Donut8582 t1_ist4lwv wrote

First of all, did you just say movie theaters are on their way out? A huge portion of movies are released to theaters, and people still go to theaters to socialize and stuff like that, despite TV being widely used.

And live theaters are definitely not something only elite people enjoy: there are like 2000 theaters in America, compared to like 40 decades ago.

You're obviously just predicting that people will choose to be losers staying home glued to a screen and not socializing.

1

TheSingulatarian t1_ist5kxz wrote

There were more than 40 theaters decades ago and box office is down. Check your facts.

2

Desperate_Donut8582 t1_ist5x79 wrote

I’m talking about live theaters in 1950s and before when tvs were still a new thing life theaters thrived but we’re small in number yet there are more than a thousands today….with your logic they should’ve closed down because tvs are a thing but guess what? They didn’t. That’s why you can’t just predict the future that easily

1

TheSingulatarian t1_ist7c2g wrote

Every city had at least one theater; NY alone has had more than 40 theaters forever. Going to the theater is an expensive proposition for many. Even movie theaters are becoming too expensive for many people.

1

Bierculles t1_iss5ofu wrote

Honestly, I would absolutely watch a robot dunking on randos in the street.

1

Desperate_Donut8582 t1_issa2q8 wrote

Pretty sure even boxers can't fight regular people, because their hands are legally considered deadly weapons.

1

Bierculles t1_isscb73 wrote

Maybe he just walks around and disses people.

2

Booboo77775 t1_iss7k5f wrote

You are completely delusional.

−2

Desperate_Donut8582 t1_isq81ma wrote

I mean, Midjourney exists now; does that mean every artist is now unemployed? No, both can coexist.

−4

kmtrp OP t1_isvjcx6 wrote

People who wanted their face on the wall hired an expensive painter. It was slow, tedious, and expensive. And then the camera was born. Cheaper, faster.

Did portrait painting vanish? Of course not. Were 99.998% of portrait painters suddenly out of a job? Yes.

2