
0913856742 t1_isyc0u2 wrote

I imagine part of the reason is that there has not yet been widespread catastrophic workforce disruption due to adoption of AI and related technologies, and part of it is that the advancements we were hoping for seem to be taking longer than we expected, e.g. self-driving vehicles. And so there is a perception that these technologies are still way off, or that they will affect only a few narrow industries, or that it's all happening in the background and isn't flashy, therefore it's nothing to worry about.

Kinda like climate change - maybe we can understand CO2 emissions and ocean currents and so on in the abstract, but hey, it's still snowing where I live, and I got bills to pay, so whatever, nothing to worry about, the penguins can wait. And then all of a sudden each passing year becomes the hottest year on record for some reason.

I imagine it's quite like that - it would take some widespread workforce disruption, where many, many people across various domains of labour lose their livelihoods and are unable to retrain, before the mainstream realizes we need to work on solutions, but by then the damage will have already been done.

For example, in my country, Canada, one of the biggest grocery corporations, Loblaws, announced earlier this month that they'll be testing autonomous trucks, and it doesn't seem to be talked about much in our media.

62

Neurogence t1_isyub9h wrote

Ray Kurzweil (arguably one of the main proponents of the singularity) recently stated AI will create more jobs for humans.

People love capitalism too much to imagine a world without work.

32

RavenWolf1 OP t1_isyyan8 wrote

I was pretty shocked that he thought that way too.

18

RikerT_USS_Lolipop t1_isz7uwu wrote

He is very much a "don't tax the rich, just grow the pie instead" type. Any time someone asks him about growing wealth inequality he falls back to that. So if your conclusion is that wealth equalizing is bad, then you're going to work backwards and believe that systemic failures of Capitalism don't exist, and how can you support that idiotic notion? By believing technology isn't causing the game to be continuously and increasingly rigged against the little guy.

It's a human response. And humans are kinda shit.

13

BearStorms t1_iszgn9o wrote

Yep, me too. It is an obviously wrong argument anyway IF you believe the singularity will happen at some point. With superintelligence on tap, humans will be just a bunch of moody toddlers in comparison. Why would you let us do anything at all?

11

RavenWolf1 OP t1_it00blz wrote

I think he said that because he works at Google. It seems like all tech people say the same thing. Maybe they are scared that people will start to blame the tech giants if they say that AI will take jobs. That is the only reason I can think of why all the tech people say this.

7

kmtrp t1_it03qv4 wrote

Yeah! I saw OpenAI's CEO say the same crap, something like "this will augment productivity, it'll be a companion to all developers...". Man, you are talking about software that can code without a human! WTF?

So I am shocked and disappointed at the lack of honesty. The people working on these projects know that speech is full of shit, right?

3

BearStorms t1_it0ar10 wrote

>this will augment productivity, it'll be a companion to all developers...

Well, it will, only now you will need 1 dev instead of 100. Ask illustrators in like a year or 2...

FML, I thought software development would be the last job to go... I may be sooo wrong (I'm a dev).

4

kmtrp t1_it4khep wrote

Same thing for me, a former full-stack developer. Isn't it crazy? I mean paintings and drawings, and freaking programming? Especially given the state of front-end development? Incredible times.

1

BearStorms t1_it4m4wv wrote

Honestly, we'll see. Image generation is a problem where even a very imperfect result is perfectly acceptable. Coding is a much harder problem, and then you have to remember all kinds of regulation, etc. But it's coming for everyone eventually. Ironically, the physical blue-collar trades working in very heterogeneous environments, like plumbing, are probably the safest...

1

FomalhautCalliclea t1_it4psvw wrote

Especially since Sam Altman (OpenAI's CEO) has been quite open and outspokenly extremely optimistic on tech progress, talking about things like "free energy" (fusion) and AGI soon, more or less.

He also spoke about UBI and a need to radically change our economy. I wonder if he (and others) have multiple opinions and faces that they show selectively depending on the context.

1

kmtrp t1_it4q6bf wrote

Most probably, it's an obvious CEO trait too.

2

FomalhautCalliclea t1_it4qpv2 wrote

I hope it's a "Charisma -100 / Perception +100" rather than "Charisma +100 / Perception -100" character trait.

1

BearStorms t1_it0aiu2 wrote

I think you are right and that makes it even scarier...

1

haptiK t1_isz4lqi wrote

> Ray Kurzweil

why does this guy's website suck so badly?

5

iNstein t1_iszv3yc wrote

Ray is right about AGI/ASI and the singularity. Beyond that, I consider his work to be self-serving and horribly wrong. Fortunately none of this relies on Ray, so we will get our new world regardless of his poor mid-term predictions and misguided ideas about the society that will result.

3

Sashinii t1_isz8ahd wrote

Well said. There's no way there'll still be any economic systems post-singularity.

2

Sotamiro t1_isz9j4a wrote

They will still exist... in my simulations

11

RavenWolf1 OP t1_it00lpm wrote

Exactly! Like in games. I love all those strategy games and city building games. Those have to have economies!

5

Bakoro t1_iszogjh wrote

Economics will exist as long as there are people. Scarcity will always be a thing, it's essentially a law of the universe.

There is only so much beachfront property, only so many houses with an ocean view, only so many people who can live on the top of a hill.
One way or another there will have to be a way to decide who gets what limited resources, and who gets the new things first.

Even if you just make everything timeshare, so everyone takes turns with exclusivity of a thing, some people won't care about one thing but will want more of their favorite thing. Some things will be more popular.
"I'll trade you my week in Maui for a day in the glorgotron" you'll say, and I'd be like dang, that's a good deal, the glorgotron gives me a headache anyway...

It's just a matter of what people value, what people want exclusive access to, and what is limited. If nothing else, people's time will always be somewhat valuable into the distant future.

4

RavenWolf1 OP t1_it01dni wrote

>Economics will exist as long as there are people. Scarcity will always be a thing, it's essentially a law of the universe.

An economy, sure, but money isn't necessary. Economy does not mean money. But I agree: as long as humans value something, we create value around it. In human society something is always valuable, like beauty or friends etc. Value is what gives us standing in society. We always have something which differentiates us from others. We give value to things which others don't have.

Sure, we can have infinite energy and resources, but there will always be something that creates hierarchy in our world. We live in a society after all.

2

Bakoro t1_it1bxzb wrote

>Economy sure but not money necessary. Economy does not mean money.

Money is a useful abstraction for value. How many chickens to a television, and how many televisions to the beach house, is a hard problem.

If you have resource tokens, it's basically the same thing: the right to requisition x food resources, y labor resources, and z land resources. Anything fungible which replaces direct barter ends up being similar.

If humans are to still exist, they'll have to be part of the equation in terms of directing the AI. Like, who decides what the AI spends its discretionary time on? If the AI doesn't have its own motivation and interests, and otherwise just allocates resources to human requests, that allocation can be a kind of money in and of itself. Start off giving everyone an equal share of AI requests; the requests which generate the most positive feedback from the community yield more time to the person or group who made the request, and people can trade AI time shares just like money.

I personally like the resource allocation model. It's basically money, only it ties value to quantifiable things. That's only viable when everything is highly mechanized and the energy and time costs are highly predictable, like a society run mostly by AI.
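The allocation scheme described above (equal starting shares, feedback-weighted rewards, tradeable AI time) can be sketched as a toy model. All names, numbers, and the feedback rate here are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class Citizen:
    name: str
    ai_hours: float = 10.0  # everyone starts with an equal share of AI time

def reward_feedback(requester: Citizen, upvotes: int, bonus_per_vote: float = 0.1) -> None:
    """Requests the community likes earn the requester extra AI time."""
    requester.ai_hours += upvotes * bonus_per_vote

def trade(seller: Citizen, buyer: Citizen, hours: float) -> bool:
    """AI time shares can be traded just like money; fails if the seller lacks the balance."""
    if seller.ai_hours < hours:
        return False
    seller.ai_hours -= hours
    buyer.ai_hours += hours
    return True

alice, bob = Citizen("alice"), Citizen("bob")
reward_feedback(alice, upvotes=25)  # alice's request was popular: +2.5 hours
trade(alice, bob, 5.0)              # alice sells 5 hours of her share to bob
```

The interesting design question this exposes is the `bonus_per_vote` rate: it controls how quickly popular requesters accumulate AI time relative to the equal baseline, i.e. how unequal the system is allowed to become.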

1