
0913856742 t1_j0ccz7b wrote

Don't be obtuse. It doesn't need to rest on the 'mystical nature of AI' - it's already happening now.

What happened to all the factory workers in the American Midwest when those manufacturing jobs got automated and outsourced away? We saw a massive increase in drug overdoses and suicides. What would the correct action have been there? How about professional drivers killing themselves because they can't compete with Uber?

It is a very hard sell to tell someone who has been working in a profession for decades to just learn to write code, or to retrain for something else in order to stay competitive in the market. Job retraining programs aren't a guarantee either. How would you like it if you were on the cusp of retiring, but because your job was outsourced or eliminated by market forces, everyone just told you: suck it up, buttercup, go learn how to do something else?

There are myriad examples like this that have happened and continue to happen without the need for an all-powerful AGI. All you need is market forces that seek to maximize profit and minimize cost. My stance is that the free market is a dehumanizing machine that always demands more, more, and more in exchange for the privilege of just existing.


j_dog99 t1_j0d8epv wrote

Maybe you are missing my point, although it's not a peachy one. I'll agree that UBI is an important stopgap as automation takes over many jobs. But in the long run the most important adaptation will be a cap on procreation and a massive reduction in the human population. Under market capitalism and communism alike, the driving force behind the human economy has been population. If we are to evolve beyond frogs, we will need fewer tadpoles.


0913856742 t1_j0dc8j3 wrote

I don't believe I am understanding the argument you are making.

The original article discusses the problem AI poses in a capitalist system - namely, that if it gets good enough to eliminate jobs, people won't be able to make the money they need to survive.

My original comment was expanding on this point: capitalism pushes us to view ourselves through our economic value first, instead of as human beings with intrinsic value. I argue that we should build systems - capitalistic or otherwise - that allow us to see ourselves as more than just economic inputs.

So it is in this context that I am confused about where your argument fits in. As I re-read your posts, I believe the point you are making is that an ever-increasing population creates the demand for people to sell their labour, and that if we ever want to evolve beyond merely selling our labour to survive, we need fewer people, so there would be less market demand for the production of goods and services? Something like that?


j_dog99 t1_j0dgfc9 wrote

My argument is very simply that your 'cultural shift' to embrace the value of the individual is empty. The value of the individual derives from the useful work they contribute to the collective, and the main function of the market-driven economy is to reward that value. With AI and automation, that value has been watered down, and the only way to restore balance is to decrease the population to a number appropriate to the demands of society. Having a bunch of people 'just living' and being paid to breed more of the same? Sounds like a bad dream, not a cultural improvement.


0913856742 t1_j0dhfgw wrote

I believe I see the crux of our disagreement: you do not believe that human beings have inherent value. Would it be fair to say that if someone did not contribute to the collective - let's say, because they couldn't find gainful employment - then this person would have no value?

If that is the case then I believe that we disagree fundamentally. It is my belief that all humans have intrinsic value, and it is up to us to build systems that allow all people to flourish no matter who they are or where they come from.
