wastedtime32 OP t1_j8trcmc wrote

I don’t want a static world. But even at my young age I’ve become jaded, and I know how this technology will be exploited. The vision of those who are creating it will not be how it turns out. The “problems” you refer to have a lot to do with modern technology. I’m not necessarily a decelerationist, but I don’t see how diving in even deeper is going to help us. I agree this is a reckoning point in human history, but I think we need to STOP going in the direction we have been and find a new one. AI is the next step on that road and I see nothing but trouble. It all seems so misguided to me. But then again, all this tech is simply the product of market competition. It’s designed for a certain task; that’s in its nature. I say fuck that nature, we need to embrace the real one. That doesn’t mean primitivism. It means we use tech to pursue human desires, but within an ethical framework compatible with the natural world.


wastedtime32 OP t1_j8ri4rq wrote

Idk dude. Seems like a lot of people on this sub (and subsequently, in the tech world, at the forefront of these technologies) look at AI as a means to completely optimize all human functions and strip the meaning out of everything. Seems to me a lot of these people feel alienated in modern society, and the way they think it will get better is by taking all the things that alienate them and making those things even stronger. Like the way Musk and SBF say they’ll never read a book because reading is useless. The game of life has already lost its meaning to so many in the modern age, and people who can’t see WHY wrongly think that accelerating into an even more alienating world is the answer. If it was up to the tech gurus, we’d all be neurologically implanted into the blockchain and never do anything other than optimize optimize optimize, produce produce produce.

There is a reason most people at the forefront of these technologies tend to be neurodivergent. This is all just a capitalist wet dream, and soon enough all ethics and regulatory practices will be seen as enemies of the doctrines of “efficiency” and “optimization,” serving no purpose, and will be ignored. People here love to paint me as some weird traditionalist religious conservative-values person. But they are so unaware of how they worship the idols of optimization and utility and efficiency. Those should be the goals of systems that supply human needs, yes, but they have their place. The idea that once we reach post-scarcity those in charge will have ANY incentive to help us in any way is entirely insane. It’s “evolution”! Following the same biological tendencies, AI is giving those in power the means, and the reason, to completely annihilate all others. And people here really think it will HELP us😂😂😂.


wastedtime32 OP t1_j8pajiu wrote

Yeah, this is exactly my fear about AI; I’m just not good at articulating it, so everyone here thinks I’m saying I want everything to stay the same. With the impending ecological collapse and resource depletion, and the fall of globalization and the inevitable rise of fascist-adjacent, chauvinistic, isolationist, hyper-militarized states, this is about as bad a backdrop for introducing AI as I can imagine, but then again I’m not sure there will ever be an “optimal” circumstance for it. I do think this will all culminate in either a massive revolution or a dystopia; I just don’t see an in-between. If capitalism prevails into the post-scarcity world, we will be looking at the dystopia many people here have confused for utopia.

Totalitarianism is coming soon, and to me AI is a vessel for it. It is the key to that door, and it is already in the hands of the ruling class. There is a reason transhumanist and bioengineering ideas are more prevalent amongst the elite (think WEF): they know damn well most people will think of them as a means to accomplish equality and peace, but that is far from the case.

I guess from this post and the replies I’m learning that most AI enthusiasts here have a hard-on for utopia and an ignorance of the political/economic implications. These reactions are exactly what the big tech developers want. Complacency. Surrender.


wastedtime32 OP t1_j8p6c9b wrote

This to me really ignores all the other influences capitalist competition will have on this hypothetical world. AI will be better at art than us. AI-generated art will sell better than human art. There will be little incentive to do anything other than consume. I doubt that in this hypothetical world it would be made at all accessible to pursue things which might remind humans of distinctly human abilities and feelings, in a world where our time, rather than being freed up, will be dedicated to consuming instead of producing. The ruling class will never let us have free time unless we are producing for them in some way.

Post-Scarcity Capitalism is a dystopia. There’s no way around it.


wastedtime32 OP t1_j8p3dn1 wrote

This scares me even more. A utopia is impossible. The ruling class will use AI as a tool to in fact impose more suffering on the rest of us.

I don’t think a truly objective, all-knowing AI is possible, because objectivity doesn’t truly exist; truth is a construct. It scares me that people will worship AI under the assumption that it has no biases, whether ones it developed on its own or ones imposed upon it by its creators.


wastedtime32 OP t1_j8p2ch8 wrote

I have a question for you. If I want to live on a farm and raise a family and work and make things for myself, and I’m not restricting anyone else’s ability to do whatever they want, should I be allowed to? Or should I be forced to commit myself to this new utopian world? If it’s a utopia, shouldn’t everyone be able to do exactly what they want to do?


wastedtime32 OP t1_j8p0egt wrote

I understand what you’re saying. But I just don’t have faith in governing bodies to properly regulate it (because they’re corrupted by the corporations that have a vested interest in deregulation), and I also know that in these unprecedented circumstances there will be oversights and negative externalities that could well be devastating.


wastedtime32 OP t1_j8ozs4d wrote

Thank you for incorporating a class analysis into your perspective. Everyone here seems to assume that the way the world is currently constructed is NOT warped to favor certain people, when everything is affected by that. Yes, I am scared of the idea of a “utopia” run by superintelligent computers. I’m even more scared of this technology being used as a means to further extract resources from people who are not part of the ruling class. From the moment it was conceived, the world of tech was corrupted by the motivation to collect as much wealth as possible, which is in itself hierarchical and oppressive to most people. The idea that from this system can come a grand utopia which gives everybody all they desire is completely ignorant.


wastedtime32 OP t1_j8oz15b wrote

Fair assessment, but no, I was not conditioned to value those things. Quite the opposite. I have grown to become a deontological thinker. To “think like a mountain,” as Leopold put it. I see the interconnectedness of all things (scientifically, not mystically) and have not been convinced (yet) that we as humans have the capacity to override the premises of nature. I like progress. But tactical, logical, and beneficial progress. Financial incentive is at the very heart of the push for AI right now; there’s no way around it. I am not convinced that the desire for this particular future is not corrupted by the arbitrary notions of our current societal structures. The idea that this is natural progress comes from the assumption that progress as a product of market competition parallels the inevitable progress of species. I don’t think this is true. We have the capacity as humans to be self-aware. That is a gift that could mean we collectively decide to moderate our progress for the benefit of all people.

I guess what I’m getting at is, as long as these innovations are coming from massive private tech firms, I don’t trust their motives. The idea that the system we’ve created perfectly distributes money to those who best abide by the natural forces of the world is silly to me. It’s a coping mechanism for people who want to see certain changes toward a certain future without acknowledging that the world as it is today isn’t ready to morph into that future.


wastedtime32 OP t1_j8o0su7 wrote

I’ve always understood it would happen no matter what. What scares me is how fast and how suddenly it is coming. And I also think: once we become aware of a trend guided by natural forces, doesn’t our awareness of it take precedence? But people have no interest in stopping it, because we’ve created a world which rewards those who abide by those set rules, even though we know what they are and have the ability to consciously subvert them.