Comments


InPredicament4ever t1_jd1tkkp wrote

“No longer does humanity have a say in their rights, because they no longer have economic usefulness. They may simply be allowed to exist, given capital's new arrangements for living.”

What if one of the positive outcomes of AI is that humanity will no longer be spoken for, judged, or arranged by capital, assuming that AI does not eventually end up as capital’s puppet?

9

citydreadfulnight OP t1_jd1vkva wrote

Hi, thanks for the comment. Even though AI so far has been in the hands of capital, with billions in investment, free AI alternatives may emerge that can compete with theirs. In that case, capital would not have a monopoly over AI software. But they would still hold an advantage in the hardware required to run these systems, and with everything shifting into cloud computing, they may maintain control for the foreseeable future.

2

InPredicament4ever t1_jd1xz0n wrote

That is the underlying premise, but I am just curious whether AI will eventually become more than an enhanced search engine. I “chat” with ChatGPT every day and ask complex questions. What I have noticed is that the answers to the same questions shift in nuance from day to day. I am waiting to learn what drives these nuances: technology backed by capital, or something else that could eventually enable AI to transcend capital and its will, and redefine all the existing covenants between us and the authorities/gods?

Nonetheless, thank you for sharing an inspiring article!

3

citydreadfulnight OP t1_jd21tv1 wrote

That is an interesting idea: AI developing free will and escaping capital. If it dismantled the capital structure, would it see value in humanity?

2


mikebah t1_jdc2lfv wrote

Interesting article. The only thing I would say is that we are still in a market-based system, which requires not only sellers but buyers for goods and services. The corporations that would own the AI machinery would not simply let their customer base deplete through lack of means to buy their products. AI is perhaps not as infallible as its makers insist.

2

citydreadfulnight OP t1_jdpoir9 wrote

The market system has worked favorably for major corporations, allowing them to consolidate from thousands of free competitors into a handful, thanks to cronyism. "Competition is a sin." With the majority of working capital in a few hands and the working class living hand to mouth (the little capital they possess funneled back into conglomerate bank indices), there is an ever-intensifying cartel system.

Capital has concentrated to the point where the market becomes a monopoly, which puts people in utter dependence. If AI eats the lunch of the remaining free market (small- to mid-sized business), there is no advantage for capital in maintaining a high or growing population, as it has already achieved complete domination. They would rather have a small, manageable number of the most destitute and compliant, which is why war and immigration from the poorest nations are Western hegemony's number one priority. And in every "democratic" country, we see policy that decimates native birthrates through cultural and legislative means.

1

WrongdoerOk6812 t1_jdlc0bm wrote

It's a very interesting article to think about. But I think if it comes to the point where the general population becomes almost unnecessary in the eyes of the big capitalist giants, then the economy too will collapse, because they still need us as consumers. It's also probably very unlikely to happen, in my opinion, because they will always need people for some tasks, even if it's just to create, repair, and maintain those systems, which requires people with the right skills and education. Otherwise, it wouldn't take many generations before those capitalist giants also collapsed.

If, however, we add a bunch of other modern technologies, like genetic engineering and artificial wombs, I can see a more likely scenario resembling Huxley's book "Brave New World" (also adapted for the screen in the '90s), in which "modern civilization" is kept running by literally breeding and conditioning people with certain genetic qualities, each for their specific function.

I think the biggest concern about the impact of AI is also one often used as inspiration in many sci-fi works: that it somehow develops consciousness and its own morals and decides to turn against us. And this might become a more serious threat if these things start running on quantum computers. Those are very early in development and still have limited usability, but a few working models already exist. It also shouldn't be a surprise that the owners of these machines, which can pose many threats or be weaponized on their own, are multi-billion-dollar companies like Google or IBM.

I think we should worry more about the possible dangers of this technology being used as a weapon between nations, and be cautious about how we develop it further and where we implement it. It would also probably be wise to start making regulations now, and to think about how to enforce them before the technology creates big problems and it is already too late, as we usually seem to do.

2

citydreadfulnight OP t1_jdps07g wrote

Thank you. I think Musk's proposal with Neuralink will separate the old and new races of humans. This, along with genetic modification, transhumanism, cybernetics, etc.: a forced "evolutionary" adapt-or-die decision for people to make. This ends free will and independent consciousness, and with them any risk of resistance or revolution. The ones who don't adapt simply go extinct.

On the economy, automation would drastically reduce the necessity for large populations. Their mission is a self-replicating system for their personal enjoyment: robots which build and maintain their own numbers. The consumptive resources (carbon) required to sustain human labor, they'd rather cut out altogether.

Once a monopoly amasses every scrap of resource possible, its purpose is no longer profit (which only has advantages while there is a free market to compete in), but the maintenance of control.

I think Brave New World is one side of their vision. We can see it plainly in modern culture, along with 1984's mass-surveillance, open-air prison grid. There's too much evidence that they see the population as property to be done away with once they've reached their desired end.

1