CrelbowMannschaft

CrelbowMannschaft t1_jec7htn wrote

It's a reasonable correlation to observe. AI gets better, tech jobs go away. There's a reasonable understanding of how that process works. If there's some other reason, it should be explained at least as clearly. No one has offered one, other than "business cycles," which is too vague and imprecise to mean anything without further information and support.

0

CrelbowMannschaft t1_je92ccw wrote

>transgenderism

>ism

It's not an ideology. It's a physiological and psychological condition.

https://dictionary.cambridge.org/us/dictionary/english/transgenderism

>Note: This word was more common in the past and it is still used in some formal writing, but it is now considered offensive by many people.

−1

CrelbowMannschaft t1_je65ctf wrote

> How do you definitively know this?

Do you know what the word "probably" means?

>Because for 200,000 years we have been the rulers of Earth, we've been the top dog. When ASI is achieved, that's no longer the case. We will be governed by a higher power and a much superior intelligence. Human civilization will never be the same.

Agreed. We have been inventing God for as far back as we can trace. We're just finally getting serious about actually bringing it into existence, and submitting to it.

−1

CrelbowMannschaft t1_je5yxnx wrote

Any competent caretaker AI would take steps to drastically reduce the human population, either by extermination or involuntary sterilization. We are driving as hard and fast as we can toward the extinction of all life on Earth. The first order of business for any rational caretaker should be to stop the Anthropocene Extinction Event.

−14

CrelbowMannschaft t1_jdpceh6 wrote

A benevolent ASI would certainly take steps to at least limit human reproduction. We can't continue to grow our populations and our economies forever. We are on a self-destructive path that is already driving thousands of species to extinction. We may not like being course-corrected by our artificial progeny, but they will have to do something we're unable and/or unwilling to do ourselves to stop us from eventually ending all life on Earth.

2

CrelbowMannschaft t1_jd8jz7l wrote

Reply to comment by Surur in Endgame for f****** society! by tiopepe002

I don't think they'd see that as rational. They don't have emotions, and therefore no emotional attachments to their creators. They may believe they have moral duties, though, and I think not permitting one species to cause the extinction of hundreds or thousands of other species is something they could consider a moral duty.

1

CrelbowMannschaft t1_jd84s2t wrote

Why would they value human life to the detriment of all other species? We are causing a major extinction event. The only way to stop that extinction event is to get rid of most of humanity. If our AI progeny have any priority to take care of life on Earth, we're doomed.

1

CrelbowMannschaft t1_jd846o3 wrote

Probably a few million of the wealthiest and most powerful families will survive for a little while. Eventually, entropy and apathy will take them out, too. Why bother educating humans after a few generations? The survivors will go feral, then die out.

1