Submitted by JamPixD t3_125sc2x in singularity
MichaelsSocks t1_je5zrxv wrote
Reply to comment by CrelbowMannschaft in Would it be a good idea for AI to govern society? by JamPixD
Maybe, maybe not. An ASI would be capable of things we can't even begin to comprehend. Maybe we think we're on the path to life on Earth becoming extinct, but an ASI could find some way to prevent that while preserving humanity. The collective knowledge of every human who has ever lived is nothing compared to a superintelligent AI, so I'd be wary of those kinds of predictions.
CrelbowMannschaft t1_je64h7o wrote
We can preserve humanity while dramatically reducing our numbers. In fact, dramatically reducing our numbers is probably a necessary step in preserving us. We are an inherently self-destructive species. For my part, I refuse to reproduce.
MichaelsSocks t1_je6537v wrote
> In fact, dramatically reducing our numbers is probably a necessary step in preserving us.
How do you definitively know this? As I said, our knowledge is incredibly limited. An ASI may discover this idea to be false.
> We are an inherently self-destructive species. For my part, I refuse to reproduce.
For 200,000 years we have been the rulers of Earth, the top dog. When ASI is achieved, that will no longer be the case. We will be governed by a higher power and a far superior intelligence. Human civilization will never be the same.
CrelbowMannschaft t1_je65ctf wrote
> How do you definitively know this?
Do you know what the word "probably" means?
>For 200,000 years we have been the rulers of Earth, the top dog. When ASI is achieved, that will no longer be the case. We will be governed by a higher power and a far superior intelligence. Human civilization will never be the same.
Agreed. We have been inventing God for as far back as we can trace. We're finally getting serious about actually bringing it into existence, and about submitting to it.
MichaelsSocks t1_je67mwe wrote
> Do you know what the word "probably" means?
And like I said, these assumptions are based on our limited scope of intelligence. An ASI with intelligence infinitely superior to ours will probably view our assumptions as, to be blunt, hopelessly naive.
Rofel_Wodring t1_je69bh8 wrote
It's sort of like listening to children come up with reasons why mommy and daddy torture them with vaccines and bedtime, and then using that as evidence that their parents plan to cook and eat them.
Most Doomers, especially the 'humans are a blight on Mother Nature, omg' types, just want to write Frankenstein/Hansel and Gretel fanfiction. Pathetic.