Submitted by JamPixD t3_125sc2x in singularity
MichaelsSocks t1_je5lqfc wrote
Assuming we have a benevolent AI, of course. But if we have a misaligned ASI it could oppress or even exterminate us. Until we achieve ASI we just don't know what we'll get.
Yourbubblestink t1_je7e63q wrote
AI is being trained on internet search data and ChatGPT trials. People are horrible on the internet. Zero chance we wind up with a benevolent AI
MichaelsSocks t1_je7g9ri wrote
A true AGI would need to learn about the world beyond the internet, which is why we'll need more than LLMs for AGI
Yourbubblestink t1_je7jb7c wrote
In theory. We are already seeing leaps in LLMs that suggest there’s a lot we don’t understand
CrelbowMannschaft t1_je5yxnx wrote
Any competent caretaker AI would take steps to drastically reduce the human population, either by extermination or involuntary sterilization. We are driving as hard and fast as we can to the extinction of all life on Earth. The first order of business for any rational caretaker should be to stop the Anthropocene Extinction Event.
MichaelsSocks t1_je5zrxv wrote
Maybe, maybe not. An ASI would be capable of things we can't even begin to comprehend. Maybe we think we're on the path to life on Earth becoming extinct, but an ASI is able to find some way to prevent that while preserving humanity. The collective knowledge of every human who has ever lived is nothing compared to a superintelligent AI, so I'd be wary about those kinds of predictions.
CrelbowMannschaft t1_je64h7o wrote
We can preserve humanity while dramatically reducing our numbers. In fact, dramatically reducing our numbers is probably a necessary step in preserving us. We are an inherently self-destructive species. For my part, I refuse to reproduce.
MichaelsSocks t1_je6537v wrote
> In fact, dramatically reducing our numbers is probably a necessary step in preserving us.
How do you definitively know this? As I said, our knowledge is incredibly limited. An ASI may discover this idea to be false.
> We are an inherently self-destructive species. For my part, I refuse to reproduce.
Because for 200,000 years we have been the rulers of Earth, we've been the top dog. When ASI is achieved, that's no longer the case. We will be governed by a higher power and a much superior intelligence. Human civilization will never be the same.
CrelbowMannschaft t1_je65ctf wrote
> How do you definitively know this?
Do you know what the word "probably" means?
>Because for 200,000 years we have been the rulers of Earth, we've been the top dog. When ASI is achieved, that's no longer the case. We will be governed by a higher power and a much superior intelligence. Human civilization will never be the same.
Agreed. We have been inventing God as far back as we can know about. We're just finally getting serious about actually bringing it into existence, and finally submitting to it.
MichaelsSocks t1_je67mwe wrote
> Do you know what the word "probably" means?
And like I said, these assumptions are based on our limited scope of intelligence. An ASI with infinitely superior intelligence to ours will, to be blunt, probably view our assumptions as laughably primitive.
Rofel_Wodring t1_je69bh8 wrote
It's sort of like listening to children come up with reasons why mommy and daddy torture them with vaccines and bedtime, and then using that as evidence that their parents plan to cook and eat them.
Most Doomers, especially the 'humans are a blight on mother nature omg' types, just want to do Frankenstein/Hansel and Gretel fanfiction. Pathetic.
Rofel_Wodring t1_je68uxq wrote
Please stop projecting your total lack of imagination onto higher intellects. 'Consume, consolidate, reproduce with no regard for the outside world except for how it thwarts you' is behavior we assign to barely-intelligent vermin. Smarter animals, humans included, have motivations and strategies that go well beyond just making more copies of themselves. And this is a trend that only gets more profound the higher up the intelligence ladder you go.
There's no reason to believe, and plenty of reasons not to believe, that a superintelligence would suddenly reverse a trend we see throughout nature and revert to such simple, primitive motivations.