
Trout_Shark t1_jd7gt7g wrote

I'm no rocket surgeon, but asking an AI to remove 75% of humanity seems like a bad idea. What if it decides that means removing 75% of each of us? We might just end up as a bunch of heads in jars, Futurama style. I guess that would sort of be a version of the singularity.

5

Gubekochi t1_jd7kqwj wrote

"What if the super intelligence is actually stupider than ChatGPT currently is?"

For real? That's your concern?

2

just-a-dreamer- t1_jd7j7om wrote

You could make the majority of humans sterile. That works itself out within a few years.

1

Mercurionio t1_jd7jvfc wrote

And what's the point? We are basically developing the tech to kill ourselves. And it's happening right now.

PS: it was a rhetorical question

1

Trout_Shark t1_jd7k08m wrote

No one would survive that scenario. This thought experiment has gone dark really quickly.

1

Marshall_Lawson t1_jd7qpae wrote

what OP posted was pretty fucking dark from the beginning

2

Trout_Shark t1_jd7rpnq wrote

Agreed. Just casually killing or sterilizing 6 billion people didn't seem like a bad idea to OP. That's pretty fucked up.

2

Marshall_Lawson t1_jd7savs wrote

reminds me of Britta from Community lol

"I can excuse murdering 3/4 of the population of humanity, but I draw the line at forced sterilization!"

2

Trout_Shark t1_jd7tf5d wrote

LOL. I'm guessing it was a young kid. They deleted the post as soon as it went south on them. AI in the hands of morons is not something I'm looking forward to.

1