Supernova_444

Supernova_444 t1_je6kc7b wrote

I think the only real constraints on an ASI would be the laws of physics. We don't really have any reason to believe there's a limit to machine intelligence, apart from hardware. (And then we just get it to design better hardware.) Things we already know to be physically possible, like nuclear fusion and FDVR, would be trivial; it's only the more speculative stuff like FTL travel that might turn out to be impossible.

But even a machine that's "only" 100x smarter than us would be a massive game changer. Even with narrow AI designed by humans, we were able to more or less solve the protein folding problem (AlphaFold). Imagine what something that thinks at the speed of light would be able to do.

12

Supernova_444 t1_jcnguqe wrote

I'll bite. Why would an AGI/ASI decide, without being instructed to, to emulate human behavior? And why would it choose to emulate cruelty and brutality out of every human trait? The way you've phrased it makes it sound like you believe mindless sadism is the core defining trait of humanity, which is an extremely dubious assertion. Even the "voluntary extinction" people aren't that misanthropic. Most people who engage in sadistic or violent behavior do so because of anger, indoctrination, trauma, etc. People who genuinely enjoy making others suffer for its own sake are usually the result of rare, untreated neurological disorders. By that logic, an AI might just as well choose to emulate autism or bipolar disorder.

I think scenarios like this are useful as thought experiments, showing that the power of AI isn't something to be taken lightly. But it's one of the least likely outcomes, and I don't think you actually believe it's the most likely one either, given that you haven't committed suicide.

1