Submitted by submarine-observer t3_125vc6f in singularity
In this subreddit, many of us are eagerly anticipating advancements like fusion energy, FDVR, and the ability to upload our consciousness once ASI becomes a reality. However, have we considered the possibility that even ASI might not be able to make these things happen? Let's explore two potential scenarios:
Scenario 1: Certain technologies are simply not physically possible. There may be limitations imposed by the laws of our universe. For instance, even with ASI, we might not be able to travel faster than light. Similarly, it could be the case that uploading a human mind without causing harm to the individual is impossible.
Scenario 2: While these technologies might be possible, there could be an upper limit to intelligence. There's no doubt that AI will eventually surpass human intelligence, but perhaps our intelligence isn't as impressive as we think. Imagine if the upper limit of intelligence is only 100 times greater than human intelligence. An AI 100 times smarter than us might still be unable to solve the most complex problems.
In conclusion, we may find ourselves living in a world that resembles our current reality, with the only major difference being that we are no longer the most intelligent species on the planet.
Supernova_444 t1_je6kc7b wrote
I think the only real constraints on an ASI would be what's physically possible. We don't really have any reason to think there's a limit to machine intelligence besides hardware (and then we just get it to design better hardware). Stuff we already know is physically possible, like nuclear fusion and FDVR, would be trivial; it's only the more theoretical stuff like FTL travel that might turn out to be impossible.
But even a machine that's "only" 100x smarter than us would be a massive game changer. Even with narrow AI designed by humans, we were able to more or less solve the protein folding problem. Imagine what something that thinks at the speed of light would be able to do.