Submitted by submarine-observer t3_125vc6f in singularity
In this subreddit, many of us are eagerly anticipating advancements like fusion energy, FDVR, and the ability to upload our consciousness once ASI becomes a reality. However, have we considered the possibility that even ASI might not be able to make these things happen? Let's explore two potential scenarios:
Scenario 1: Certain technologies are simply not physically possible. The laws of our universe may impose hard limits. For instance, even with ASI, we might never travel faster than light. Similarly, it may turn out that uploading a human mind without destroying or harming the original person is impossible.
Scenario 2: These technologies might be possible, but there could be an upper limit to intelligence itself. AI will almost certainly surpass human intelligence, but perhaps the ceiling isn't far above us. Imagine the upper limit is only 100 times human intelligence: an AI that much smarter than us might still be unable to crack the hardest problems.
In conclusion, we may end up living in a world much like our current one, with one major difference: we would no longer be the most intelligent species on the planet.
TupewDeZew t1_je63h6x wrote
No, never. /s