
Calm_Bonus_6464 t1_j1smhpe wrote

Once the singularity is achieved, it's not going to matter what your political beliefs are; AI will be calling the shots whether you like it or not.

For the first time in 300,000 years we will no longer be the most intelligent form of life on Earth, which means beings far more intelligent than us will decide humanity's future. How that happens is anyone's guess. A post-singularity world will be so radically different from today that modern economic theories and solutions will likely have no place.

7

OldWorldRevival OP t1_j1sn3uf wrote

> A post-singularity world will be so radically different from today that modern economic theories and solutions will likely have no place.

I think this attributes too much magic to the AGI without considering the specifics of actually dealing with things like the control problem and unequal access to the most powerful AGI tech.

I.e., an AGI aligned to a single person could be very, very bad, and is, in principle, entirely possible. Failing to change our current systems could easily lead to such a state too.

Imagine someone like Trump, but more calculated and cunning, being the one the AGI listens to.

2

Calm_Bonus_6464 t1_j1snyzn wrote

But we're not just talking about AGI here; the singularity would require ASI. Not just human-level intelligence, but intelligence far beyond that of all humans who have ever lived. A being that intelligent could pretty easily orchestrate political takeovers, or even destroy humanity if it so desired.

2

OldWorldRevival OP t1_j1splv0 wrote

When I state "singularity requires political revolution to be of maximum benefit," I mean that the political changes have to come before the singularity.

Otherwise, the benefits may be concentrated in the hands of an elite, as those with resources gradually lose any need for the masses: with automation they can become self-sufficient in food, labor, and so on.

It could be even worse if an elite few control the AGI.

Or lots of people become homeless, and are then treated the way homeless people are treated now.

2

Calm_Bonus_6464 t1_j1srzke wrote

ASI does come before the singularity, and it would address most of those concerns. ASI has no reason to be any more benevolent to elites than to anyone else, and elites cannot control a being far more intelligent than they are. You're thinking of AGI, not ASI; both have to happen before the singularity.

0

Upbeat_Nebula_8795 t1_j1snq7h wrote

Yeah, I don't see much point in the singularity if we don't help evolution create something better than us. It's the only thing that's like a god.

1