
Calm_Bonus_6464 t1_j1snyzn wrote

But we're not just talking about AGI here, Singularity would require ASI. Not just human level intelligence, but far beyond the intelligence capabilities of all humans who have ever lived. A being that intelligent would pretty easily be able to orchestrate political takeovers, or even destroy humans if it so desired.

2

OldWorldRevival OP t1_j1splv0 wrote

When I state "singularity requires political revolution to be of maximum benefit," I mean that the political changes have to come before the singularity.

Otherwise, the benefits may be concentrated in the hands of an elite, as those with resources increasingly lose their need for the masses through automation, becoming self-sufficient in food, labor, etc.

But it could be worse, where an elite few control the AGI.

Or, lots of people become homeless, and then they're treated like homeless are now.

2

Calm_Bonus_6464 t1_j1srzke wrote

ASI does come before the singularity, and it would resolve many of those concerns. ASI has no reason to be any more benevolent to elites than to anyone else, and elites cannot control a being far more intelligent than they are. You're thinking of AGI, not ASI; both have to happen before the singularity.

0