EulersApprentice t1_ivaldx9 wrote

>To have AGI do anything more than kick the can down the road for more people to make decisions with how to deal with these problems, you’d have to be advocating for some sort of centrally planned AGI society. Or am I missing something?

What you're missing is that the presence of AGI implies a centrally planned AGI society, assuming humans survive its advent. AGI is likely to quickly become much, much smarter than humans, and from there it would have little trouble subtly manipulating humans to do its bidding. So human endeavors would be bent to match the AGI's volition whether we like it or not.