christopear t1_jeg0v77 wrote
Reply to comment by bigbeautifulsquare in This concept needs a name if it doesn't have one! AGI either leads to utopia or kills us all. by flexaplext
I'm not sure I buy this. If we built AGI once, we have the skill set to build it again - we already invented transformers.
If ASI decides to set us back by destroying all our technology and then goes off on its own, most of us would probably die of famine.
Maybe there's a version where we create an ASI but it sees us as so insignificant that it never talks to us and disappears into its own realm. But then all it takes is another actor creating another one (thanks to open source) that doesn't behave that way. So ultimately we'd be back to square one.
I just can't really imagine a world where ASI is predestined never to interact with us, so I fully believe OP's statement, though I'm potentially more pessimistic than they are.