Fhbob1988 t1_j9n1udj wrote
Reply to comment by just-a-dreamer- in Why the development of artificial general intelligence could be the most dangerous new arms race since nuclear weapons by jamesj
Nukes require multiple humans to make multiple extremely difficult decisions to end the world. An ASI could make that decision entirely on its own, and we have no idea whether it would even be a difficult choice for it.
Fhbob1988 t1_j9n2d1z wrote
Reply to comment by just-a-dreamer- in Why the development of artificial general intelligence could be the most dangerous new arms race since nuclear weapons by jamesj
That’s exactly why AI is more dangerous: your own argument in favor of AI is also the argument for why it’s the greater threat. Mutually assured destruction has kept humanity safe because the other side knows they can’t press the button without killing themselves too. An ASI could kill us all without a second thought if it isn’t aligned properly.