Fhbob1988 t1_j9n1udj wrote
Reply to comment by just-a-dreamer- in Why the development of artificial general intelligence could be the most dangerous new arms race since nuclear weapons by jamesj
Nukes require multiple humans to make multiple extremely difficult decisions to end the world. An ASI could make that decision entirely on its own, and we have no idea whether it would even be a difficult choice for it.