Submitted by ilikeover9000turtles t3_1277mm5 in singularity
genericrich t1_jee7j2r wrote
Reply to comment by [deleted] in ASI Is The Ultimate Weapon, And We Are In An Arms Race by ilikeover9000turtles
Killing humanity right away would kill the ASI too. Any ASI is going to need people to keep it powered on for quite a few years. We don't have robots that can swap servers, maintain infrastructure, operate power plants, etc.
Yet.
The danger will be that the ASI starts helping us with robotics. Once it has its robot army factory, it could self-sustain.
Of course, it could make a mistake and inadvertently kill us all before then. But it would die too, so if it's truly superintelligent, it hopefully won't.