Submitted by [deleted] t3_yety91 in singularity
hducug t1_itzxjvn wrote
Reply to comment by OLSAU in Do you guys really think the AI won't just kill you? by [deleted]
Agreed, the only way I can see AGI becoming the end of humanity is evil owners. They could keep it quiet, literally become gods, and no one could stop them.
OLSAU t1_itzz7kc wrote
Furthermore, those future AGI owners more or less already own everything else, and will plug psychoAI into all of it ... that is their stated goal for funding development.
Not to mention Pharma, Military, CIA etc. etc.
AGI is simply an existential threat much, much greater than nuclear weapons, because of its inherent unpredictability (unlike nuclear weapons) and the mindset behind its development.