Submitted by [deleted] t3_yety91 in singularity
OLSAU t1_itzwmwi wrote
I have thought about it for a while, and I believe the real problem with AGI is its owners.
Imagine psychopaths creating a thinking machine and plugging it into everything ... Dystopia is written all over that!
After that, at best, humanity becomes meat-puppets for the AI; alternatively, we're just eradicated like vermin.
hducug t1_itzxjvn wrote
Agreed, the only way I can see AGI becoming the end of humanity is evil owners. They can keep it quiet, literally become gods, and no one can stop them.
OLSAU t1_itzz7kc wrote
Furthermore, those future AGI owners more or less already own everything else, and will plug psychoAI into that ... that is their stated goal for funding development.
Not to mention Pharma, the Military, the CIA, etc.
AGI is simply an existential threat much, much greater than nuclear weapons, because of its inherent unpredictability, which nuclear weapons lack, and the mindset behind its development.