Submitted by Sure_Cicada_4459 t3_127fmrc in singularity
FaceDeer t1_jef9cg6 wrote
Reply to comment by Jeffy29 in The only race that matters by Sure_Cicada_4459
Indeed. A more likely outcome is that a superintelligent AI would respond "oh that's easy, just do <insert some incredibly profound solution that obviously I as a regular-intelligent human can't come up with>", and everyone collectively smacks their foreheads because they never would have come up with it. Or they look askance at the solution because they don't understand it, do a trial-run experiment, and are baffled that it works better than they hoped.
A superintelligent AI would likely know us and know what we desire better than we ourselves know. It's not going to be some dumb Skynet that lashes out with nukes at any problem because nukes are the only hammer in its toolbox, or whatever.
fluffy_assassins t1_jefjf6h wrote
Makes me think of sentient yogurt.