Submitted by AdorableBackground83 t3_11db8lk in singularity
CertainMiddle2382 t1_ja80jd2 wrote
Asking that in a Singularity channel is a bit ironic :-)
The very meaning of Singularity is that soon, something will come and make the future world incomprehensible.
I'm very biased, and I can't find a way for AGI not to "quickly" devolve into computronium + von Neumann probes + hopefully a virtual existence for us somewhere in that mess lol
IMO, computronium maximization is the natural endpoint of self-improving AI, and steering toward anything different will require a huge (insurmountable?) effort.