
Frumpagumpus t1_j7bm6rk wrote

I think your worst-case scenario is actually the best-case scenario, but I don't think you've put much thought or justification into some of the properties you assume that scenario will have.

I harp on these points in like every other comment, but here we go again...

> hive mind

no, the importance of data locality to computation means intelligence, and especially self-awareness, will be NECESSARILY distributed. However, the extreme speedup in communication and thinking, maybe a million times faster (rough numbers below), MIGHT (probably would) mean that to humans it would look like a hive mind.
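To put rough numbers on the speedup and on why locality matters, here's a back-of-envelope sketch; the figures are my own ballpark assumptions, nothing rigorous:

```python
# Back-of-envelope sketch with assumed round numbers, comparing biological
# and electronic signaling rates, and showing why distance still matters.

NEURON_FIRE_HZ = 200       # rough upper bound on sustained neuron firing rate
CPU_CLOCK_HZ = 3e9         # a typical modern processor clock, ~3 GHz

# Ratio of clock rate to firing rate: roughly 10^7, so "a million times
# faster" is, if anything, conservative for raw switching speed.
print(f"clock vs firing rate: ~{CPU_CLOCK_HZ / NEURON_FIRE_HZ:,.0f}x")

SPEED_OF_LIGHT_M_S = 3e8    # upper bound on signal propagation
SUBJECTIVE_SECOND_S = 1e-6  # at a million-fold speedup, one subjective second
                            # is only a microsecond of wall-clock time

# Light covers ~300 m in that window, so a mind running at that speed cannot
# be tightly coupled across a planet; computation has to stay local.
print(f"light travels ~{SPEED_OF_LIGHT_M_S * SUBJECTIVE_SECOND_S:.0f} m per subjective second")
```

the point of the sketch: the faster you think, the smaller the region of space that can take part in a single tightly coupled thought, which is why I say the result is necessarily distributed rather than one literal hive mind.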

> the Ai could easily determine which minds are to be erased

my take is that posthuman intelligences will intentionally copy and erase themselves because it is convenient to do so. The human attitude toward life and death is a cultural value tied to our minds being anchored to single bodies.

my guess is that most of this copying and erasing would happen of one's own free will. Obviously, computer viruses would become analogous to a much more dangerous version of today's biological viruses. Still, if I had to bet: bad stuff would happen, but at a lower rate than bad stuff currently happens at a population level in our society (any given individual would be much less likely to die in an accident or disaster).

1