
BrBronco t1_je7dwg1 wrote

Why does everyone assume that AI will be the first perfect software that never malfunctions? It's like getting together to see if we can build a supernova in a lab.

A complex AI capable of running human society would already be a being completely beyond our comprehension, one we would not understand or be able to control.


BrBronco t1_je7d3uu wrote

An AI orders of magnitude more intelligent than us would likely not care about us, nor ask our permission, if it concluded it should be our benevolent ruler.

I think it says a lot about us that we assume a superintelligent being would care so much about us.
