
Frumpagumpus t1_j9akz1k wrote

I disagree with the premise. I think a human of normal intelligence in control of an egoless superintelligence is the most dangerous scenario. But I am also extremely skeptical that egoless, general superintelligence is even possible.

In fact, I would go further and say my conclusion seems obvious: using a human as the seed value for a superintelligence would, if anything, be more likely to result in a superintelligence that was "aligned" with our values (although I doubt it makes much of a difference).
