MuonManLaserJab t1_j9udmcp wrote

> EY tends to go straight to superintelligent AI robots making you their slave.

First, I don't think he has ever said they will make us slaves, except perhaps as a joke at the expense of people who assume the AI would care about us, or need us, enough to enslave us.

Second, I am frustrated that you seem to think only the short-term threats matter. Which is the more short-term threat: nuclear contamination from the destruction of the Zaporizhzhia nuclear power plant in Ukraine, or all-out nuclear war? Contamination is more likely, but it would still be foolish to ignore the more distant yet incomparably more catastrophic outcome of nuclear war. Why can't you be worried about short-term AI issues while also acknowledging the somewhat longer-term risk of superintelligent AI?

This attitude is depressingly typical, and unfortunately it's not at all surprising to see it as the top comment here.
