SIGINT_SANTA t1_j9c262c wrote

Without AI we’re left with the prospect of increasing lifespan and welfare, improving medicine, human genetic engineering, colonization of the solar system and galaxy, new knowledge, and a few trillion years of starlight left before the universe goes dark.

That sounds pretty damn good to me.

I think your gloomy attitude about climate change is also misplaced. In nearly all developed countries, carbon emissions per capita are flat or falling.

SIGINT_SANTA t1_j96eukp wrote

The scale of AI's threat of annihilation is much higher than that of nuclear weapons. And the incentives to improve AI are much stronger than the incentives to make more dangerous nuclear weapons. And the challenge of preventing AI proliferation is much harder than the challenge of preventing nuclear proliferation.

I also think it’s unlikely that a nuclear war would actually cause human extinction. It would certainly kill a ton of people (perhaps almost all). But even in the worst-case scenarios, it seems very likely that a few million people would survive in New Zealand or some other remote location.

And do you really hate capitalism so much that you would kill your family and friends to end it? Really?

SIGINT_SANTA t1_j966hxh wrote

It seems very optimistic to assume AI will mean the end of capitalism or abundance for all.

The most likely outcome seems to be that it destroys us in pursuit of some objective we gave it. If you don’t think that’s a possibility, I suggest you read what some of the people working on AI alignment have written.

But if by some absolute miracle that doesn’t happen, AI is going to be the greatest tool of power concentration we’ve ever created. Whoever controls powerful AI would basically run the world. The default outcome is that some large company or government will have their hands on the levers.
