
Nous_AI t1_jea6plh wrote

If we completely disregarded ethics, I believe we would have passed the point of Singularity already. The rate at which we get there is of little importance. Consciousness is the most powerful force in the universe, and I believe we are being reckless, far more reckless than we ever were with nuclear power. You fail to see the ramifications.

3

CertainMiddle2382 t1_jeb6e8i wrote

We are all mortals anyway.

What is the worst-case scenario?

The Singularity starts and turns the whole universe into computronium?

If it’s just that, so be it.

Maybe it will be thankful and build a nice new universe for us afterwards…

1

BigZaddyZ3 t1_jebbwqs wrote

Not everyone has so little appreciation for their own life and the lives of others, luckily. If you're suicidal and want to gamble with your own life, go for it. But don't project your death wish onto everyone else, buddy.

1

iakov_transhumanist t1_jebk8mu wrote

We will die of aging if no intelligence solves aging.

3

BigZaddyZ3 t1_jebkngn wrote

Some of us will die of aging, you mean. Also, there's no guarantee that we actually need a superintelligent AI to help us with that.

2