Submitted by EchoXResonate t3_114xv2t in singularity
Surur t1_j8yo2ed wrote
He's right though, as someone else said recently - there is only one safe solution and millions of ways to F it up.
The main consolation is that we are going to die in any case, AI or no AI, so an aligned ASI actually gives us a chance to escape that.
So my suggestion is to tell him he can't get any more dead than he will be in 70 years anyway, so he might as well bet on immortality.