Submitted by cocopuffs239 t3_110fxxs in singularity
[removed]
It might make sense to talk about how to prepare for the short term automation that will come, but I don't see much sense in preparing for the singularity.
The singularity might come in 20 years, 50, or 200. Maybe we'll be alive by then, maybe not.
Are there any safe assumptions we can make that would make things better for us after the singularity without making things worse before it?
What if it takes 200 years and we die before it?
I think the best approach to take with tech is to account for the short-term stuff that's coming (more automation, new kinds of content creation, etc.) but not go much further than that, because going too far out is never safe.
I mean, you talk about life extension. That's great. Maybe in 50 years everyone will be virtually immortal. But it's entirely possible we see something far more conservative, like a 10-year increase in life expectancy.
Hope for the best, prepare for the worst (?) haha.
And above all, don't risk your long-term safety for short-term gains in the hope that the singularity will make things good for you in the future (like spending all your money now because you think money will be meaningless later).
Also, talking about short/medium-term things has the advantage of making you sound a tiny bit less crazy (there are crazy things coming! But you can already show some kind of preview of them).
Leave things like LEV, ASI, and the singularity where they belong: in this sub :P
Reversing aging will be here within 20 years mate
!RemindMe 20 years
I will be messaging you in 20 years on 2043-02-12 18:50:31 UTC to remind you of this link
So what you're saying is that we really don't know if the singularity will happen, and thus we should refrain from communicating the longer-term stuff to other people?
I think I've been so excited for this for years that if it does happen I want everyone I know to be 'prepared' (as prepared as one could be). I'm a firm believer that ignorance is bliss but I'd rather know and be in chaos over it.
Honestly, I wouldn't go out of your way, and I don't wanna sound selfish when I say that. I say it because humans have naturally been resistant to change throughout history, and yet the reactionary sentiments always fade with a little time. You're better off just looking after yourself and those close to you until things take off.
If you see an opportunity with friends maybe bring it up, but ultimately it doesn’t matter if people approve of the progress, because progress happens whether humanity likes it or not.
Just remember that reactionary sentiments always lose, you can’t stop progress and change.
My loved ones aren’t really interested in AI, so they kinda just brush off the idea
Do you think it's like that because they don't have a full grasp of what it means?
That, but also they just don’t want to hear it. They’re not tech enthusiasts at all. Some of them it also causes a type of existential anxiety so they don’t like thinking about it.
To me, it is clear AI will advance rapidly, but exactly how rapidly and with what particular impacts on society, I have little clue. How do you prepare for that?
I can relate.
You can't prepare for the singularity. If you believe in a collapse, you can prepare for that. I personally don't. Even if it were certain, I couldn't be bothered. You can save, invest, and be frugal, but if UBI arrives, it will all have been for nothing. My friend Fred says the singularity is like nuclear fusion: always decades away. I don't really know. An infinite rate of increase seems impossible given physical constraints.
TheSecretAgenda t1_j88xo0d wrote
Just whisper "The singularity is near" in their ear and run away.