Comments


TheSecretAgenda t1_j88xo0d wrote

Just whisper "The singularity is near" in their ear and run away.

8

ElvinRath t1_j88rexo wrote

It might make sense to talk about how to prepare for the short-term automation that's coming, but I don't see much sense in preparing for the singularity.

The singularity might come in 20 years, 50, or 200. Maybe we'll be alive at that time, maybe not.
Are there any safe assumptions we can make that would make things better for us after the singularity without making things worse before it?

What if it takes 200 years and we die before it?
I think the best approach with tech is to take into account the short-term tech that's coming (more automation, new kinds of content creation, etc.) but not go too far, because going too far is never safe.

I mean, you talk about life extension. That's great. Maybe in 50 years everyone will be virtually immortal. But it's entirely possible we see something far more conservative, like a 10-year increase in life expectancy.

Hope for the best, prepare for the worst (?) haha.

And above all, don't risk your long-term safety for short-term gains, hoping the singularity will make things good for you in the future (like spending all your money now because you think money will be meaningless later).

Also, talking about short/medium-term things has the advantage of making you sound a tiny bit less crazy (there are crazy things coming! But you can already show some kind of preview of those crazy things).

Leave things like LEV, ASI, and the Singularity where they belong: in this sub :P

5

kaiww77 t1_j88wntw wrote

Reversing aging will be here within 20 years, mate

7

cocopuffs239 OP t1_j891q8h wrote

So what you're saying is that we really don't know whether the singularity will happen, and thus we should refrain from communicating this (the longer-term stuff) to other people?

I've been so excited about this for years that if it does happen, I want everyone I know to be 'prepared' (as prepared as one can be). I'm a firm believer that ignorance is bliss, but I'd rather know and be in chaos over it.

1

HeinrichTheWolf_17 t1_j8970vf wrote

Honestly, I wouldn't go out of my way, and I don't want to sound selfish when I say that. Humans have naturally been resistant to change throughout history, and yet the reactionary sentiments always fade with a little time. You're better off just looking after yourself and those close to you until things take off.

If you see an opportunity with friends, maybe bring it up, but ultimately it doesn't matter whether people approve of the progress, because progress happens whether humanity likes it or not.

Just remember that reactionary sentiments always lose; you can't stop progress and change.

5

AvgAIbot t1_j89jzzo wrote

My loved ones aren't really interested in AI, so they kinda just brush off the idea.

5

cocopuffs239 OP t1_j89lsk1 wrote

Do you think it's like that because they don't have a full grasp of what it means?

3

AvgAIbot t1_j89nzcn wrote

That, but also they just don't want to hear it. They're not tech enthusiasts at all. For some of them it also causes a kind of existential anxiety, so they don't like thinking about it.

2

Kule7 t1_j891l4a wrote

To me, it's clear that AI will advance rapidly, but exactly how rapidly, and with what particular impacts on society, I have little clue. How do you prepare for that?

3

LowSalad t1_j89otl8 wrote

I can relate.

2

No_Ninja3309_NoNoYes t1_j89s90r wrote

You can't prepare for the singularity. If you believe in a collapse, you can prepare for that; I personally don't. Even if it were certain, I couldn't be bothered. You can save, invest, and be frugal, but if UBI arrives, it will all have been for nothing. My friend Fred says the singularity is like nuclear fusion: always decades away. I don't really know. An infinite rate of increase seems impossible because of physical constraints.

2