Comments


Sandbar101 t1_j90044c wrote

Well the good news is that when the singularity is achieved, politics and climate change will no longer be a problem.

Most of us are optimistic about the development of the singularity. Assuming we survive to see it, AI will create what’s called a post-scarcity economy. Effectively utopian living standards.

I won’t deny it’s scary. And when you put it in perspective, it makes the next 20 years seem kind of terrifying, and it could very well end up being the death of us all.

But take pride in that. Thousands of generations of human history and you happen to be living in possibly the most important one ever. The one where the human race ends, or ascends to something incredible that makes Star Trek look primitive.

Live your childhood. Be proud of yourself and keep up with understanding the potential of AI. Listen to some Lex Fridman podcasts; he talks about it a lot with people developing it in the field.

The point is, until the world stops, it’s going to keep going. So should you.

23

TheSecretAgenda t1_j900url wrote

Calm down. No one knows what is going to happen. The fact that you are aware puts you way ahead of most other people. Pay attention, be flexible, look for career opportunities that will be hard to automate like good paying trades. Relax, everything will be alright.

7

Sol_Hando t1_j900vcp wrote

Worrying about something as theoretical as the singularity is a waste of your mental effort. It’s a fun topic to ponder, and interesting to see what people say about it. The reality is nobody here has any knowledge about a technological singularity, whether it is likely or even possible. Worrying about it is akin to worrying about an alien invasion. Possible, but completely theoretical.

7

PandaCommando69 t1_j90avkm wrote

Read the Culture novels by Iain M. Banks. You'll (probably) feel better. I personally think things are going to turn out alright (though the ride might be bumpy for a bit). You're living in a moment in time that our ancestors couldn't even have dreamed of in their wildest imaginations. It's really extraordinary if you stop to think about it for a minute. If things go right it means cures for all disease, the end of aging, limitless energy, new exotic materials for every conceivable purpose, true morphological freedom, full dive VR, and on and on. We are on the cusp of the ascension of humanity into something so much more. Keep your fingers crossed kiddo, and try not to worry too much in the meantime.

7

Wroisu t1_j8zx6o6 wrote

Why is it scary? If things go right you’ll have the free-time to learn and do as you please, with the resources to do them to their fullest extent.

In the meantime, learn all you can about these topics and how they relate to things as a whole, like medicine, climate change, civil rights, politics etc.

If you want books to read about these things, I recommend Look to Windward or The Player of Games. They delve into what a post-singularity society might look like (under ideal circumstances).

2

BigZaddyZ3 t1_j8zzpfs wrote

No, you’re confusing post-scarcity with the singularity. Post-scarcity (if it ever happens) would occur before the singularity. The singularity is the point where technology begins to evolve itself so rapidly that humans can no longer control it. Life on Earth will be forever transformed in ways that are unimaginable to the human mind. It most likely signals the end of the “human dominance” era on Earth.

The good news for OP tho, is that, since no one knows what’ll happen to humanity after that point, there’s no point in stressing over it too much.

0

Wroisu t1_j8zzz3h wrote

I’m not confusing them; I know my definitions. I specified post-singularity because the books I recommended are premised on humanoids living in a symbiotic relationship with hyper-intelligent artificial intelligences called Minds.

Post-singularity implies post-scarcity, which we’ve already reached in some respects (like food); we just don’t distribute it properly.

2

BigZaddyZ3 t1_j90081t wrote

Any book claiming to know what happens post-singularity is illegitimate and just mindlessly speculating at best tbh.

−5

Wroisu t1_j900ex1 wrote

It’s not claiming to know; it’s doing what any good science fiction does: extrapolating what we know to logical conclusions to create interesting narratives and comment on the current social, technological & political climate, etc.

The Culture novels are known for that, don’t knock it until you’ve read it.

5

BigZaddyZ3 t1_j9016fv wrote

The entire point of the singularity is that all of our current knowledge and logic will have long been rendered irrelevant at that point. Technological progression would have long surpassed human comprehension. That’s the entire point. Humans today can’t comprehend what comes after the singularity. Do you see the problem with “extrapolating” our current understanding in this scenario?

Also do you really think it’s wise to base your understanding of such a complex topic on a clearly fictional novel made most likely for entertainment purposes?

−3

Wroisu t1_j901zs1 wrote

The point of the novel(s) is to explore those complex topics, I’m not saying that that’s what it’ll be like but that it gives a perspective on what it could be like.

Similar to Star Trek and its commentary on capitalism, or The Three-Body Problem and its explanation for the Fermi paradox, and so on.

As far as technology beyond our comprehension goes, that technology, as high and mighty as it may be, will still be based on physical principles we know of.

And even the technology that’s born out of principles we’ve yet to discover will come out of the unification of things we already know, like general relativity and quantum mechanics.

You could create extremely hard materials by manipulating the strong nuclear force over large distances. This would be extremely exotic by our standards but not outside the realm of possibility. Stuff like that is what the singularity would allow. Is it impossible to comprehend? Not really.

3

BigZaddyZ3 t1_j902rqs wrote

There’s still a lot that we don’t know about the universe tho… and you’re assuming that there’s no way to change or alter the physical principles we know of, either. Say a superintelligent system were able to develop a weapon that could alter Earth’s gravitational pull. Suddenly the current laws of physics go out the window. You’re thinking too small. Like I said, there’s still a lot that we don’t understand about the universe. Thinking the singularity will be “business as usual” is what happens when you try to base your understanding of it off fictional novels…

−1

Wroisu t1_j9037wv wrote

For the Earth, the gravitational binding energy is about 2x10^32 Joules, or roughly a week of the Sun's total energy output, Mr. Big Thinker.
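If anyone wants to check that figure, here's a rough back-of-the-envelope sketch. It uses the uniform-density-sphere approximation U = 3GM²/(5R) (the real figure is a bit higher because Earth's density is concentrated toward the core) and standard values for Earth's mass and radius and the Sun's luminosity:

```python
# Back-of-the-envelope: Earth's gravitational binding energy
# versus the Sun's total power output.
# Uniform-density sphere approximation: U = 3 * G * M^2 / (5 * R).

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # Earth's mass, kg
R_EARTH = 6.371e6    # Earth's mean radius, m
L_SUN = 3.828e26     # Sun's luminosity, W

binding_energy = 3 * G * M_EARTH**2 / (5 * R_EARTH)   # joules
days_of_sunlight = binding_energy / L_SUN / 86_400    # 86,400 s per day

print(f"U = {binding_energy:.2e} J")                  # ~2.2e32 J
print(f"= {days_of_sunlight:.1f} days of total solar output")  # ~6.8 days
```

So the order of magnitude holds up: unbinding a planet costs roughly a week of an entire star's output, which is the point being made about the scale of infrastructure involved.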

There’s no way an AI would randomly be able to control that amount of energy without us knowing of the mechanisms used to control such energy, let alone seeing the structures built to move that energy around in a useful way.

Not understanding how physics works & thinking that AI will suddenly rewrite it one day is what you get when you browse an echo chamber for your information on such things.

2

BigZaddyZ3 t1_j903o4e wrote

>>There’s no way an AI would randomly be able to control that amount of energy without us knowing of the mechanisms used to control such energy, let alone seeing the structures built to move that energy around in a useful way.

Why not? Are you dumb enough to assume AGI will never surpass human cognitive abilities? Please tell me you’re not that stupid…

1

Wroisu t1_j9042i4 wrote

Cognitive ability doesn’t translate to immediate R&D. You could think up a trillion ways to do something, each better than the last, but for every iteration of your idea you still have to build the equipment that actually does the thing you want to research.

That doesn’t mean that it won’t be quick, but that these things aren’t magic - as you seem to be suggesting immense intellect would be.

Eventually you get to the point where Arthur C. Clarke’s “any sufficiently advanced technology is indistinguishable from magic” holds true, but that doesn’t happen overnight.

1

BigZaddyZ3 t1_j904qxu wrote

It does happen overnight in a technological singularity tho. That’s why it’s also sometimes referred to as the “intelligence explosion”.

1

Iffykindofguy t1_j902h8k wrote

LOL at you claiming there’s a hard timeline to any of this, much less something as absurd as post-scarcity being a requirement for a singularity

1

BigZaddyZ3 t1_j903fni wrote

I didn’t give a hard timeline tho… A hard timeline would be me giving specific dates and shit. I didn’t. You seriously need to improve your reading comprehension skills bruh.

It’s just pretty much universally agreed on by actual experts that if we ever achieve post-scarcity, it’ll happen before any singularity occurs. No other order even makes sense. There’s no guarantee humans will even still be around post-singularity. And the singularity isn’t even needed to reach post-scarcity. So do the math there, genius…

0

turnip_burrito t1_j9044d9 wrote

The singularity can happen before post-scarcity (and it’s looking like it will). It may even cause post-scarcity.

3

BigZaddyZ3 t1_j904h6v wrote

I don’t agree because we’ll more than likely reach the level of AI needed for post-scarcity before we reach the level needed for a singularity to occur.

0

turnip_burrito t1_j90508k wrote

I guess it depends on how quick the takeoff is. When do you think we'll see AGI?

2

BigZaddyZ3 t1_j9058y6 wrote

In my opinion, it’d be foolish to try and pin it to an exact date. But I’d say we’re on a path to reach it sometime in the 2040s.

−1

Iffykindofguy t1_j903mrg wrote

Please provide this ample evidence by experts explicitly stating that post scarcity would occur before a singularity by requirement.

1

BigZaddyZ3 t1_j90473x wrote

Lmao do you actually think I care what you think enough to go through the trouble of doing that? 😂😂Fuck off, I’m literally about to go to bed. I’m not gonna write a fucking research essay for you. Go do your own research if you care that much.

−1

tms102 t1_j90a4ws wrote

It is clear you don't know what you're talking about.

3

BigZaddyZ3 t1_j90au3a wrote

>>The first person to use the concept of a "singularity" in the technological context was John von Neumann.[5] Stanislaw Ulam reports a 1958 discussion with von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". [6] Subsequent authors have echoed this viewpoint.[3][7]

>>The concept and the term "singularity" were popularized by Vernor Vinge first in 1983 in an article that claimed that once humans create intelligences greater than their own, there will be a technological and social transition similar in some sense to "the knotted space-time at the center of a black hole",[8] and later in his 1993 essay The Coming Technological Singularity,[4][7] in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.

>> Some scientists, including Stephen Hawking, have expressed concern that artificial superintelligence (ASI) could result in human extinction.

>>The other prominent prophet of the Singularity is Ray Kurzweil. In his book The Singularity is Near, Kurzweil basically agrees with Vinge but believes the latter has been too optimistic in his view of technological progress. Kurzweil believes that by the year 2045 we will experience the greatest technological singularity in the history of mankind: the kind that could, in just a few years, overturn the institutes and pillars of society and completely change the way we view ourselves as human beings.

>>The technological singularity—or simply the singularity[1]—is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.

You were saying? How exactly can we achieve a post-scarcity human society after the singularity when the most prominent proponents of the singularity believe we won’t even be able to control technology by that point, and that it will mark the end of the human era in one way or another? Use your fucking brain for fuck’s sake..

0

Iffykindofguy t1_j91isdm wrote

HAHAHAHAHAHAHAHAHA

0

BigZaddyZ3 t1_j92zsmz wrote

The exact type of low-IQ response I’ve come to expect from you tbh..

1

Iffykindofguy t1_j936bg5 wrote

It’s funny because I don’t think we actually disagree that much. I’m just annoyed by how certain you seem to be that these things will happen.

1

BigZaddyZ3 t1_j938yb6 wrote

Well, in my defense, I’m just giving my opinion based on everything I’ve learned about the subject over the years. Just like we all do in this sub all the time. It’s not a crime to be confident in your opinion. And from the conversations we’ve had so far, you aren’t that much different when it comes to that.

But yeah, I was only giving my take on how things are likely to unfold. I wasn’t saying it was a 100% guarantee. If that’s what you thought then I see where some of the tension and confusion stem from. I wasn’t trying to say that it was an undeniable certainty. Just that what I described seems most likely to occur (imo).

1

Iffykindofguy t1_j93ufu8 wrote

We all do it, yes. My point is that how you communicate it is the problem. It comes off as though your word is law.

1

Iffykindofguy t1_j902crv wrote

Humans adapt; we’ll figure it out. A) Who knows if it will happen. B) Who knows if whatever disaster you’re imagining is what actually happens. C) It’s normal to feel this way. The world does look fucking insane, so don’t take me saying this stuff as trying to invalidate your feelings, but they’re probably running a bit high right now.

2

prolaspe_king t1_j902hnt wrote

This reads like an adult trying to role-play as a teenager.

2

love0_0all t1_j903hc1 wrote

It's not good to focus on what you can't control. This is a mystery, and it's starting to get good. Stick around and find out what happens.

2

SnooHabits1237 t1_j909esw wrote

There’s always a point in living. Don’t overthink it like that

2

magnetar_industries t1_j9018bc wrote

The best thing to happen is if your world view _does_ fall apart. Because at least then you will be able to consciously _choose_ the factors that will go into building your next worldview. Be sure to give yourself enough time and space and compassion to allow this process to unfold.

The dominant worldviews in practice today are woefully inadequate to the predicaments we are currently facing. It’s the problem Einstein pointed to: “We can’t solve problems by using the same kind of thinking we used when we created them.” In other words, we can’t solve the problems caused by our current worldview using only the types of thinking our existing worldview can generate. We need a new _kind_ of thinking.

And this is what is instilling this sense of dread and despair in people who have been paying attention. Believe me, many of us who have been singularity- (and/or collapse-) aware for at least a few years have gone through the stages of grief.

But living in an inflection point of human (and maybe universe) history is going to be filled with amazing and awe-inducing things, as well as what we currently think of as unfathomable horrors. I find a bigger worldview inspired by Buddhism to be helpful for me in learning how to accept, and then hopefully navigate, what's coming.

1

turnip_burrito t1_j9048ib wrote

Stop freaking out. Calm down and go hang out with your friends.

1

technofuture8 t1_j908vy2 wrote

How would you like to see the universe?

1

radioOCTAVE t1_j90ca07 wrote

Enjoy the time you have my friend.

1

teflchinajobs t1_j90cw6a wrote

One day at a time. That’s how you live. “Stop and smell the roses”. Take pleasure in the small things. There’s no point to dwell on that which you have no control over.

And who knows, the singularity might not be as bad as you think it will be.

1

Space-Doggity t1_j90d3vx wrote

What's the point of living? The singularity isn't an inherent disaster, it's a wildcard. You live here, now, at the precipice of either greatness or great failure, near a possible ascension from humanity's failed, half-baked attempts at civilization. If all goes well, you may have the chance to spend the next 1,000,000 years exploring the galaxy, tweaking your cyborg body and/or relaxing in a lifelike yet highly customizable simulation - if humans manage to protect their collective interests from the domineering assholes & charlatans who would misuse those AI. Plenty of specialists are aware of the importance of the control problem and are working on solutions -- I'm inclined to believe they'll solve it with less trouble than is often assumed.

If I had to give a reason for anyone to keep living, it'd be to live long enough to know for a fact that the future is as horrible as you predict before giving up on it.

1

Hunter62610 t1_j90gsqk wrote

You live in one of the most mindblowing and advanced periods of human evolution ever. In a year you can learn what one could only dream of learning in a lifetime 200 years ago. You have technology that would convince those of ancient times that you were a wise seer, you live in a perfectly tailored environment, and are able to eat more and better food than kings of old.

It is true that we face the consequences of our own actions, and are privy to both the horrors and joys of our modernity, but I try to remember that even if we are the most fucked generation to ever live, we are also the most powerful and best equipped. Untold suffering will happen in our lifetimes, it's undeniable, but if you give up now, you will have no impact on what comes to pass. This is the start of our story, of those who saved humanity from itself.

1

just-a-dreamer- t1_j90hz6l wrote

Welcome to adult life, it sucks. But it's awesome to join the club at the same time.

Your parents and grandparents pretended to have life figured out. They did not; nobody does. We all just make the best out of what we have. We control the controllables in our lives and react to the things thrown at us as we make our way.

There never was, and never will be, a point in time when life is certain. Famine, disease, wars, nuclear war, murder, car accidents….

The singularity just adds another threat to the list. Yet it also brings the opportunity of a post-scarcity society and longevity. The risk of annihilation is balanced by the promise of paradise on earth.

Good deal, nothing to be afraid of.

1

Stippes t1_j90my8h wrote

That's pretty much the same discussion that was had in philosophy in the early 20th century.

Since then, we've come from nihilism (your point of view) to existentialism (make your own point to stay alive) to absurdism (there isn't any point, but we can enjoy life despite that). (All this is very simplified)

Seek solace in the answers that were given before.

1

tedd321 t1_j92iekb wrote

Or everything just stays mostly normal. Don’t lose your mind over this until it happens at your place of work

1

Desperate_Food7354 t1_j90lwuq wrote

If you want life to be normal, why don't you go to the plains of Africa with nothing but sticks and stones?

0

SnooRadishes6544 t1_j9045jc wrote

Get Bitcoin. AI will take care of the rest. Wtf are you scared of?

−1