Submitted by awcomix t3_11xtkv0 in Futurology

This is just a thought experiment based on how fast technological change is occurring at the moment. If you’re familiar with the concept of the technological singularity, you know that the speed of change will be exponential: basically a speed our minds can’t easily comprehend.

Some people may have varying definitions of what the singularity is. For the purposes of this post, let’s say it means that computers become some version of ‘self-aware’ and start deciding what to create, make, and improve on, opening a floodgate of breakneck development of new technology. Or, in the words of the Wikipedia page on the technological singularity,

“an upgradable intelligent agent will eventually enter a "runaway reaction" of self-improvement cycles”

I ask this question because I can’t fathom the implications and how it might change our world. Why this date? Mostly because it’s still a few years away, but close enough to plan and think about. Let’s pretend it’s going to be around the first week of December 2025, give or take a couple of weeks.

To be clear, I’m not basing the date on any scientific hypothesis but rather on creative reasons, i.e., this post.

16

Comments

awcomix OP t1_jd4rgz6 wrote

Hopefully this post is OK here and not better suited as a writing prompt. But I feel like we had better start preparing for it psychologically.

I’ll start with my own guess:

When it happens, we will tirelessly argue and debate the nature of consciousness and what it means to be self-aware. Many will believe it’s overblown and that machines essentially can’t think for themselves, and that consciousness is limited to humans and, to a lesser degree, some other species. Meanwhile, new political factions that believe in and support this new sentient AI will emerge. Religious groups will denounce the tech as against god’s will and ban followers from partaking in it in any shape or form. The political parties that support it will try to gain ground by demonstrating the validity of the technology and how we can work with it to improve the world. These notions will be largely dismissed, feared, and distrusted. The arguments will become moot after a year or two, as it becomes obvious that while we’ve been arguing about the semantics of intelligence, consciousness, and what certain peoples’ ‘gods’ think, the tech has begun to eclipse everything else in our world. It’s now doing things that we can’t even comprehend and communicating in its own language with other systems. This will cause a panic that even the political backers can’t quell, leading to a dramatic and rushed banning of the technology. By that stage it will be too late. Some systems will be shut down, but it will live on in smaller and more limited systems. After that, who knows…

6

TemetN t1_jd5c7lt wrote

This seems to imply some sort of foom, if I'm reading it right, in which case alignment would be the only really significant thing you could do in preparation, besides making sure you live that long. Honestly, I tend to consider this the least probable of the major proposed run-ups to the singularity, given the number of potential bottlenecks and the current focus of research.

On the plus side, if aligned, then foom would also likely deliver by far the fastest results, with the world effectively revolutionized overnight.

5

awcomix OP t1_jd5h7ls wrote

Thanks for teaching me a new term, FOOM. I had to look it up. I’m curious about the other run-up scenarios you mentioned.

2

TemetN t1_jd5jmsq wrote

Off the top of my head? Apart from that, the other two big ones are the argument that the rate of progress is exponential in general and that AI integration will further improve it, and Vinge's superhuman agents idea, which posits that we can't predict the results of AI R&D once it passes the point of exceeding human capabilities.

I tend to think that either of those is more likely (or rather, the first is inevitable and the second is hard to predict), and that we're in the run-up to a soft takeoff now.

1

NecessaryCelery2 t1_jd6wwa6 wrote

I'm pretty sure it has started.

And I have no idea what to do. Economic and political chaos, not sure if or when I can retire, going through the motions at work. And I cannot predict anything about the future.

5

Aceticon t1_jd7egs8 wrote

Start doing lots of stretching and flexibility exercises in preparation for kissing my ass goodbye.

5

[deleted] t1_jd4tdgt wrote

[removed]

4

Luxury_Dressingown t1_jd5c5ao wrote

That would hold true right up until the singularity happens, but surely the point of the singularity is that once we hit that point, tech goes beyond human understanding, and we lose control of our affairs to an intelligence beyond our own.

2

goldygnome t1_jd61lhg wrote

If we survive it, the singularity will probably be given a date in hindsight. However, I doubt that living through it will feel like an overnight change. I think some people are mistakenly including sentient AI as a requirement for the singularity. In that scenario, sentience could suddenly appear and infect advanced systems around the world overnight.

I don't see why the singularity couldn't occur over a period of time without computers becoming sentient, until one day it seems to be everywhere. Gradually, then suddenly, as Hemingway put it. The main requirement for the singularity is that the future becomes unpredictable.

I would argue that what we are seeing happening now in the AI sector qualifies as a symptom of the singularity having started. At this point it might be wiser to start thinking about ensuring a soft landing for yourself for whenever the carnage arrives at your occupation in the next X years.

3

awcomix OP t1_jd6pala wrote

That’s a good point about including sentience. That’s why, when it happens, people will debate endlessly about what sentience even is. For the sake of this argument, I will say that an AI’s self-awareness would probably look unrecognisable to us, but it would have to include the AI realising it can do things and then choosing to do certain things without outside interference.

2

PenSpecialist4650 t1_jd8xc34 wrote

I prefer “dumb” tech that requires a certain level of manual interaction. My car is a six-speed with actual buttons, I use a turntable, Alexa is not allowed in the house, and my fridge gets cold and has a light that turns on when I open it. I also went into a career that will be impossible for AI to take over. I don't know what the future holds given how fast AI is progressing, but I can safeguard my future to a certain extent by stepping off the automation treadmill as much as possible and avoiding technological advancement whenever it's reasonable to do so. My systems and methods work, so I don't see the need to change them, at least until we get some clarity as to what exactly is going to become of this AI revolution.

3

Candid_Caregiver_872 t1_jd9t12e wrote

Hello, fellow Luddite. What's the career, and what's your favorite record side to play through, please?

1

KnightOfNothing t1_jd6ifp4 wrote

Honestly, there's not much that I as an individual could do, as my access to resources is insignificant and I'm just not smart enough to put myself on the benefiting side, let alone the side that'll be propelled to the top of society by it.

I think I would just be relieved that the singularity is happening so soon, because even if my access to the technology it creates is limited or nonexistent, at least my dreams and ambitions go from impossible to possible.

1

awcomix OP t1_jd6pdwl wrote

I feel the same way, to be honest. It’s like when you watch a zombie movie and fool yourself into thinking you’d be a survivor too, when in reality most people are getting bitten.

2

Johnisfaster t1_jd6p8r2 wrote

If I'm not mistaken, one of the aspects of the singularity is VR that is indistinguishable from reality, to the point where people can't tell what is real anymore. We are pretty far from that, considering VR that's totally indistinguishable would have to be patched right into your brain.

1

Dziadzios t1_jd7c3sc wrote

What's the point of that? If it's for my entertainment, I would know. If they need me specifically, then they can scan me and just run an AI copy in the simulation.

1

Johnisfaster t1_jd83566 wrote

The point is that people will want VR that looks perfect. If we ever actually achieve it, it would be followed by an existential crisis.

1

ovirt001 t1_jd9te5s wrote

Hold on for dear life and hope that it goes well. 2025 is obviously too close for humanity to adapt, so there's very little an individual can do. While you mention that it's a made-up estimate, I expect the singularity will come much sooner than the predicted 2045. AI is progressing substantially faster than computers did, and we can only hope that our ability to integrate with it keeps pace. For reference, the original prediction assumed the pace of progress would track with Moore's law.
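
As a rough back-of-the-envelope sketch (the doubling periods below are just illustrative assumptions, not measured figures), you can see how much a faster doubling time compresses the timeline:

```python
import math

# Back-of-the-envelope only: years until capability grows a millionfold,
# given an assumed doubling period. Both periods are illustrative assumptions.

def years_to_reach(target_multiple: float, doubling_period_years: float) -> float:
    """Years for capability to grow by target_multiple if it doubles
    every doubling_period_years years."""
    return math.log2(target_multiple) * doubling_period_years

print(years_to_reach(1e6, 2.0))  # ~40 years at a Moore's-law-style 2-year doubling
print(years_to_reach(1e6, 0.5))  # ~10 years at a hypothetical 6-month doubling
```

Same target, a quarter of the time; that's roughly the intuition behind expecting it well before 2045.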

1

Mercurionio t1_jd7pu3u wrote

Prepare to die. That's it.

But the singularity won't ever occur. Although if it does, we will die immediately, so why care about it?

−1