Viewing a single comment thread. View all comments

GayHitIer t1_j5qkj72 wrote

People are beginning to realize the singularity is about to happen.

It will most likely occur around 2050-2060; around then, this subreddit will skyrocket just like AI intelligence will.

That is, if Moore's law and AI intelligence keep following the curve, but it is most likely going to go even faster than we expect right now.
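For what "following the curve" implies, here's a minimal back-of-the-envelope sketch, assuming the classic Moore's-law doubling period of roughly two years and an arbitrary starting point (both are illustrative assumptions, not data):

```python
# Rough Moore's-law-style projection: capability doubles every ~2 years.
# The starting value, years, and doubling period are illustrative assumptions.

def project(start_value: float, start_year: int, end_year: int,
            doubling_years: float = 2.0) -> float:
    """Return the projected value at end_year under steady doubling."""
    elapsed = end_year - start_year
    return start_value * 2 ** (elapsed / doubling_years)

# 2023 to 2053 at one doubling per 2 years is 15 doublings: 2**15 = 32768x.
print(round(project(1.0, 2023, 2053)))  # 32768
```

Even a modest speed-up in the doubling period compounds dramatically over three decades, which is why "faster than expected" matters so much here.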

4

just_thisGuy t1_j5qp3g1 wrote

Much sooner than 2050-2060. I’m thinking around 2045, but it’s not a one-year event or anything; my guess is that by 2035 the changes will already be almost magical. I think the singularity will be a span starting in 2035 or so and lasting decades: singularity upon singularity.

12

GayHitIer t1_j5qpen6 wrote

Sure, the technology will be there around then, but for it to change the world as we know it will take some time, because of people and politics... don't even get me started on ethics; people will delay this as much as they can, sadly.

It doesn't really matter, because LEV (longevity escape velocity) will probably also arrive around that time to buy us more time.

3

just_thisGuy t1_j5qqigp wrote

I frankly don’t think politics, or anything on Earth, will matter. I think AGI, augmented humans, ASI, and whatever else will just move off-world and do as they please. I don’t mean this in a bad way, but in a good way. The future is off-world.

1

ElwinLewis t1_j5qt65g wrote

What exactly do you mean by move off world?

0

dingle__dogs t1_j5qzvy6 wrote

space. not earth. extraterrestrial. low earth orbit.

2

SoylentRox t1_j5sl9tk wrote

Probably just the Moon. Someone could set up biotech clinics there that offer therapies directly from an AGI (one that doesn't care about drug patent law), or strip-mine the place without environmental-impact paperwork or licensing.

1

Evil_Patriarch t1_j5rjcmi wrote

For the good of the world, I hope that reddit is dead long before 2050

3

CheriGrove t1_j5r5sok wrote

Is there a solid metric for "when" the singularity "happens"? I don't entirely understand the concept; I came into this sub thinking it was about black holes.

1

SoylentRox t1_j5slrzy wrote

The singularity is a prediction of exponential growth once AI is approximately as smart as a human being.

So you might hear in the news that TSMC has cancelled all chip orders except those from AI customers, and that there are zero new devices anywhere recently made with advanced silicon in them.

You might see in the news that the city of Shenzhen has twice as much land covered with factories as it did last month.

Then another month and it's doubled again.

And so on. Or, if the USA has the tech to itself, similar exponential growth there.

At some point you would probably see whoever has the tech suddenly launching tens of thousands of rockets, and at night you would see lights on the Moon whose covered area doubles every few weeks.

This is the metric: anything that is clear and sustained exponential growth driven by AI systems at least as smart as humans.

Smart meaning they score as well as humans on a large set of objective tests.

There are a lot of details we don't know (would the factories on the Moon even be visible at night, or do the robots see in IR?), but that's the signature of it.
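The metric above boils down to: does a measured quantity keep a roughly constant doubling time? A minimal sketch of that check, where the tolerance threshold and the sample series are illustrative assumptions (the series must be strictly increasing):

```python
import math

# "Sustained exponential growth" check: a series is exponential if the
# doubling time implied by each step stays roughly constant.

def doubling_times(series):
    """Per-step doubling times implied by consecutive measurements."""
    times = []
    for prev, cur in zip(series, series[1:]):
        growth = math.log2(cur / prev)  # doublings achieved in this step
        times.append(1.0 / growth)
    return times

def looks_exponential(series, tolerance=0.25):
    """True if every doubling time is within tolerance of the mean."""
    times = doubling_times(series)
    mean = sum(times) / len(times)
    return all(abs(t - mean) / mean < tolerance for t in times)

# Factory area doubling every month, as in the Shenzhen example above:
print(looks_exponential([1, 2, 4, 8, 16]))  # True
# Linear growth has a stretching doubling time, so it fails the check:
print(looks_exponential([1, 2, 3, 4, 5]))   # False
```

The point of the tolerance band is the "sustained" part: one good month is noise, but a doubling time that holds steady across many measurements is the signature being described.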

0

CheriGrove t1_j5sm5jn wrote

"As smart as" is difficult to measure and judge. I think by 1980s standards, we might already be at something like a singularity as they might have judged it.

1

SoylentRox t1_j5smhx9 wrote

Yes, but intelligence isn't just depth, it's breadth.

In this case, to make possible exponential growth, AI has to be able to do most of the steps required to build more AI (and useful things for humans to get money).

Right now that means AI needs to be capable of controlling many robots, doing many separate tasks that need to be done (to ultimately manufacture more chips and power generators and so on).

So while ChatGPT seems really close to passing a Turing test, the robotics papers look like this: https://www.deepmind.com/blog/building-interactive-agents-in-video-game-worlds

And AI is not yet able to actually control this: https://www.youtube.com/watch?v=XPVC4IyRTG8 (that Boston Dynamics machine is kind of hard-coded; it is not being driven by current-gen AI).

I think we're close. And as for the last steps: people can use ChatGPT/Codex to help them write the code, there's a lot more money to invest in this, and they can use AI to design the chips for even better compute. Lots of ways to make the last steps take less time than expected.

0

CheriGrove t1_j5sn4z8 wrote

It's fascinating, existential, hopeful, and worrisome to the nth degree. Here's hoping it's a post-scarcity utopia rather than something Orwell could never have fathomed.

1

SoylentRox t1_j5sl0ob wrote

You understand the idea of singularity criticality, right? Currently demonstrated models (especially RL-based ones) are ALREADY better than humans at key AI-related tasks like:

  1. Network architecture search
  2. Chip design
  3. Activation function design

I can link papers (usually Deepmind) for each claim.

This means AI is being accelerated by AI.

2050-60 is a remote and unlikely possibility. It's like expecting, in early 2020, that COVID would just stop with China and take 10 years to reach the US.

1