
Cult_of_Chad t1_j14lfs6 wrote

That's not the only definition of singularity. My personal definition of a technological singularity is a point in time when we're experiencing so many black swan events that the future becomes impossible to predict at shorter and shorter timescales. The event horizon.

We're definitely there as far as I'm concerned.

17

JVM_ t1_j14oxlx wrote

Same.

There seems to be an idea that the singularity needs to declare itself like Jesus returning, or arrive as a product release like Siri or a Google Home.

There's a lot of space between no AI -> powerful AI (but not the singularity) -> the singularity.

Like you said, as the singularity approaches it becomes harder and harder to see the whole AI picture.

10

AdditionalPizza t1_j14xn9p wrote

>My personal definition of a technological singularity

>The event horizon.

I mean, you can have your own personal definition if you want, but that makes no sense. Not trying to sound rude or anything. An event horizon is not the same thing as a singularity. That's not having your own definition, that's just calling one thing another thing for no reason, specifically because we have definitions for both of those things already.

I agree with the comparison of being at or beyond an "event horizon" in terms of AI. But the singularity is an infinitely brief moment in time: the instant we reach it, we have passed it. That moment, by actual definition, is when AI reaches a greater intelligence than all collective human intelligence. It probably won't even be a significantly impactful moment; it will just be a reference point. We look at it now as some grand moment, but it isn't. It is just the moment beyond which it's impossible for humanity to predict anything, because an intelligence greater than all of ours exists and we can't comprehend the outcome from where we stand now.

The individual capacity of a human to predict what comes tomorrow has no bearing on whether or not the singularity has passed. Even if every human being trying to predict tomorrow turns out to be wrong, that still is not the singularity. It's a hypothetical point in the future beyond which, based on today, we know we cannot make any prediction, because our brains are simply incapable of comprehending what lies past it.

It's interesting to consider that we may never reach the moment of a technological singularity at all. If we merge with technology and increase our own intelligence, we could forever be moving the singularity "goal posts," much as a distant observer sees someone falling toward a black hole as forever suspended, while the person falling experiences a normal passage of time from event horizon to singularity. We may be forever suspended racing toward the singularity, yet at the same time have reached and surpassed it.

7

Cult_of_Chad t1_j14yew4 wrote

>An event horizon is not the same thing as a singularity.

I never said it was. I said we've crossed the event horizon, which puts us 'inside' the singularity.

>I mean, you can have your own personal definition

I didn't come up with it, Kurzweil did as far as I know.

>That moment, by actual definition, is when AI reaches a greater intelligence than all collective human intelligence

There's no 'actual' definition. It's a hypothetical/speculative.

7

AdditionalPizza t1_j1515v3 wrote

>I said we've crossed the event horizon, which puts us 'inside' the singularity.

That is essentially the same thing I claimed you said. Crossing the event horizon feels like normal time; you would cross that barrier unknowingly. In a physical sense, time would appear to slow for an outside observer. I agree we are likely past that barrier/threshold, in that technological breakthroughs happen in shorter and shorter timeframes, and eventually (at the moment of singularity) there is a hypothetically infinite amount of technology being created, i.e. something impossible for us to comprehend right now. But being within the bounds of the event horizon does not mean being inside of a singularity.

>I didn't come up with it, Kurzweil did as far as I know.

He didn't invent the comparison to physics, but that's beside the point. His definition is exactly what I stated. And I was referencing your comment directly, where you said you have your own personal definition...

>There's no 'actual' definition. It's a hypothetical/speculative.

There quite literally is an exact definition, and it isn't speculation. I'm not sure where you're getting that from; the term is widely used, but this sub misuses it continually. The singularity itself is a hypothetical thing, but its definition is not speculative.

4

Cult_of_Chad t1_j151nsv wrote

>There quite literally is an exact definition

There have been multiple definitions used for as long as the subject has been discussed. AI is not even a necessary component.

6

AdditionalPizza t1_j154zp8 wrote

>AI is not even a necessary component.

For one, we are talking directly about AI. Even without AI, the term means a technology so transformative that we haven't yet anticipated its impact (something like femtotech?). That could also arguably be some kind of medical breakthrough that changes our entire perspective on life, say total immortality. But it doesn't matter; it's irrelevant to this discussion.

Second, the only definition is a direct comparison to the term used in physics, by which you aren't "inside" of a singularity the moment you cross the event horizon. I'm not trying to be overly direct or rude here, but you can't just borrow examples from physics to describe this and expect it to make sense when you've misused the terms.

From your original comment:

>My personal definition of a technological singularity is a point in time when we're experiencing so many black swan events that the future becomes impossible to predict at shorter and shorter timescales

Your thought process about increasing occurrences of black swan events is perfectly acceptable as a sign of having passed the event horizon. I like that reference; I've used it before. But crossing an event horizon does not equal being inside of a singularity. The technological singularity is a blip in time, not something you sit around in and chill for a while, like the "space between event horizon and singularity" we're currently in.

Anyway, that's about enough from me on the subject. I hope I didn't come off as rude or anything.

3

magnets-are-magic t1_j15dl5t wrote

I’m not the person you replied to but just wanted to say I appreciate the info you shared. I didn’t find it rude. Very interesting stuff!

4

AdditionalPizza t1_j15l4ir wrote

Thanks, I try not to be too wordy in comments, which can make me sound like much more of an asshole than I intend to come across as. It's just a definition that has been skewed, and while the distinction isn't a huge difference, it's important so we don't get people claiming we're "in the singularity" right now. You're either pre-singularity or post-singularity. There's no "in," and it probably won't be as significant an "event" as several things preceding it and many, many things following it.

2

oldmanhero OP t1_j15qd15 wrote

Just to be clear, this is not the definition I am using. The definition I am using is the point at which humanity can no longer "keep up" with the pace of technological change. That is a fuzzy concept, and as such not a point-like moment in time.

I'd hoped that much was obvious from the initial post, since I talked explicitly about the inability of institutions to keep pace.

2

AdditionalPizza t1_j15w7ty wrote

You could use other terms, such as Transformative AI. It describes the exact situation you're expressing. I don't want to sound like a nitpicking idiot or anything, but it's an important distinction that the singularity is in fact a moment and that we're either pre-singularity or post-singularity. You can make the argument that we're already post singularity, I'd probably disagree, but the opinion is your own.

I was just clarifying because this idea of the singularity pops up in this sub often, and honestly I'm not sure where it comes from, other than a feedback loop within this sub and similar online discussions that began with a misinterpretation of why the word singularity was borrowed for this specific use-case.

Of course you're free to ignore me altogether haha, to each their own.

2

Gaudrix t1_j154nkx wrote

Yeah, I think people conflate the technological singularity and the AI singularity. It has nothing to do with being unable to go backwards or any other constraint. Technology can always be destroyed and lost; the entire planet could be destroyed at any instant.

The technological singularity came first and describes the confluence of different technologies reaching a stage where they begin to have compounding effects on progress, producing an explosion in both the rate and trajectory of advancement.

The AI singularity specifically refers to the point at which AI becomes sentient and transitions into AGI. At that point we have no clue what the repercussions of creating true artificial consciousness will be, especially if it has the ability to self-improve on shorter and shorter timetables.

We are living through the technological singularity, and when people look back 100 years from now they'll probably put the onset somewhere in the late '90s or early 2000s. Things are getting faster and faster, with breakthroughs across many different sectors due to cross-pollination of technological progress.

4