
ArgentStonecutter t1_isfurn3 wrote

This has nothing to do with the singularity. After the singularity, billionaires will be in the same position as all the other normal humans in the face of the entities that control everything.

1

AdditionalPizza OP t1_isfvw73 wrote

If you have the exact blueprints of how the singularity will go down, then sure. But we have no idea whether AGI is absolutely necessary for the singularity to occur. We don't even know if self-awareness will be possible in AI, so it's possible a single entity could control an ASI and use it for whatever they want. We have no idea.

1

ArgentStonecutter t1_isfwwxn wrote

If a human-level intelligence remains in control of society, it's not the singularity. It's possibly better than the singularity for most of us, but "singularity" is not a synonym for the general topic of posthumanity and post-scarcity society. It's a specific phase change in society in which our kind of humanity, if it still exists, understands it no more than a dog does.

> we are entering a regime as radically different from our human past as we humans are from the lower animals
>
> https://edoras.sdsu.edu/~vinge/misc/singularity.html

1

AdditionalPizza OP t1_isg1q1y wrote

Yes, I've seen that very old writing of it. There are several more modern viewpoints on what a technological singularity could be; they include many different ways of achieving it, but they all conclude the same basic thing: unfathomable change and uncontrollable runaway technological innovation.

Regardless, we can agree to disagree on that point. I was trying to avoid talking about AGI and self aware AI anyway.

1

ArgentStonecutter t1_isg2nus wrote

That's basically the foundational document of singularity theory.

What you're talking about here is not "Unfathomable change and uncontrollable runaway technological innovation". It's a very fathomable change that involves a society only minimally different from our own.

1

AdditionalPizza OP t1_isg4pbb wrote

>That's basically the foundational document of singularity theory.

Yeah, I know. If you consider it the definition of a word, I can understand not wanting to change it. If you consider it a theory, then, well, theories evolve all the time.

But I know, and I was absolutely not talking about the singularity. I mentioned it because the original comment was referring to it, and I said what they were referring to sounded less like post-singularity and more like transformative AI. I was mostly avoiding talking about the singularity in this post; it's more about pre-singularity.

1

ArgentStonecutter t1_isg6097 wrote

You don't suddenly get the theory of evolution evolving into the theory of relativity.

1

AdditionalPizza OP t1_isg9xwj wrote

You're very set on a sci-fi writer's 1993 version being the absolute one. He didn't even come up with the term; he just wrote a popular theory about it.

If you want to be so concrete about one man's theory, you should at least go with the original, not just the first one to become popular. The definition was originally a rate of return on technology that surpasses human comprehension. That's it, and I'm sticking with it.

1

ArgentStonecutter t1_isgavro wrote

> surpasses human comprehension

This is the bit you don't seem to be getting.

If our current human social structures are still in place, it's not the singularity.

1

AdditionalPizza OP t1_isgcvl7 wrote

The singularity is literally a point in time, though. It's not an ongoing event. We have our social structures > singularity > we no longer have our social structures.

I don't think you understand what I'm saying. To be honest, I don't understand what your argument is either. I don't even know what we're debating at this point.

1

ArgentStonecutter t1_isgdnxd wrote

That's right. It's a locus in time we can't see beyond.

That's the point. It's not just more of the ongoing exponential growth in technology that we've been dealing with since at least the industrial revolution.

I don't think you're actually disagreeing with me any more.

1

AdditionalPizza OP t1_isgfk33 wrote

>I don't think you're actually disagreeing with me any more.

Honestly, I don't think we ever were, aside from whether or not it requires a self-aware AI, the ways to achieve a singularity situation, and the definition of it.

We both agreed it's a moment in time we can't predict beyond. I never stated anything less; the original commenter stated something different, and I disagreed with them.

1

ArgentStonecutter t1_isgfmb1 wrote

> Aside from whether or not it requires a self aware AI and the ways to achieve a singularity situation, and the definition of it.

I never even suggested that. I said that it requires a mind more powerful than a current human's, but those could be enhanced and upgraded humans. There is no reason to assume their roles would remain similar to those in Economy 1.0... odds are strongly against it. And it's not going to be the super-rich in general getting the risky implants.

If those minds are just tools under the control of the likes of Musk, though, that's not the singularity.

1

AdditionalPizza OP t1_isgin0q wrote

Oh, then I'm afraid our signals got mixed up somewhere along the line.

I do wonder whether the singularity will affect those who refuse to take part in the technology leading up to it. As in, if some people choose to live off the grid, will they be left alone? I don't know; a topic for another time, I suppose.

1