
BigZaddyZ3 t1_j9016fv wrote

The entire point of the singularity is that all of our current knowledge and logic will have long been rendered irrelevant at that point. Technological progression would have long surpassed human comprehension. That’s the entire point. Humans today can’t comprehend what comes after the singularity. Do you see the problem with “extrapolating” our current understanding in this scenario?

Also, do you really think it’s wise to base your understanding of such a complex topic on a clearly fictional novel, most likely made for entertainment purposes?

−3

Wroisu t1_j901zs1 wrote

The point of the novel(s) is to explore those complex topics. I’m not saying that’s what it’ll be like, but it gives a perspective on what it could be like.

Similar to Star Trek and its commentary on capitalism, or The Three-Body Problem and its explanation for the Fermi paradox, ad infinitum.

As for technology beyond our comprehension: that technology, as high and mighty as it may be, will still be based on physical principles we know of.

And even the technology born out of principles we’ve yet to discover will come from the unification of things we already know, like general relativity and quantum mechanics.

You could create extremely hard materials by manipulating the strong nuclear force over large distances. That would be extremely exotic by our standards, but not outside the realm of possibility. Stuff like that is what the singularity would allow. Is it impossible to comprehend? Not really.

3

BigZaddyZ3 t1_j902rqs wrote

There’s still a lot that we don’t know about the universe though… and you’re assuming that there’s no way to change or alter the physical principles at work on Earth either. Say a superintelligent system were able to develop a weapon that could alter Earth’s gravitational pull. Suddenly the current laws of physics go out the window. You’re thinking too small. Like I said, there’s still a lot that we don’t understand about the universe. Thinking the singularity will be “business as usual” is what happens when you try to base your understanding of it off fictional novels…

−1

Wroisu t1_j9037wv wrote

For the Earth, the gravitational binding energy is about 2×10^32 joules, or roughly a week of the Sun's total energy output, Mr. Big Thinker.
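That ballpark can be sanity-checked with the uniform-density-sphere formula U = 3GM²/(5R) (a rough sketch; the accepted figure is about 10% higher because Earth's core is denser than its mantle):

```python
# Back-of-envelope check: Earth's gravitational binding energy
# (uniform-density sphere approximation) vs. the Sun's total output.
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24      # Earth mass, kg
R = 6.371e6       # Earth mean radius, m
L_sun = 3.828e26  # solar luminosity, W

U = 3 * G * M**2 / (5 * R)   # binding energy of a uniform sphere, joules
days = U / L_sun / 86400     # time for the Sun to radiate that much energy

print(f"U ~ {U:.2e} J, ~ {days:.1f} days of solar output")
```

The uniform-sphere estimate comes out around 2.2×10^32 J, i.e. roughly a week of the Sun's luminosity, which is why the figure above is quoted as it is.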

There’s no way an AI would randomly be able to control that amount of energy without us knowing of the mechanisms used to control it, let alone seeing the structures built to move that energy around in a useful way.

Not understanding how physics works and thinking that AI will suddenly rewrite it one day is what you get when you browse an echo chamber for your information on such things.

2

BigZaddyZ3 t1_j903o4e wrote

>There’s no way an AI would randomly be able to control that amount of energy without us knowing of the mechanisms used to control such energy, let alone seeing the structures built to move that energy around in a useful way.

Why not? Are you dumb enough to assume AGI will never surpass human cognitive abilities? Please tell me you’re not that stupid…

1

Wroisu t1_j9042i4 wrote

Cognitive ability doesn’t translate to immediate R&D. You could think up a trillion ways to do something, each better than the last, but you’d still have to build the equipment that does the thing you want to research, for every iteration of your idea.

That doesn’t mean it won’t be quick, but these things aren’t magic, as you seem to be suggesting immense intellect would be.

Eventually you get to the point where Arthur C. Clarke’s “any sufficiently advanced technology is indistinguishable from magic” holds true, but that doesn’t happen overnight.

1

BigZaddyZ3 t1_j904qxu wrote

It does happen overnight in a technological singularity, though. That’s why it’s also sometimes referred to as the “intelligence explosion”.

1