
DietDrDoomsdayPreppr t1_iwwg1ix wrote

I can't help but feel like we're exceptionally close to a model that can emulate intelligence, but that last piece is impossible to create due to the boundaries imposed on computer programming.

Part of what drives human intelligence is survival (which includes procreation), and on that front computers still depend entirely on human intervention. AI isn't going to be born from a random bit flip or self-written code that leads to self-awareness; it's simply not possible given the time needed for that level of "luck" and the limitations of computer processing, which cannot grow or improve its own hardware.

3

ledow t1_iwwu8u4 wrote

To paraphrase Arthur C. Clarke:

Any sufficiently advanced <statistics> is indistinguishable from <intelligence>.

Right until you begin to understand and analyse it. And that's the same with <technology> and <magic> in that sentence instead.

I'm not entirely certain that humans and even most animals are limited to what's possible to express in a Turing-complete machine. However, I am sure that all computers are limited to Turing-complete actions. There isn't a single exception to the latter that I'm aware of - even quantum computers are Turing-complete, as far as we can tell. They're just *very* fast, to the point of being effectively instantaneous even on the largest problems (QC just replaces time as the limiting boundary with space - the size of the QC that you can build determines how "difficult" a problem it can solve, but if it can solve it, it can solve it almost instantly).

And if you look at AI since its inception, the progress is mostly tied to technological brute force. I'm not sure that you can ever just keep making things faster to emulate "the real thing". It's the same way we can simulate on a traditional computer what a quantum computer can do, but we can't make it work AS a quantum computer, because the simulation is still bound by time, unlike a real QC. In fact, I don't think we're any closer to that emulation than we ever have been... we're just able to perform sufficiently complex statistical calculations. I think we'll hit a limit on that, like most other limitations of Turing-complete languages and machines.
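A rough sketch of what that simulation looks like in practice - it's just linear algebra over a statevector (this toy example uses numpy and a single qubit purely for illustration). The catch is that the vector needs 2^n amplitudes for n qubits, which is exactly where the classical machine pays in time and space:

```python
# Minimal sketch: simulating one qubit on a classical machine.
# A Turing-complete machine can reproduce the *results* of a quantum
# computation this way, but the statevector grows as 2**n in the number
# of qubits, so it never works AS a quantum computer.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)      # Hadamard gate

state = np.array([1, 0], dtype=complex)   # qubit starts in |0>
state = H @ state                          # apply the gate
probs = np.abs(state) ** 2                 # Born rule: measurement probabilities

print(probs)  # [0.5 0.5] - equal chance of measuring 0 or 1
```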

All AI plateaus - it's a probabilistic system where you can get something right 90% of the time, but you can't predict the outliers and can't change the trend; it takes millions of data points to identify the trend and billions more to account for and correct it. I don't believe that's how intelligence works. Intelligence doesn't appear to be an incredibly fast brute-force statistical machine at all, but such a system can - as you say - appear to emulate it to a degree.

I think we're missing something still, something that's inherent in even the physics of the world we inhabit, maybe. Something that's outside the bounds of Turing-complete machines.

Because a Turing-complete machine couldn't, for example, come up with the concept of a Turing-complete machine, or give counter-examples of problems that cannot ever be solved by a Turing-complete machine. But a human intelligence did. Many of them, in fact.
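For concreteness, here's a rough sketch of the standard counter-example (the halting problem). The `halts` oracle below is hypothetical - the whole point of the diagonal argument is that no Turing-complete machine can actually implement it:

```python
# Sketch of the halting-problem argument. halts() is a hypothetical oracle;
# the contradiction below shows it cannot exist on any Turing-complete machine.

def halts(program, arg):
    """Hypothetical oracle: returns True iff program(arg) eventually halts."""
    raise NotImplementedError("No Turing-complete machine can implement this.")

def paradox(program):
    # Do the opposite of whatever the oracle predicts about a program
    # run on its own source.
    if halts(program, program):
        while True:       # predicted to halt -> loop forever
            pass
    else:
        return            # predicted to loop -> halt immediately

# Feeding paradox to itself: halts(paradox, paradox) can't be correct
# either way, so no such oracle can exist.
```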

8

warplants t1_iwxvgjb wrote

> Because a Turing-complete machine couldn't, for example, come up with the concept of a Turing-complete machine

Citation needed

3

DietDrDoomsdayPreppr t1_iwwxawn wrote

Dude. I wish we could both get stoned and talk about this all night, and I haven't smoked in a decade.

2

gensher t1_iwzplh6 wrote

Damn, I feel like I just read a paragraph straight out of Penrose or Hofstadter. Recursion breaks my brain, but it feels like it's the key to everything.

1