
ttkciar t1_j706e2s wrote

That's not a bad line of reasoning, but I posit that your leap from "deep learning systems will never be AGI" to "AGI is never going to happen" might be unfounded.

12

ReExperienceUrSenses OP t1_j70gkz7 wrote

My skepticism mainly comes from there not seeming to be a way to programmatically solve the grounding problem. I don't see von Neumann instruction-set architectures as being sophisticated or powerful enough compared to the only working example we have.

3

dwkdnvr t1_j73d4n9 wrote

I agree that if AGI is achieved, it won't be through Von Neumann approaches.

But it's a pretty big leap from that to 'that means it's impossible to have a computational AGI'.

We don't know what future developments in alternate computing paradigms will yield. It's not inconceivable that alternate forms of state management or interconnection, or even hybrid analog/digital constructs, might alter the tools available. We 'know' our brains don't really work like computers with a separation of 'computation' from 'storage', but given how successful the current paradigm continues to be, we haven't really pushed into investigating alternate possibilities.

My personal bet/assumption is that hybrid / cyborg approaches seem most likely. Genetic engineering of neural substrates combined with interfaces to more conventional computing capability seems feasible, although obviously there are many barriers to be overcome.

IMHO one of the most interesting avenues of speculation is whether AGI is even conceptually possible in a way that allows for direct replication, or whether a specific organism/instance will have to be trained individually. 'Nature' really hasn't ever evolved a way to pass down 'knowledge' or 'memories' - it passes down a genetic substrate, and the individual has to 'train' its neural fabric through experience.

2

pretendperson t1_j75ex8h wrote

The answer is to better emulate the outputs of all the core human systems in response to their inputs. We need an endocrine-system analog as much as we need a neural analog.

When we encounter something that invokes fear, our brain tells our body to release adrenaline; the adrenaline makes our brain and body run faster; the faster, more frightened brain tells the body to make more adrenaline; and so on, until removal of the threat attenuates the cycle.

We can't do any of this by focusing on neuronal analogs alone.
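The dynamic described above is a positive feedback loop with external attenuation, which can be sketched in a few lines. Everything here (the `gain` and `decay` constants, the baseline level, the step model) is an illustrative assumption, not physiology:

```python
# Toy model of the fear/adrenaline feedback loop: adrenaline amplifies
# itself while the threat is present, then decays once the threat is gone.
# All constants are made-up for illustration, not physiological values.

def simulate(threat_steps, total_steps, gain=0.5, decay=0.3):
    """Return adrenaline levels over time: positive feedback while the
    threat is present, exponential attenuation after it is removed."""
    adrenaline = 0.1  # small baseline response to the initial scare
    levels = []
    for t in range(total_steps):
        if t < threat_steps:
            # the brain 'going faster' drives further adrenaline release
            adrenaline += gain * adrenaline
        else:
            # threat removed: the cycle attenuates
            adrenaline -= decay * adrenaline
        levels.append(adrenaline)
    return levels

levels = simulate(threat_steps=5, total_steps=15)
```

The point of the sketch is the shape of the curve, not the numbers: the level climbs while the loop reinforces itself and relaxes back toward baseline only when the trigger is removed.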

1