Imaginary_Ad307 t1_iynk0ou wrote

Also, the differential equation modeling the interaction between neurons was solved in closed form last November, clearing the path for very complex neural networks without the bottleneck of numeric integration. So I'm with you: AGI is going to be a reality very soon.
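
For the curious, this presumably refers to the closed-form continuous-time ("CfC") networks result (Hasani et al., November 2022), which replaces the ODE solver inside liquid neural networks with an explicit formula. A minimal sketch of the general idea, not the paper's actual model: for a toy leaky neuron with constant drive, the ODE can either be marched forward numerically or evaluated exactly in one shot. All constants below are made up for illustration.

```python
import numpy as np

# Toy leaky neuron driven by a constant input:
#   dx/dt = -a*x + b        (a = decay rate, b = input drive)
# Numeric route: many small Euler steps. Closed-form route: the exact
# solution x(t) = x0*exp(-a*t) + (b/a)*(1 - exp(-a*t)), one evaluation.
a, b, x0, t_end = 1.5, 0.8, 0.2, 2.0

# numeric integration: cost grows with the step count, and stability
# depends on the step size
x, dt = x0, 1e-4
for _ in range(int(t_end / dt)):
    x += dt * (-a * x + b)

# closed form: a single evaluation, no solver loop at all
x_exact = x0 * np.exp(-a * t_end) + (b / a) * (1 - np.exp(-a * t_end))

print(x, x_exact)  # agree to several decimal places
```

The paper's contribution is an approximate closed form for the much harder case where the coefficients are themselves neural networks, but the payoff is the same: the integration loop disappears.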

47

dasnihil t1_iynmdk8 wrote

We also have people like Joscha Bach and Yoshua Bengio working on alternative approaches like generative flow networks, which learn by sampling whatever data is available instead of needing the huge training datasets deep learning does, almost like how humans learn.
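
A rough sketch of the property GFlowNets are built around: a step-by-step sampler whose probability of producing an object is proportional to that object's reward. The 2-bit "objects" and rewards below are made-up toy values, and the flows are computed exactly here rather than learned by a network as in the real method.

```python
import numpy as np

# Objects are 2-bit strings built one bit at a time; rewards are arbitrary.
R = {"00": 1.0, "01": 2.0, "10": 3.0, "11": 4.0}

# Flow through a partial string = total reward reachable from it. With exact
# flows, the forward policy picks each child with probability
# flow(child) / flow(parent).
def flow(prefix):
    return sum(r for x, r in R.items() if x.startswith(prefix))

def sample():
    prefix = ""
    while len(prefix) < 2:
        children = [prefix + "0", prefix + "1"]
        p = np.array([flow(c) for c in children])
        prefix = children[np.random.choice(2, p=p / p.sum())]
    return prefix

draws = [sample() for _ in range(100_000)]
for x in sorted(R):
    print(x, round(draws.count(x) / len(draws), 3),
          "target:", R[x] / sum(R.values()))
# empirical frequencies land near R(x)/sum(R), i.e. reward-proportional sampling
```

A real GFlowNet trains a neural network to approximate those flows, which is what lets it scale to objects far too large to enumerate.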

31

Roubbes t1_iyobzml wrote

So Joscha does real stuff aside from being an absolute god in Lex's Podcast?

9

dasnihil t1_iyp41sx wrote

I'm glad people like him are gatekeeping intelligence.

2

EntireContext OP t1_iynk9h4 wrote

I saw that headline but didn't go deep into it. Is it real progress, not hype? How big are the efficiency gains? How long before they can implement it?

And aren't neural nets super complex already with all those billions of parameters?

3

Imaginary_Ad307 t1_iynkwt3 wrote

To my very limited understanding, you need huge servers to run complex neural networks of this kind because the neuron interactions have to be solved by numeric integration. With a symbolic solution that restriction disappears, opening the path to running these networks on less powerful servers, maybe even personal computers and phones.
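
Back-of-envelope arithmetic for why that follows: an ODE-based recurrent cell has to call its internal network once per solver substep, while a closed-form cell calls it once per input step. The step counts below are illustrative assumptions, not measurements from the paper.

```python
# Cost per processed sequence, counted in network evaluations.
seq_len = 10_000      # input timesteps
solver_steps = 20     # assumed substeps of a fixed-step ODE solver per input

ode_evals = seq_len * solver_steps   # 200,000 evaluations with a solver
cfc_evals = seq_len                  #  10,000 evaluations in closed form

print(ode_evals // cfc_evals)        # 20x fewer evaluations; adaptive solvers
                                     # can be far worse on stiff dynamics
```

That constant factor, plus not having to worry about solver stability, is the headroom that could move these models onto weaker hardware.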

10

manOnPavementWaving t1_iyo3sqa wrote

This doesn't hold for the networks currently in use; it only applies if we want to simulate human brains more closely. And there is no real indication yet that these networks train better or perform better.

4

AvgAIbot t1_iyns2xs wrote

What about using quantum computers? Or is that not applicable?

2