Viewing a single comment thread. View all comments

grossexistence t1_j7ofvq1 wrote

At this rate, Proto-AGI will be here by the end of the year or the first half of 2024.

24

squareOfTwo t1_j7ooiz6 wrote

just no, the rate is still to damn slow for that. Most of the "progress" is just training with yet unused data (human written text for GPT, Text-Image pairs for the stable diffusions of this world etc. This will end soon if no high quality data is left to train). The end of "scale" is near.

11

zendonium t1_j7ovw92 wrote

But surely that's all it takes? The human brain is just a multimodal network that processes language, visual, audio, and a bunch of other stuff.

Pay 10,000 Kenyans $2 a day to get more training data on more senses and train more networks. We'll have narrow AGIs in almost all areas. Just needs putting together with some clever insight from some genius.

6

Cryptizard t1_j7p26uc wrote

If that was true then we could just train a model on all the AI research we have and get a “narrow AGI” that makes AI models. Singularity next week. Unfortunately, that is not how it is.

4

visarga t1_j7q4313 wrote

If they make GPT-N much larger, it will take longer and cost more to train. Then we can only afford a few trials. Whether they are selected by humans or AI makes little difference. It's going to be a crapshoot anyway, nobody knows what experiment is gonna win. The slow experimentation loop is one reason not even AGI can speed things up everytime.

2

Maksitaxi t1_j7ojz2i wrote

How do you think proto-AGI would look like?

5

turnip_burrito t1_j7okin2 wrote

Probably a box.

Maybe painted black.

And able to understand enough concepts to write improved versions of some of its own code of we asked it to.

Maybe can write some new math proofs in a short and human readable way.

Maybe multimodal.

Large short term memory context window.

Able to update its model in real time for incoming new information.

Maybe running on more specialized hardware, or neuromorphic chips.

19

ecnecn t1_j7pfwdr wrote

>Probably a box.
>
>Maybe painted black.

hey, you signed a Non-Disclosure Agreement on this!

3

kaleNhearty t1_j7purfu wrote

Generative transformer models are not AGI, not even close. We're going to have to come up with some new methodology to handle multi-modality or maybe some kind of synthesis between several different models until we see some kind of Proto-AGI and that's decades away IMO.

2

Hands0L0 t1_j7pdcrw wrote

Not until we understand our own brains

0