
ReasonablyBadass t1_itts24o wrote

Current transformer architecture may need a few more tweaks for AGI to work, but I'd say it's close already.

6

porcenat_k t1_itu9urc wrote

Indeed. The tweaks I'd point to are continual learning and longer short-term memory, both of which are active research subfields. Beyond that, all that's left is scaling model size, which I consider far more important than scaling data. Human beings understand basic concepts and don't need to read the entire internet for that, because we evolved bigger brains.

5

ReasonablyBadass t1_itubdlz wrote

>Human beings understand basic concepts and don’t need to read the entire internet for that.

We have years of training data, delivered through multiple high-bandwidth input channels, before we reach that level though.

5