red75prime t1_iz1eaeh wrote

Integration of long-term memory and transformers. It will make it possible to shrink the core transformer network. So a GATO successor will advance from slow robotic control to OK-ish robotic control: it will drop your bottle of beer with 1-5% probability instead of 20% (or so) now. No, still not AGI, as it will have limited lifelong learning (if any).
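
To make the memory idea concrete, here is a minimal PyTorch sketch of a transformer block that reads from an external long-term memory via cross-attention; everything here (class names, layer sizes, the assumption that memory slots are retrieved elsewhere) is hypothetical, not GATO's actual architecture:

```python
import torch
import torch.nn as nn

class MemoryAugmentedBlock(nn.Module):
    """Hypothetical transformer block that cross-attends into an
    external long-term memory, letting the core network stay smaller."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mem_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model)
        )
        self.norm1, self.norm2, self.norm3 = (nn.LayerNorm(d_model) for _ in range(3))

    def forward(self, x: torch.Tensor, memory: torch.Tensor) -> torch.Tensor:
        # Ordinary self-attention over the current context window.
        h = self.norm1(x)
        x = x + self.self_attn(h, h, h)[0]
        # Cross-attention into retrieved long-term memory slots.
        x = x + self.mem_attn(self.norm2(x), memory, memory)[0]
        return x + self.ff(self.norm3(x))
```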

GPT-4 will be more of everything: better general knowledge, longer coherence, fewer hallucinations, better code generation, better translation, improved logical reasoning (more so with a "let's think step by step" prompt), and so on and so forth. All in all, a great evolutionary development over GPT-3 and ChatGPT, but no revolution yet.
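
The prompting trick mentioned above is zero-shot chain-of-thought (Kojima et al., 2022); a minimal sketch, where `complete` is just a stand-in for whatever text-completion API call one happens to use:

```python
def answer_with_reasoning(question: str, complete) -> str:
    # Appending the trigger phrase nudges the model to emit intermediate
    # reasoning steps before its final answer, which tends to improve
    # accuracy on multi-step logical and arithmetic problems.
    prompt = f"Q: {question}\nA: Let's think step by step."
    return complete(prompt)
```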

Generative models will continue to improve. I wouldn't expect high-quality, high-resolution, non-trippy video in 2023, though. Maybe we'll get decent temporal consistency for a limited set of subjects the model was specifically pretrained on. Music synthesis probably won't advance much (due to the expected backlash from music labels).

Neural networks based on neural differential equations may give rise to more dexterous, faster-to-train robots, but the range of tasks they can perform will be limited.
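
For reference, the core of a neural ODE is a learned vector field integrated by a differentiable solver; here is a minimal sketch using the torchdiffeq library (the 8-dimensional "robot state" is made up for illustration):

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq

class Dynamics(nn.Module):
    """Learned vector field f(t, y) defining dy/dt."""
    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, t, y):
        return self.net(y)

# Integrate the learned dynamics from a start state; gradients flow
# through the solver, so the trajectory is trainable end to end.
f = Dynamics(dim=8)
y0 = torch.zeros(1, 8)                  # hypothetical robot state
t = torch.linspace(0.0, 1.0, steps=20)  # time points to evaluate
trajectory = odeint(f, y0, t)           # shape: (20, 1, 8)
```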

Maybe we'll see large language models with an "internal monologue" module. I can't predict their capabilities, or whether researchers will be comfortable going in this direction, as it gets dangerously close to "self-aware" territory with all of its dangers and ethical problems.
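
One way to picture such a module is a hidden scratchpad the model writes to before answering; a toy sketch, with the tags and the `complete` helper being hypothetical inventions for illustration:

```python
def respond(user_message: str, complete) -> str:
    # Hidden scratchpad: private reasoning, never shown to the user.
    monologue = complete(
        f"User: {user_message}\n"
        "Think privately about how to respond:\n<monologue>"
    )
    # Final answer conditioned on the private reasoning.
    return complete(
        f"User: {user_message}\n"
        f"<monologue>{monologue}</monologue>\n"
        "Reply to the user:"
    )
```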
