The biggest constraint to achieving true AGI, IMO, is going to be compute resources and the cost of running these massive models. GPT-3 is really impressive technology, but it's also still very limited and nowhere close to true AGI. And it's currently prohibitively expensive and resource-intensive to roll out at scale. A true AGI is going to be orders of magnitude more expensive and resource-intensive.
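To put rough numbers on that (a back-of-envelope sketch, not real pricing: the 175B parameter count is GPT-3's published size, but the GPU throughput and hourly rate below are illustrative assumptions I picked, not quoted specs):

```python
# Back-of-envelope inference cost for a dense transformer.
# Rule of thumb: the forward pass costs ~2 FLOPs per parameter per token.
PARAMS = 175e9              # GPT-3's published parameter count
FLOPS_PER_TOKEN = 2 * PARAMS

# Both values below are assumptions for illustration, not real specs/prices.
GPU_FLOPS = 100e12          # assumed sustained throughput per GPU (~100 TFLOP/s)
GPU_DOLLARS_PER_HOUR = 2.0  # assumed cloud price per GPU-hour

tokens_per_second = GPU_FLOPS / FLOPS_PER_TOKEN
cost_per_million_tokens = 1e6 / tokens_per_second / 3600 * GPU_DOLLARS_PER_HOUR

print(f"~{tokens_per_second:.0f} tokens/s per GPU")                    # ~286
print(f"~${cost_per_million_tokens:.2f} per 1M tokens, compute only")  # ~$1.94
```

Even with those generous assumptions, serving billions of tokens a day takes a serious GPU fleet - and a model 10x or 100x the size multiplies the bill accordingly.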
The first big breakthroughs in neural networks happened in the '60s and '70s. But deploying those models at scale was impossible given the compute resources of the time - it wasn't until around 2010/2011 that GPUs became fast enough to train deep learning models at scale.
I don't think it's going to take another 40-50 years for compute to catch up, but the fact of the matter is, scaling these models won't be as simple as "spin up more compute." There's always a balance between software and hardware - and the physical world (chip fabrication, power, cooling) is always going to be a limiting factor for hardware.
I wouldn't be surprised if we see another "AI winter" - one where the research and the software exist, but the hardware constrains our ability to actually get to full AGI. The good news is that AI as it stands today, even without AGI, is really damn useful - and people are finding new and innovative ways to create value with what we already have. So it won't be a full-on "winter" for AI, just a stagnation in our ability to deploy newer, more powerful models at scale.