
ChronoPsyche t1_itfkn8o wrote

I hope you're right. Truly, it would be amazing if we had text-to-feature-film in 1 to 2 years. I just don't see any reason to think you're right, though.

AI growth comes in spurts and waves. We are in an AI summer right now. What's happening right now will slow down without some additional breakthroughs.

We gotta fix the memory problems we have and until we do, AI will be limited to short-term content generation. Really amazing short-term content generation, but short-term nonetheless.

The memory issue is not trivial. It's not a matter of better hardware. It's a matter of hitting exponential running time limits. We need either a much more efficient algorithm or a quantum computer. I'd presume we will end up finding a better algorithm first, but it hasn't happened yet.


visarga t1_itgu5bi wrote

Not exponential, let's not exaggerate. It's quadratic. If you have a sequence of N words, then you can have NxN pairwise interactions. This blows up pretty fast, at 512 words -> 262K interactions, at 4000 words -> 16M interactions. See why it can't fit more than 4000 tokens? It's that pesky O( N^2 ) complexity.
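The quadratic blow-up above is easy to sanity-check yourself. A minimal sketch (plain Python, numbers matching the comment):

```python
# Why self-attention cost is O(N^2): each of the N tokens attends to
# all N tokens, so the score matrix has N * N pairwise interactions.

def pairwise_interactions(n_tokens: int) -> int:
    return n_tokens * n_tokens

for n in (512, 4000):
    print(f"{n} tokens -> {pairwise_interactions(n):,} interactions")
# 512 tokens -> 262,144 interactions
# 4000 tokens -> 16,000,000 interactions
```

Doubling the context length quadruples the work, which is why context windows can't just be cranked up with more hardware.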

There is a benchmark called "Long Range Arena" where you can check the state of the art on the "memory problem".

https://paperswithcode.com/sota/long-range-modeling-on-lra


ChronoPsyche t1_itgunqx wrote

Exactly what I was referring to. My bad, quadratic is what I meant.
