Submitted by AutoModerator t3_11pgj86 in MachineLearning
LowPressureUsername t1_jdq0nsn wrote
Reply to comment by yaru22 in [D] Simple Questions Thread by AutoModerator
It’s mostly the computational power available, AFAIK. More context = more tokens = more processing power required, since self-attention cost grows quadratically with sequence length rather than linearly.
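A back-of-the-envelope sketch of that scaling (hypothetical hidden size, counting only the two big attention matmuls and ignoring heads, projections, and the MLP):

```python
def attention_flops(seq_len: int, d_model: int = 1024) -> int:
    """Approximate FLOPs for one self-attention layer's core matmuls."""
    # Q @ K^T: (seq_len x d_model) @ (d_model x seq_len) -> ~2 * seq_len^2 * d_model FLOPs
    scores = 2 * seq_len * seq_len * d_model
    # softmax(scores) @ V: (seq_len x seq_len) @ (seq_len x d_model) -> same order
    values = 2 * seq_len * seq_len * d_model
    return scores + values

# Doubling the context quadruples the attention cost:
print(attention_flops(2048) / attention_flops(1024))  # -> 4.0
```

So a 2x longer context window costs roughly 4x the attention compute (and similarly more memory for the score matrix), which is why context length is usually capped by hardware budget rather than by anything in the weights themselves.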
yaru22 t1_jdron1b wrote
So it's not an inherent limitation tied to the number of parameters the model has? Or is that what you meant by more processing power? Do you, or does anyone else, have pointers to papers that discuss this?