2lazy2buy t1_jcaary6 wrote
Reply to [D] Simple Questions Thread by AutoModerator
How does one achieve long context lengths for LLMs? ChatGPT has a context length of 32k? Is the transformer decoder "just" that big?