not_particulary t1_jd51f0h wrote

There's a lot coming up. I'm looking into it right now; here's a tutorial I found:

https://medium.com/@martin-thissen/llama-alpaca-chatgpt-on-your-local-computer-tutorial-17adda704c23
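If you just want the gist before reading the whole thing: it boils down to loading converted weights through the transformers library. A rough sketch, not the tutorial's exact code (the model path is a placeholder, and a 7B model in fp16 still needs roughly 14 GB of memory, so smaller GPUs will want quantization on top of this):

```python
# Minimal sketch, assuming you already have LLaMA/Alpaca weights converted
# to Hugging Face format. The model path below is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/llama-7b-hf"  # placeholder: point at your converted weights

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # fp16 roughly halves memory vs. fp32
    device_map="auto",          # lets accelerate spread layers across GPU/CPU
)

prompt = "Explain in one sentence what an instruction-tuned model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```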


Here's something unique: Multimodal-CoT, a model under 1B parameters that outperforms GPT-3.5 on the ScienceQA benchmark. It's multimodal and built on T5, which is much more runnable on consumer hardware.

https://arxiv.org/abs/2302.00923
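To put the size in perspective, here's a stock FLAN-T5 base checkpoint (not the paper's own weights, just an illustration of the T5 family's footprint) running comfortably on a CPU:

```python
# Not the paper's checkpoint -- a plain FLAN-T5 base model, shown only to
# illustrate how small T5-class models are compared to GPT-3.5-class ones.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")  # ~250M, vs. 175B for GPT-3-class models

inputs = tokenizer("Why does ice float on water?", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```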
