
polawiaczperel t1_jed1e9h wrote

I was playing with LLaMA 7B, 13B, 30B, 65B, and Alpaca 30B (native and LoRA), but this seems to be much better, and it is only 13B. Nice! Will they share the weights?


pasr9 t1_jefqoii wrote

I'm more interested in them releasing the dataset used to fine-tune it.