
timedacorn369 t1_jczscaf wrote

It's described as open source. So that means I can get the model weights and run it locally if I want to, right?

29

pixiegirl417 OP t1_jd04sxc wrote

36

BayesMind t1_jd8ps8g wrote

Is there an example script somewhere for how to run this? All I've seen is the heavy inference server example in the repo.

1

pixiegirl417 OP t1_jd8s4nc wrote

I haven't tried running it locally since I don't have the required hardware, and I haven't looked into how to do it.

However, you can check my GitHub if you want to try the server-attached inference API (I know it may not be what you're looking for).
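
If you do have the hardware, and assuming the weights are published on the Hugging Face Hub, a minimal local-inference script would look something like the sketch below. It's untested on my end; `org/model-name` is a placeholder, so substitute the actual model ID from the repo, and note that `device_map="auto"` needs the `accelerate` package installed.

```python
# Untested sketch: assumes the weights are on the Hugging Face Hub and
# that you have a GPU with enough VRAM (or enough RAM for CPU inference).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "org/model-name"  # placeholder; use the real model ID from the repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves memory use vs. float32
    device_map="auto",          # spreads layers across available devices (needs accelerate)
)

prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For a 12B-class model this needs roughly 24 GB of VRAM in float16, so quantized or CPU-offloaded setups are the usual fallback on consumer hardware.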

1