
icedrift t1_j15lr43 wrote

Any model north of ~2B parameters isn't worth the hassle of running locally. Even Ada-sized models need a ton of VRAM and RAM. Use a cloud compute service like Azure or Paperspace instead.
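
A rough back-of-envelope sketch of why this is the case, assuming fp16 weights (2 bytes per parameter), ~20% extra for activations/KV cache at inference, and treating "Ada-sized" as roughly 350M parameters (a common community estimate, not an official figure):

```python
# Rough inference-time VRAM estimate from parameter count.
# Assumptions: fp16 weights (2 bytes/param), ~20% overhead for
# activations / KV cache. Real usage varies by framework and batch size.

def estimate_vram_gib(n_params: float, bytes_per_param: int = 2, overhead: float = 0.2) -> float:
    """Approximate GiB of VRAM needed just to serve the model."""
    weight_bytes = n_params * bytes_per_param
    return weight_bytes * (1 + overhead) / 1024**3

for label, n in [("Ada-scale (~350M)", 350e6), ("2B", 2e9), ("6B", 6e9)]:
    print(f"{label:>18}: ~{estimate_vram_gib(n):.1f} GiB VRAM")
```

Even under these optimistic assumptions a 2B model wants ~4-5 GiB of VRAM for weights alone, and that climbs quickly with context length and batch size, which is why renting cloud GPUs is usually the cheaper path.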

1