
audioen t1_jduat9o wrote

It contains a bunch of the requisite torrent links, though much of the information is disorganized and multipurpose: do this if you want that, use these if you are on Windows but these if on Linux, and so forth. It is a mess.

I have llama.cpp built on my Linux laptop. I got some of these quantized model files and installed the bunch of Python libraries required to run the conversions from the various formats into what llama.cpp can eat (model files whose names start with ggml- and end with .bin). I think it takes some degree of technical expertise right now if you do it by hand, though there are probably prebuilt software packages available by now.
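For reference, the by-hand workflow described above looks roughly like the sketch below. The repository URL and the build/quantize commands come from the llama.cpp project, but the conversion script's name and the exact quantize arguments have changed between versions, and the model paths here are placeholders:

```shell
# 1. Build llama.cpp from source (needs git, make, and a C/C++ compiler)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# 2. Install the Python libraries the conversion scripts depend on
python3 -m pip install numpy sentencepiece

# 3. Convert the original model weights into the ggml format
#    (assumes the weights already sit in ./models/7B; the script name
#    varies by version, e.g. convert-pth-to-ggml.py or convert.py)
python3 convert-pth-to-ggml.py models/7B/ 1

# 4. Optionally quantize to 4 bits to shrink memory use
./quantize models/7B/ggml-model-f16.bin models/7B/ggml-model-q4_0.bin q4_0

# 5. Run inference locally, fully offline
./main -m models/7B/ggml-model-q4_0.bin -p "Hello, llama"
```

The result of step 3/4 is exactly the kind of file mentioned above: a name starting with ggml- and ending in .bin, which llama.cpp can load directly.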


AnOnlineHandle t1_jdunceg wrote

Do you know of anywhere to see examples of what those local models are capable of doing?


Puzzleheaded_Acadia1 t1_jdvrrty wrote

PLEASE tell me how to set it up on Ubuntu. I tried every YouTube video and website but I did not find anything for it, pls help. Do you have to download Python libraries for it to work? And do I need an IDE? I saw some YouTubers do a lot of code in Google Colab, and no, I don't want to run it on Google Colab; I want something offline that can provide me with information and that I can experiment with.