Is this supposed to be usable with llama.cpp?
#1 opened by spanielrassler
Hi,
I'm wondering if I'm on the wrong track by trying to use this with llama.cpp. I get the following error when trying:
error loading model: unexpectedly reached end of file
llama_init_from_file: failed to load model
main: error: failed to load model
Any suggestions, or is this model just not compatible? If not, do you know of a Pyg one that is? Thanks!
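For anyone debugging the same thing: the "unexpectedly reached end of file" message often points to a header/format mismatch rather than a truncated download, so peeking at the first few bytes can show which GGML revision the file was converted for. Below is a minimal Python sketch, not an official tool; the magic-to-name table is based on what I recall of llama.cpp's loader constants, and the file name is a placeholder, so double-check against the llama.cpp version you built.

```python
import struct
import sys

# Known GGML-family magics as little-endian uint32s (from memory of
# llama.cpp's loader -- verify against the source you compiled).
KNOWN_MAGICS = {
    0x67676D6C: "ggml (unversioned, very old)",
    0x67676D66: "ggmf (versioned)",
    0x67676A74: "ggjt (mmap-able, later GGML-era llama.cpp)",
    0x46554747: "gguf (current llama.cpp format)",
}

def inspect(path: str) -> None:
    """Print the magic and the following u32 of a model file header."""
    with open(path, "rb") as f:
        raw = f.read(8)
    if len(raw) < 8:
        print("file too short to contain a GGML/GGUF header")
        return
    magic, next_u32 = struct.unpack("<II", raw)
    print(f"magic:    0x{magic:08x} -> {KNOWN_MAGICS.get(magic, 'unknown')}")
    # For ggmf/ggjt/gguf the next u32 is a format version; for the old
    # unversioned 'ggml' magic it already belongs to the hyperparameters.
    print(f"next u32: {next_u32}")

if __name__ == "__main__":
    # Placeholder path -- point it at the downloaded .bin file.
    inspect(sys.argv[1] if len(sys.argv) > 1 else "pygmalion-model.bin")
```

If the magic is unknown or the version is higher than what your llama.cpp build supports, the loader bails out with exactly this kind of end-of-file error.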
spanielrassler changed discussion status to closed
alpindale changed discussion status to open
I was wondering why the weights depend on koboldcpp. How does the conversion differ?