error in ollama
#1 opened by Roronin
sfr-iterative-dpo-llama-3-8b-r.IQ4_NL.gguf didn't start on ollama. Running

```
ollama create llama3-dpo -f Modelfile
```

returns:

```
transferring model data
Error: invalid file magic
```

but sfr-iterative-dpo-llama-3-8b-r.Q4_K_M.gguf works!
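For reproduction, the setup is roughly the usual one-line Modelfile pointing at the local GGUF; this is a sketch of that shape rather than the exact file contents:

```
# hypothetical minimal Modelfile: a single FROM line pointing at the downloaded GGUF
echo "FROM ./sfr-iterative-dpo-llama-3-8b-r.IQ4_NL.gguf" > Modelfile

# this is the step that fails with "invalid file magic"
ollama create llama3-dpo -f Modelfile
```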
@Roronin hey there, thanks for the heads up!
I created these from b2879 in llama.cpp, but I don't immediately see any recent changes that might've caused this. I'll investigate further and let you know if I find anything.
I tested inference with the file against that same version of llama.cpp and it seems to work fine. Can you share any more detail about the Modelfile you're using? I'm not super familiar with ollama's format or what it encodes.
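In the meantime, one quick sanity check on your side (just a guess at the likely cause, not a diagnosis): a valid GGUF file starts with the 4 ASCII bytes `GGUF`, and "invalid file magic" generally means that header check failed, so it's worth ruling out a truncated or corrupted download:

```
# a healthy GGUF file begins with the 4-byte magic "GGUF";
# anything else suggests the download is truncated or corrupted
head -c 4 sfr-iterative-dpo-llama-3-8b-r.IQ4_NL.gguf && echo
```

Comparing the file size against what the repo page lists is another quick check.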
@Roronin Thanks for the edit! I'll look further into the IQ4_NL (and perhaps other I-quant) variants!