Using Ctransformers for inference gives an error
#2 · opened by cx819
Using Ctransformers for inference gives the below error:
RuntimeError: Failed to create LLM 'llama' from 'C:\Users\22279.cache\huggingface\hub\models--SanctumAI--Meta-Llama-3-8B-Instruct-GGUF\snapshots\f688151a21ac4496648f183682ac25772b110658\meta-llama-3-8b-instruct.Q8_0.gguf'.
It actually downloaded the .gguf file from the repo, but then returned this error. What exactly is the reason?
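One common cause of this kind of "Failed to create LLM" error is a version mismatch: older ctransformers builds bundle an older llama.cpp that cannot load newer GGUF revisions or newer tokenizer metadata. This is an assumption about the cause here, not a confirmed diagnosis. As a first check, you can read the GGUF header of the downloaded file to see which format version it uses; the sketch below parses only the 4-byte magic and the version field, and the path in the usage comment is a shortened, hypothetical stand-in for the real snapshot path.

```python
# Minimal sketch: read the GGUF format version from a model file's header.
# A GGUF file starts with the 4-byte magic b"GGUF" followed by a
# little-endian uint32 format version.
import struct

def read_gguf_version(path):
    """Return the GGUF format version, or raise ValueError if not a GGUF file."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError(f"not a GGUF file (magic={magic!r})")
        (version,) = struct.unpack("<I", f.read(4))
    return version

# Usage (hypothetical local path; substitute the real snapshot path):
# print(read_gguf_version(r"meta-llama-3-8b-instruct.Q8_0.gguf"))
```

If the reported version is newer than what your installed ctransformers supports, upgrading ctransformers (or switching to llama.cpp / llama-cpp-python, which track GGUF changes more closely) would be the next step to try.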
Hey @cx819, sorry for the late response. I don't use ctransformers, but I've updated the model since then, so it may work now (if you still need this model). It works at least in llama.cpp and in Sanctum, so it should work in ctransformers as well. Let me know if you need any help.
AndreyBest changed discussion status to closed
AndreyBest changed discussion status to open