How did you create this GGUF? #2

by ymcki - opened

I ran llama.cpp's convert_hf_to_gguf.py, but it says DeciLMForCausalLM is not supported.

#python3 convert_hf_to_gguf.py ~/DeciLM-7B-Instruct/ --outfile ~/DeciLM-7B-Instruct.f16.gguf --outtype f16
INFO:hf-to-gguf:Loading model: DeciLM-7B-Instruct
ERROR:hf-to-gguf:Model DeciLMForCausalLM is not supported
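
That error usually means the conversion script has no converter registered for the architecture name listed in the checkpoint's config.json. As a minimal sketch (the model path is taken from the command above), this is one way to confirm which architecture string the model declares:

# Minimal sketch: read config.json to see which architecture name the
# checkpoint declares; the converter matches on this string.
import json
from pathlib import Path

model_dir = Path.home() / "DeciLM-7B-Instruct"  # same path as in the command above
config = json.loads((model_dir / "config.json").read_text())
print(config.get("architectures"))  # expected: ['DeciLMForCausalLM']

If the printed name is one the script doesn't know, the fix is support for that architecture in llama.cpp (either an updated version that includes it or an added converter class), not different command-line flags.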
