
OSError: TheBloke/dolphin-2.0-mistral-7B-GGUF does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.

#2 opened by ErnestBS

Hello,

I'm getting this error when trying to load the model:
OSError: TheBloke/dolphin-2.0-mistral-7B-GGUF does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.

This is the offending line:
model = AutoModelForCausalLM.from_pretrained("TheBloke/dolphin-2.0-mistral-7B-GGUF", model_file="dolphin-2.0-mistral-7b.Q4_K_M.gguf", model_type="mistral", gpu_layers=50)

I tried passing the access token, but the error is the same.
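
For reference, here is a minimal sketch of what I believe the intended call looks like. The keyword arguments model_file, model_type and gpu_layers belong to the ctransformers AutoModelForCausalLM API rather than the transformers one, so this assumes ctransformers is installed and imported instead:

# Minimal sketch (assumption): using ctransformers, whose from_pretrained
# accepts model_file / model_type / gpu_layers for GGUF files.
from ctransformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "TheBloke/dolphin-2.0-mistral-7B-GGUF",
    model_file="dolphin-2.0-mistral-7b.Q4_K_M.gguf",
    model_type="mistral",
    gpu_layers=50,
)

# ctransformers models are callable and return generated text directly.
print(model("Hello, my name is"))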
