
No tokenizer available?

#10
by dspyrhsu - opened

Hi there, I am using ctransformers and I am creating the GGUF model like this:

from ctransformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "TheBloke/zephyr-7B-beta-GGUF",
    hf=True)

After that, I would like to create the corresponding tokenizer like this:

tokenizer = AutoTokenizer.from_pretrained(model)

However, this raises a "not implemented" error. How can I specify a tokenizer for this model?

Best regards and thanks for the great work!

Is it not possible to use the tokenizer from the non-GGUF version?
HuggingFaceH4/zephyr-7b-beta
