Tokenizer doesn't exist
#19
by vegarab - opened
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "mistralai/Mistral-7B-v0.3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
The example provided in the README yields the following error:

OSError: Can't load tokenizer for 'mistralai/Mistral-7B-v0.3'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'mistralai/Mistral-7B-v0.3' is the correct path to a directory containing all relevant files for a LlamaTokenizerFast tokenizer.
Can you please provide updated instructions for how to access the model's tokenizer?
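For what it's worth, two possible causes seem worth ruling out, though neither is confirmed in this thread: the repository may be gated, in which case the download needs an authenticated token with access to it, and an older transformers/tokenizers install may not recognize the tokenizer files in this repo. A minimal sketch of both checks, assuming those causes (the token value is a placeholder):

# Assumption: the OSError comes from missing authentication or an outdated install.
# Step 1 (shell): upgrade the libraries, e.g. pip install -U transformers tokenizers
# Step 2: authenticate before loading, in case the repo requires accepting its terms.
from huggingface_hub import login
from transformers import AutoTokenizer

login()  # or login(token="hf_...") with a token that has access to the repo

model_id = "mistralai/Mistral-7B-v0.3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
print(tokenizer("Hello, world!"))  # quick sanity check that the tokenizer loads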