OSError: Can't load tokenizer for 'google/gemma-2b'

#25
by Keely0419 - opened

Running tokenizer = AutoTokenizer.from_pretrained('google/gemma-2b') fails with OSError: Can't load tokenizer for 'google/gemma-2b'.
Strangely, tokenizer = AutoTokenizer.from_pretrained('google/gemma-7b') works as expected.

I've updated transformers to the latest version (transformers-4.39.0.dev0). I've also tried transformers-4.38.1 and still get the same error.


Hi @Keely0419
The model is gated. Once you have been granted access, you need to log in with your HF account (e.g. via huggingface-cli login) and then load the tokenizer / model.
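A minimal sketch of that flow, assuming access to google/gemma-2b has already been granted on the model page and a User Access Token has been created (the token string below is a placeholder):

```python
from huggingface_hub import login
from transformers import AutoTokenizer

# Authenticate this environment; equivalent to running `huggingface-cli login`
# in a terminal. Replace the placeholder with your own token.
login(token="hf_xxx")

# With a valid, authorized token the gated repo can now be resolved.
tokenizer = AutoTokenizer.from_pretrained("google/gemma-2b")
print(tokenizer("Hello, Gemma!"))
```

Alternatively, run huggingface-cli login once in a terminal and the cached token will be picked up automatically by from_pretrained.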

Keely0419 changed discussion status to closed
