Not able to download the model

#2
by jilijeanlouis - opened
```
OSError: Can't load tokenizer for 'PrunaAI/cognitivecomputations-dolphin-2.9-llama3-8b-256k-AWQ-4bit-smashed'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'PrunaAI/cognitivecomputations-dolphin-2.9-llama3-8b-256k-AWQ-4bit-smashed' is the correct path to a directory containing all relevant files for a LlamaTokenizerFast tokenizer.
```
Pruna AI org

This happens because the compressed model should use the tokenizer of the base model. For convenience, we have added the base model's tokenizer to this repo, so you should now be able to use `AutoTokenizer.from_pretrained("PrunaAI/cognitivecomputations-dolphin-2.9-llama3-8b-256k-AWQ-4bit-smashed")`. We will propagate this change to the other relevant LLM repos.
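A minimal sketch of loading the tokenizer now that it ships with the compressed repo (this assumes `transformers` is installed and that the repo download succeeds; the `trust_remote_code` flag is not needed here):

```python
from transformers import AutoTokenizer

# The compressed repo now bundles the base model's tokenizer files,
# so loading directly from the PrunaAI repo id should work.
repo_id = "PrunaAI/cognitivecomputations-dolphin-2.9-llama3-8b-256k-AWQ-4bit-smashed"
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# Quick sanity check: encode and decode a short string.
ids = tokenizer("Hello, world!").input_ids
print(tokenizer.decode(ids, skip_special_tokens=True))
```

If you are on an older snapshot of the repo without the tokenizer files, pointing `AutoTokenizer.from_pretrained` at the base model's repo instead is the usual workaround.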

sharpenb changed discussion status to closed
