Can't load the model on ooba

#1
by Nicopara - opened

Traceback (most recent call last):
  File "server.py", line 70, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "/home/rocminsanity/Desktop/machinelearning/text-generation-webui-main/modules/models.py", line 103, in load_model
    tokenizer = load_tokenizer(model_name, model)
  File "/home/rocminsanity/Desktop/machinelearning/text-generation-webui-main/modules/models.py", line 128, in load_tokenizer
    tokenizer = LlamaTokenizer.from_pretrained(Path(f"{shared.args.model_dir}/{model_name}/"), clean_up_tokenization_spaces=True)
  File "/home/rocminsanity/miniconda3/envs/py3k/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1796, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'models/baize-v2-13b-GPTQ'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'models/baize-v2-13b-GPTQ' is the correct path to a directory containing all relevant files for a LlamaTokenizer tokenizer.

One moment, I am checking

I just double checked in ooba and had no problems at all, following the instructions in the README.

Please try the ooba model download again in case you are missing some files. It won't re-download any file unless it's corrupted or missing.

Oh, looking at the log again, I think you downloaded manually? Otherwise the folder would be called models/TheBloke_Project-Baize-v2-13B-GPTQ. So yeah, you must have missed some files, or you started downloading while I was still uploading; not sure.

Anyway, repeat download and make sure all files are present.
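If you'd rather verify than re-download blindly, a quick sketch like this lists which tokenizer files are absent from the model folder. The expected file list is an assumption on my part; exactly which files a repo ships can vary, but tokenizer.model is the one LlamaTokenizer can't do without.

```python
from pathlib import Path

# Tokenizer files a Llama GPTQ repo typically ships. The exact set varies by
# repo (an assumption here), but tokenizer.model (the SentencePiece model)
# is the essential one for LlamaTokenizer.
EXPECTED = ["tokenizer.model", "tokenizer_config.json", "special_tokens_map.json"]

def missing_tokenizer_files(model_dir: str) -> list[str]:
    """Return the expected tokenizer files that are absent from model_dir."""
    d = Path(model_dir)
    return [name for name in EXPECTED if not (d / name).is_file()]

# Example usage:
# missing_tokenizer_files("models/baize-v2-13b-GPTQ")
```

If it returns an empty list, the tokenizer files are there and the problem is elsewhere.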

I downloaded using git clone, but it didn't catch all files. Thank you.
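Incidentally, the usual reason a git clone of a Hugging Face repo "misses" files is that git-lfs wasn't active, which leaves small text pointer stubs where the real weights should be. Here's a self-contained sketch of how to recognize one; the filename is made up for the demo.

```shell
set -e
tmp=$(mktemp -d)
# Simulate what a clone without git-lfs leaves behind: instead of the
# multi-GB weight file, a small text "pointer" stub like this one.
printf 'version https://git-lfs.github.com/spec/v1\noid sha256:0000\nsize 7259435008\n' \
  > "$tmp/model.safetensors"   # hypothetical filename for the demo
# Real weights are large binary files; an LFS stub starts with this header.
if head -c 40 "$tmp/model.safetensors" | grep -q '^version https://git-lfs'; then
  echo "pointer stub - the real file was never downloaded"
fi
```

If your cloned model folder contains tiny files starting with that `version https://git-lfs...` line, the weights themselves were never fetched.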

Nicopara changed discussion status to closed

No worries. For the future, I recommend using the text-generation-webui downloader, which downloads over HTTP. You can either use it from the text-gen UI (bottom of Models page), or on the command line like this:

mkdir models
python /path/to/text-generation-webui/download-model.py TheBloke/Project-Baize-v2-13B-GPTQ --threads 2

When you use git clone, every file is stored twice (once in .git and once in the working tree), so the clone takes longer and uses double the disk space. It also doesn't give useful progress information, such as the true download speed in MB/s.
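The double storage is easy to see with a throwaway repo (all paths here are temporary and made up for the demo): git keeps a full copy of each committed file inside .git, alongside the checkout.

```shell
set -e
tmp=$(mktemp -d)
git init -q "$tmp/repo"
# A 1 MiB incompressible stand-in for a model weight file.
head -c 1048576 /dev/urandom > "$tmp/repo/weights.bin"
git -C "$tmp/repo" add weights.bin
git -C "$tmp/repo" -c user.email=demo@example.com -c user.name=demo \
  commit -qm "add weights"
# The committed copy lives in .git as well as in the working tree,
# so disk usage is roughly 2x the file size.
du -sk "$tmp/repo/.git" "$tmp/repo/weights.bin"
```

With multi-GB model files, that 2x overhead adds up quickly, which is why a plain HTTP download is preferable.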

I find it much quicker and easier to use than git.
