I can't get it running in text-generation-webui
I tried to follow all the instructions carefully and downloaded the model using text-generation-webui's "Model" tab, but I still get the following error:
Traceback (most recent call last):
  File "/mnt/models/text-generation-webui/modules/ui_model_menu.py", line 237, in download_model_wrapper
    model, branch = downloader.sanitize_model_and_branch_names(repo_id, None)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/models/text-generation-webui/download-model.py", line 39, in sanitize_model_and_branch_names
    if model[-1] == '/':
       ~~~~~^^^^
IndexError: string index out of range
Any thoughts?
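For context: the crash happens in download-model.py before anything is downloaded. If the model/repo text box in the "Model" tab is empty (or only whitespace), `repo_id` arrives as an empty string and `model[-1]` indexes an empty string. Below is a minimal sketch of that check with a defensive guard added; the guard is my suggestion for illustration, not the upstream code.

```python
def sanitize_model_and_branch_names(model: str, branch: str | None):
    # Reproduces the failing check from the traceback: an empty model name
    # makes model[-1] raise IndexError before any download starts.
    if not model:  # hedged guard -- not present in the upstream script
        raise ValueError(
            "Model field is empty; paste a repo id such as "
            "'TheBloke/Mixtral-8x7B-Instruct-v0.1-GPTQ' before clicking Download."
        )
    if model[-1] == '/':
        model = model[:-1]
    if branch is None:
        branch = "main"
    return model, branch
```

In practice this usually just means the download box was empty when the Download button was clicked; re-entering the full repo id avoids the error.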
same here
Getting this error:
23:25:45-094355 INFO Loading TheBloke_Mixtral-8x7B-Instruct-v0.1-GPTQ
23:25:45-609266 ERROR Failed to load the model.
Traceback (most recent call last):
  File "E:\text-generation-webui-main\text-generation-webui-main\modules\ui_model_menu.py", line 214, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\text-generation-webui-main\modules\models.py", line 90, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\text-generation-webui-main\modules\models.py", line 399, in ExLlama_HF_loader
    return ExllamaHF.from_pretrained(model_name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\text-generation-webui-main\modules\exllama_hf.py", line 174, in from_pretrained
    return ExllamaHF(config)
           ^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\text-generation-webui-main\modules\exllama_hf.py", line 31, in __init__
    self.ex_model = ExLlama(self.ex_config)
                    ^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\exllama\model.py", line 753, in __init__
    decoder_size += math.prod(shape) * _layer_dtype_size(key)
                                       ^^^^^^^^^^^^^^^^^^^^^^
  File "E:\text-generation-webui-main\text-generation-webui-main\installer_files\env\Lib\site-packages\exllama\model.py", line 716, in _layer_dtype_size
    raise ValueError("Unrecognized layer: " + key)
ValueError: Unrecognized layer: model.layers.0.block_sparse_moe.experts.0.w1.bias
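This error comes from the legacy ExLlama (v1) backend, which only recognizes the tensor names of a plain Llama checkpoint. The sketch below is illustrative only (not the actual exllama source, and the sublayer list is an assumption) but shows the kind of name dispatch that rejects Mixtral's mixture-of-experts tensors.

```python
# Illustrative sketch of the sizing dispatch that raises "Unrecognized layer".
# The names below are assumptions mirroring a standard Llama decoder block.
KNOWN_LLAMA_SUBLAYERS = (
    "self_attn.q_proj", "self_attn.k_proj", "self_attn.v_proj", "self_attn.o_proj",
    "mlp.gate_proj", "mlp.up_proj", "mlp.down_proj",
    "input_layernorm", "post_attention_layernorm",
)

def _layer_dtype_size(key: str) -> int:
    # Size known tensors; anything outside the plain-Llama layout is rejected.
    if any(sub in key for sub in KNOWN_LLAMA_SUBLAYERS):
        return 2 if key.endswith(".weight") else 4  # fp16 vs. packed GPTQ ints
    raise ValueError("Unrecognized layer: " + key)

# Mixtral stores its experts under keys like
#   model.layers.0.block_sparse_moe.experts.0.w1.*
# which match none of the Llama sublayer names above, hence the crash.
```

The practical upshot is that the ExLlama / ExLlama_HF loader cannot load a Mixtral GPTQ model at all; switching to a loader with Mixtral support (for example ExLlamav2_HF in a current text-generation-webui build, or Transformers) avoids this code path entirely.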
ERROR Failed to load the model.
Traceback (most recent call last):
  File "D:\text-generation-webui-main\modules\ui_model_menu.py", line 213, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\text-generation-webui-main\modules\models.py", line 87, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\text-generation-webui-main\modules\models.py", line 250, in llamacpp_loader
    model, tokenizer = LlamaCppModel.from_pretrained(model_file)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\text-generation-webui-main\modules\llamacpp_model.py", line 63, in from_pretrained
    Llama = llama_cpp_lib().Llama
            ^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'Llama'
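This one means the llama.cpp backend itself is missing: `llama_cpp_lib()` could not import any llama-cpp-python variant and returned None, so the later `.Llama` attribute access blows up. A minimal sketch of that pattern, with assumed module names (not the exact webui code):

```python
# Sketch of a lazy-import helper like the one behind this traceback.
# Module names here are assumptions for illustration.
def llama_cpp_lib():
    for module_name in ("llama_cpp_cuda", "llama_cpp"):
        try:
            return __import__(module_name)
        except ImportError:
            continue
    return None  # nothing importable -> caller crashes on .Llama

lib = llama_cpp_lib()
if lib is None:
    raise RuntimeError(
        "llama-cpp-python is not installed in this environment; "
        "reinstall the webui requirements or pip install llama-cpp-python."
    )
Llama = lib.Llama
```

A quick check is to open the webui's own environment (e.g. via cmd_windows.bat in the one-click installer) and run `python -c "import llama_cpp"`; if that import fails, reinstalling the requirements or the llama-cpp-python wheel for your CUDA version usually fixes it. Also note this GPTQ repo is for GPU loaders; the llama.cpp loader expects a GGUF file.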