HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/LoneStriker/FusionNet_7Bx2_MoE_14B-4.0bpw-h6-exl2/resolve/main/model-00001-of-00003.safetensors
The above exception was the direct cause of the following exception:
EntryNotFoundError Traceback (most recent call last)
EntryNotFoundError: 404 Client Error. (Request ID: Root=...)
Entry Not Found for url: https://huggingface.co/LoneStriker/FusionNet_7Bx2_MoE_14B-4.0bpw-h6-exl2/resolve/main/model-00001-of-00003.safetensors.
Download this model using ooba, git, or the huggingface-cli command. I have no idea what you're running or why it's getting that error.
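For example, here is a minimal sketch that pulls the whole repo with the huggingface_hub Python library (the local directory is just a placeholder):

```python
from huggingface_hub import snapshot_download

# Download every file in the repo (exl2 weights, config, tokenizer)
# to a local folder. The local_dir path here is only an example.
snapshot_download(
    repo_id="LoneStriker/FusionNet_7Bx2_MoE_14B-4.0bpw-h6-exl2",
    local_dir="models/FusionNet_7Bx2_MoE_14B-4.0bpw-h6-exl2",
)
```

Note that this downloads the repo as it actually exists, so it won't 404 on shard names the repo never had.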
FYI, I was using https://github.com/huggingface/transformers...
You can't use transformers to load exl2-quantized models; you have to use exllamav2. You can use the project directly, or use ooba's text-generation-webui to load these models.
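If you use the project directly, a minimal loading sketch looks roughly like the inference example in the exllamav2 repo (exact class names can vary between versions, and the model path is a placeholder):

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point at the directory the exl2 repo was downloaded to (example path).
config = ExLlamaV2Config()
config.model_dir = "models/FusionNet_7Bx2_MoE_14B-4.0bpw-h6-exl2"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split the weights across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8

print(generator.generate_simple("Hello, my name is", settings, 128))
```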
Thanks, I was able to load it in Colab with exllamav2 after all. Now I just have to set it up with ooba's text-generation-webui.