How to run it locally in VS Code
Hi, how can I run it locally? Is there any guidance somewhere? Thanks
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
.\start_windows.bat
# follow the installation prompts, then open
http://127.0.0.1:7860/
# click on Model and paste the following into the 'Download model or LoRA' field:
WhiteRabbitNeo/WhiteRabbitNeo-13B-v1
# click on the Download button and follow progress in the terminal
Can you help me please? I have this error:
File "/home/gqwe/text-gen-install/text-generation-webui/modules/ui_model_menu.py", line 275, in download_model_wrapper
links, sha256, is_lora, is_llamacpp = downloader.get_download_links_from_huggingface(model, branch, text_only=False, specific_file=specific_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/gqwe/text-gen-install/text-generation-webui/download-model.py", line 88, in get_download_links_from_huggingface
r.raise_for_status()
File "/home/gqwe/text-gen-install/text-generation-webui/installer_files/env/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/api/models/WhiteRabbitNeo/WhiteRabbitNeo-13B-v1/tree/main
The model is now gated, so you'd probably need to log in with the Hugging Face CLI.
Can you try accepting the model on the website? It’s gated now. Let me know if you’re still having issues!
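For reference, the gated-model fix usually looks something like this (a sketch, assuming you have already accepted the model's terms on its Hugging Face page; `huggingface-cli` ships with the `huggingface_hub` package that the web UI installs):

```shell
# log in so downloads can use your Hugging Face access token
huggingface-cli login
# then retry the download from the web UI, or fetch it directly
# with the downloader script bundled with text-generation-webui:
python download-model.py WhiteRabbitNeo/WhiteRabbitNeo-13B-v1
```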
Well, I had that problem; for me it was because I used sudo during installation of the AI.
I have another problem now; maybe someone can help me:
ValueError: The current device_map had weights offloaded to the disk. Please provide an offload_folder for them. Alternatively, make sure you have safetensors installed if the model you are using offers the weights in this format.
How do I set an offload_folder? I have tried:
offload_folder = "~/AI/text-generation-webui/offload",
and it doesn't work for me. Can anyone help?
Your PC doesn't have enough GPU and system RAM available to load the entire model at once.
pip install safetensors
Who is the owner of the 'offload' folder? Check the folder permissions (chmod 755 offload), or simply remove the folder with 'sudo rm -rf offload' and create a new one without using sudo.
Also, instead of ~/, use the full path: /home/mannedk/Ai/text-generation-webui/offload
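The advice above can be sketched in Python. The likely cause is that "~" is not expanded automatically when passed as a plain string, so the loader tries to create a literal "~" directory. A minimal sketch (the from_pretrained call is commented out because it assumes transformers and accelerate are installed and the model is downloaded):

```python
import os

# Expand "~" to an absolute path -- passing the raw "~/AI/..." string
# is why the offload_folder setting appeared not to work.
offload_dir = os.path.expanduser("~/AI/text-generation-webui/offload")

# Create the folder without sudo, so your own user owns it.
os.makedirs(offload_dir, exist_ok=True)

print(os.path.isabs(offload_dir))  # an absolute path, ready to pass on

# Hypothetical loading call using the expanded path:
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "WhiteRabbitNeo/WhiteRabbitNeo-13B-v1",
#     device_map="auto",
#     offload_folder=offload_dir,
# )
```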