# WavGPT-1.5-GGUF

## Quickstart
Check out our llama.cpp documentation for more usage guidance.

We advise you to clone llama.cpp and install it following the official guide. We follow the latest version of llama.cpp. In the following demonstration, we assume that you are running commands from within the llama.cpp repository.
Since cloning the entire repo may be inefficient, you can manually download the GGUF file that you need or use `huggingface-cli`:
- Install:

```bash
pip install -U huggingface_hub
```

- Download:

```bash
huggingface-cli download Hack337/WavGPT-1.5-GGUF WavGPT-1.5.gguf --local-dir . --local-dir-use-symlinks False
```
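Alternatively, you can fetch the single GGUF file from Python. The snippet below is a minimal sketch using `hf_hub_download` from `huggingface_hub`; the repo ID and filename mirror the CLI command above, and the local directory is an illustrative choice.

```python
from huggingface_hub import hf_hub_download

# Download only the GGUF file instead of cloning the whole repo.
# Repo ID and filename mirror the huggingface-cli command above.
gguf_path = hf_hub_download(
    repo_id="Hack337/WavGPT-1.5-GGUF",
    filename="WavGPT-1.5.gguf",
    local_dir=".",  # save into the current working directory
)

print(gguf_path)  # this is the path to pass to llama-cli via -m
```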
For users who want a chatbot-like experience, it is recommended to start in conversation mode (the example prompt "Вы очень полезный помощник." is Russian for "You are a very helpful assistant."):
```bash
./llama-cli -m <gguf-file-path> \
    -co -cnv -p "Вы очень полезный помощник." \
    -fa -ngl 80 -n 512
```
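If you would rather drive the model from Python instead of the CLI, the sketch below assumes the optional llama-cpp-python bindings are installed (`pip install llama-cpp-python`); the model path, GPU layer count, and user message are illustrative and mirror the command above.

```python
from llama_cpp import Llama

# Load the GGUF file; n_gpu_layers mirrors -ngl 80 in the CLI example.
llm = Llama(
    model_path="WavGPT-1.5.gguf",
    n_gpu_layers=80,
    n_ctx=4096,
)

# Chat-style generation, analogous to llama-cli's conversation mode.
response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "Вы очень полезный помощник."},
        {"role": "user", "content": "Hello! Who are you?"},
    ],
    max_tokens=512,  # mirrors -n 512
)

print(response["choices"][0]["message"]["content"])
```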