Update README.md

README.md CHANGED
@@ -39,10 +39,10 @@ quantized_by: TheBloke
 This repo contains GGML format model files for [Stability AI's StableBeluga 2](https://huggingface.co/stabilityai/StableBeluga2).

 These 70B Llama 2 GGML files currently only support CPU inference. They are known to work with:

-* [llama.cpp](https://github.com/ggerganov/llama.cpp)
+* [llama.cpp](https://github.com/ggerganov/llama.cpp), commit `e76d630` and later.
 * [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most popular web UI.
 * [KoboldCpp](https://github.com/LostRuins/koboldcpp), version 1.37 and later. A powerful GGML web UI, especially good for story telling.
-* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), version 0.1.77 and later. A Python library with
+* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), version 0.1.77 and later. A Python library with LangChain support, and OpenAI-compatible API server.

 ## Repositories available
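Since the hunk adds llama-cpp-python (0.1.77+) as a supported client, a minimal usage sketch may help. This is an assumption-laden sketch, not text from the commit: the model filename is a placeholder, `n_gqa=8` reflects the setting GGML-era builds required for 70B Llama 2 models, and the prompt layout follows Stable Beluga's `### System:` / `### User:` / `### Assistant:` template.

```python
# Sketch of CPU inference with llama-cpp-python >= 0.1.77 (assumed setup,
# not part of the commit). The model path below is a placeholder.

def build_prompt(system: str, user: str) -> str:
    """Assemble a prompt in Stable Beluga's chat template."""
    return f"### System:\n{system}\n\n### User:\n{user}\n\n### Assistant:\n"


def generate(model_path: str, user_msg: str) -> str:
    """Load a GGML model and run one completion on the CPU."""
    from llama_cpp import Llama  # pip install llama-cpp-python

    llm = Llama(
        model_path=model_path,  # e.g. a local .ggmlv3 quantized file
        n_ctx=2048,             # context window
        n_gqa=8,                # grouped-query attention; needed for 70B GGML
    )
    prompt = build_prompt("You are a helpful assistant.", user_msg)
    out = llm(prompt, max_tokens=256, stop=["### User:"])
    return out["choices"][0]["text"]


if __name__ == "__main__":
    # Only the template assembly runs without a model file present.
    print(build_prompt("You are a helpful assistant.", "Hello!"))
```

The heavy model load is kept inside `generate()` so the template helper can be exercised without downloading weights.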