---
tags:
- llamafile
- GGUF
base_model: TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF
---
## TinyLlama-1.1B-Chat-v1.0-llamafile
llamafile lets you distribute and run LLMs with a single file. See the [announcement blog post](https://hacks.mozilla.org/2023/11/introducing-llamafile/) for details.
#### Downloads

- [tinyllama-1.1b-chat-v1.0.Q3_K_M-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q3_K_M-server.llamafile)
- [tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile)
- [tinyllama-1.1b-chat-v1.0.Q5_0-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q5_0-server.llamafile)
- [tinyllama-1.1b-chat-v1.0.Q5_K_M-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q5_K_M-server.llamafile)
- [tinyllama-1.1b-chat-v1.0.Q8_0-server.llamafile](https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q8_0-server.llamafile)
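A llamafile is a self-contained executable, so running one of the downloads above is a matter of marking it executable and launching it. A minimal sketch on Linux/macOS, using the Q4_K_M file as an example (on Windows, rename the file to add a `.exe` extension instead of using `chmod`):

```shell
# Download one of the llamafiles listed above (Q4_K_M chosen as an example)
wget https://huggingface.co/rabil/TinyLlama-1.1B-Chat-v1.0-llamafile/resolve/main/tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile

# Mark it executable
chmod +x tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile

# Launch it; the server variants start a local llama.cpp web UI
# (by default at http://127.0.0.1:8080)
./tinyllama-1.1b-chat-v1.0.Q4_K_M-server.llamafile
```

The server exposes an OpenAI-compatible chat-completions endpoint in addition to the browser UI, so it can be queried with `curl` or standard OpenAI client libraries.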
This repository was created using [llamafile-builder](https://github.com/rabilrbl/llamafile-builder).