|
--- |
|
license: bigscience-bloom-rail-1.0 |
|
language: |
|
- ak |
|
- ar |
|
- as |
|
- bm |
|
- bn |
|
- ca |
|
- code |
|
- en |
|
- es |
|
- eu |
|
- fon |
|
- fr |
|
- gu |
|
- hi |
|
- id |
|
- ig |
|
- ki |
|
- kn |
|
- lg |
|
- ln |
|
- ml |
|
- mr |
|
- ne |
|
- nso |
|
- ny |
|
- or |
|
- pa |
|
- pt |
|
- rn |
|
- rw |
|
- sn |
|
- st |
|
- sw |
|
- ta |
|
- te |
|
- tn |
|
- ts |
|
- tum |
|
- tw |
|
- ur |
|
- vi |
|
- wo |
|
- xh |
|
- yo |
|
- zh |
|
- zu |
|
programming_language: |
|
- C |
|
- C++ |
|
- C# |
|
- Go |
|
- Java |
|
- JavaScript |
|
- Lua |
|
- PHP |
|
- Python |
|
- Ruby |
|
- Rust |
|
- Scala |
|
- TypeScript |
|
tags: |
|
- llm-rs |
|
- ggml |
|
pipeline_tag: text-generation |
|
--- |
|
|
|
# GGML Converted Models of [BigScience](https://huggingface.co/bigscience)'s BLOOM Models
|
|
|
## Description |
|
|
|
BLOOM is an autoregressive Large Language Model (LLM), trained to continue text from a prompt on vast amounts of text data using industrial-scale computational resources. As such, it is able to output coherent text in 46 languages and 13 programming languages that is hardly distinguishable from text written by humans. BLOOM can also be instructed to perform text tasks it hasn't been explicitly trained for, by casting them as text generation tasks. |
|
|
|
|
|
## Converted Models |
|
| Name | Based on | Type | Container | GGML Version | |
|
|:-------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------|:-------|:------------|:---------------| |
|
| [bloom-1b7-f16.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-1b7-f16.bin) | [bigscience/bloom-1b7](https://huggingface.co/bigscience/bloom-1b7) | F16 | GGML | V3 | |
|
| [bloom-1b7-q4_0.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-1b7-q4_0.bin) | [bigscience/bloom-1b7](https://huggingface.co/bigscience/bloom-1b7) | Q4_0 | GGML | V3 | |
|
| [bloom-1b7-q4_0-ggjt.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-1b7-q4_0-ggjt.bin) | [bigscience/bloom-1b7](https://huggingface.co/bigscience/bloom-1b7) | Q4_0 | GGJT | V3 | |
|
| [bloom-1b7-q5_1-ggjt.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-1b7-q5_1-ggjt.bin) | [bigscience/bloom-1b7](https://huggingface.co/bigscience/bloom-1b7) | Q5_1 | GGJT | V3 | |
|
| [bloom-3b-f16.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-3b-f16.bin) | [bigscience/bloom-3b](https://huggingface.co/bigscience/bloom-3b) | F16 | GGML | V3 | |
|
| [bloom-3b-q4_0.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-3b-q4_0.bin) | [bigscience/bloom-3b](https://huggingface.co/bigscience/bloom-3b) | Q4_0 | GGML | V3 | |
|
| [bloom-3b-q4_0-ggjt.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-3b-q4_0-ggjt.bin) | [bigscience/bloom-3b](https://huggingface.co/bigscience/bloom-3b) | Q4_0 | GGJT | V3 | |
|
| [bloom-3b-q5_1-ggjt.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-3b-q5_1-ggjt.bin) | [bigscience/bloom-3b](https://huggingface.co/bigscience/bloom-3b) | Q5_1 | GGJT | V3 | |
|
| [bloom-560m-f16.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-560m-f16.bin) | [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) | F16 | GGML | V3 | |
|
| [bloom-560m-q4_0.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-560m-q4_0.bin) | [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) | Q4_0 | GGML | V3 | |
|
| [bloom-560m-q4_0-ggjt.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-560m-q4_0-ggjt.bin) | [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) | Q4_0 | GGJT | V3 | |
|
| [bloom-560m-q5_1-ggjt.bin](https://huggingface.co/rustformers/bloom-ggml/blob/main/bloom-560m-q5_1-ggjt.bin) | [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) | Q5_1 | GGJT | V3 | |
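
To fetch a single file from the table above without cloning the whole repository, the `huggingface_hub` library can be used (a sketch; assumes `pip install huggingface_hub` and network access — the filename below is just one example from the table):

```python
from huggingface_hub import hf_hub_download

# Download one GGML file from this repository; the returned path
# points into the local Hugging Face cache.
path = hf_hub_download(
    repo_id="rustformers/bloom-ggml",
    filename="bloom-560m-q4_0.bin",
)
print(path)
```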
|
|
|
## Usage |
|
|
|
### Python via [llm-rs](https://github.com/LLukas22/llm-rs-python): |
|
|
|
#### Installation |
|
Via pip: `pip install llm-rs` |
|
|
|
#### Run inference |
|
```python |
|
from llm_rs import AutoModel |
|
|
|
# Load the model; pick any file from the table above as `model_file`

model = AutoModel.from_pretrained("rustformers/bloom-ggml", model_file="bloom-3b-q4_0-ggjt.bin")
|
|
|
# Generate
|
print(model.generate("The meaning of life is")) |
|
``` |
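
Sampling behavior can also be tuned instead of relying on the defaults. The sketch below assumes the `GenerationConfig` class and the `generation_config` keyword exposed by llm-rs at the time of writing; parameter names may differ in newer releases, so treat this as illustrative rather than definitive:

```python
from llm_rs import AutoModel, GenerationConfig

# Load a quantized model from this repository (downloads on first use)
model = AutoModel.from_pretrained(
    "rustformers/bloom-ggml", model_file="bloom-3b-q4_0-ggjt.bin"
)

# Sketch: tighter sampling for shorter, more deterministic completions
config = GenerationConfig(
    top_k=40,
    top_p=0.9,
    temperature=0.7,
    max_new_tokens=64,
)
print(model.generate("The meaning of life is", generation_config=config))
```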
|
|
|
### Rust via [Rustformers/llm](https://github.com/rustformers/llm): |
|
|
|
#### Installation |
|
```sh
|
git clone --recurse-submodules https://github.com/rustformers/llm.git |
|
cd llm |
|
cargo build --release |
|
``` |
|
|
|
#### Run inference |
|
```sh
|
cargo run --release -- bloom infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:" |
|
``` |