Update README.md #2
opened by m-i

README.md CHANGED
````diff
@@ -9,7 +9,7 @@ extra_gated_description: If you want to learn more about how we process your per
 
 # mlx-community/Mamba-Codestral-7B-v0.1-8bits
 
-The Model [mlx-community/Mamba-Codestral-7B-v0.1-
+The Model [mlx-community/Mamba-Codestral-7B-v0.1-8bit](https://huggingface.co/mlx-community/Mamba-Codestral-7B-v0.1-8bit) was converted to MLX format from [mistralai/Mamba-Codestral-7B-v0.1](https://huggingface.co/mistralai/Mamba-Codestral-7B-v0.1) using mlx-lm version **0.18.2**.
 
 ## Use with mlx
 
@@ -20,7 +20,7 @@ pip install mlx-lm
 ```python
 from mlx_lm import load, generate
 
-model, tokenizer = load("mlx-community/Mamba-Codestral-7B-v0.1-
+model, tokenizer = load("mlx-community/Mamba-Codestral-7B-v0.1-8bit")
 
 prompt="hello"
 
````