Update README.md
README.md CHANGED
@@ -11,7 +11,7 @@ These quants take a while to do so please leave a like or a comment on the repo

EXL2: [3.5bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3.5bpw-exl2), [3bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3bpw-exl2), [2.4bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-2.4bpw-exl2)

-[2bit Imatrix GGUF](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF): [IQ2-XS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-IQ2_XS.gguf), [IQ2-XXS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-IQ2_XXS.gguf), [IQ3-XXS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-IQ3_XXS.gguf)
+[2bit Imatrix GGUF](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF): [IQ2-XS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ2_XS.gguf), [IQ2-XXS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ2_XXS.gguf), [IQ3-XXS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ3_XXS.gguf)

### Custom format:

```
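For reference, a minimal sketch of fetching one of the renamed GGUF files listed in the "+" line above, assuming the `huggingface_hub` Python package is installed and that the downloaded file is then loaded with a GGUF-capable runtime such as llama.cpp:

```python
# Minimal sketch: download one of the 2-bit imatrix GGUF quants by its
# new filename. Repo id and filename are taken from the diff above;
# everything else (install, runtime) is an assumption.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="Kooten/MiquMaid-v1-70B-IQ2-GGUF",
    filename="MiquMaid-v1-70B-IQ2_XS.gguf",
)
print(local_path)  # local path to the downloaded quantized model file
```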