# MiquMaid-v1-70B IQ2
## Description
2-bit imatrix GGUF quants of [NeverSleep/MiquMaid-v1-70B](https://huggingface.co/NeverSleep/MiquMaid-v1-70B)
The [imatrix](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/Imatrix/imatrix-MiquMaid-c2000-ctx500-wikitext.dat) was generated from a Q8 quant of MiquMaid using 2000 chunks at a context length of 500; the dataset was Wikitext.
These quants take a while to produce, so please leave a like or a comment on the repo to let me know there is interest.
## Other quants:
EXL2: [3.5bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3.5bpw-exl2), [3bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3bpw-exl2), [2.4bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-2.4bpw-exl2)
[2bit Imatrix GGUF](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF): [IQ2-XS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ2_XS.gguf), [IQ2-XXS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ2_XXS.gguf), [IQ3-XXS](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/MiquMaid-v1-70B-IQ3_XXS.gguf)
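
For local inference, a GGUF file from this repo can be loaded with llama.cpp or a binding such as llama-cpp-python. Below is a minimal loading sketch: the file name matches this repo, but the context size and GPU offload settings are illustrative assumptions, not recommendations.

```python
# Minimal sketch assuming llama-cpp-python is installed (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="MiquMaid-v1-70B-IQ2_XS.gguf",  # file downloaded from this repo
    n_ctx=4096,        # context length; an assumption, adjust to your hardware
    n_gpu_layers=-1,   # offload all layers to GPU if VRAM allows
)
```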
### Custom format:
```
### Instruction:
{system prompt}
### Input:
{input}
### Response:
{reply}
```
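
As a sketch of how the format above might be filled in and passed to the model loaded earlier (the helper name and example strings are illustrative, not part of the model card):

```python
def build_prompt(system_prompt: str, user_input: str) -> str:
    """Assemble a prompt following the custom format shown above."""
    return (
        f"### Instruction:\n{system_prompt}\n"
        f"### Input:\n{user_input}\n"
        f"### Response:\n"
    )

prompt = build_prompt("You are a helpful assistant.", "Write a short greeting.")
output = llm(prompt, max_tokens=128, stop=["### Input:"])  # `llm` from the loading sketch above
print(output["choices"][0]["text"])
```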
## Contact
Kooten on Discord
[ko-fi.com/kooten](https://ko-fi.com/kooten)