# Olethros-8B-AWQ
L3-8B-Instruct fine-tuned on roughly 6,000 Opus generations in the hopes of adding a bit of sovl.
This is the 4-bit, group size 128 AWQ quant. Original model weights are here.
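A minimal loading sketch with 🤗 Transformers (AutoAWQ and accelerate installed) is below. The repo id is an assumption for illustration, so substitute the actual Hugging Face path to this model.

```python
# Minimal sketch: load the 4-bit / GS128 AWQ quant via transformers.
# Requires `autoawq` and `accelerate`; the repo id below is assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lodrick-the-lafted/Olethros-8B-AWQ"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# L3-Instruct-style chat template
messages = [{"role": "user", "content": "Write a short story about Olethros."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

out = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```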
## Quants
exl2, AWQ, and static GGUF quants are available right now.
Type | Misc | Author
---|---|---
GGUF | Static GGUF quants | mradermacher
AWQ | 4-bit, GS128 | lodrick
exl2 | 2.25bpw | blockblockblock |
exl2 | 2.5bpw | blockblockblock |
exl2 | 3.0bpw | blockblockblock |
exl2 | 3.5bpw | blockblockblock |
exl2 | 3.7bpw | blockblockblock |
exl2 | 4.0bpw | blockblockblock |
exl2 | 4.2bpw | blockblockblock |
exl2 | 4.4bpw | blockblockblock |
exl2 | 4.6bpw | blockblockblock |
exl2 | 4.8bpw | blockblockblock |
exl2 | 5.0bpw | blockblockblock |
exl2 | 5.5bpw | blockblockblock |
exl2 | 6.0bpw | blockblockblock |
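For the static GGUF quants, a llama-cpp-python sketch might look like the following. The repo id and filename pattern are assumptions based on the table above; check mradermacher's actual repo for the exact names.

```python
# Minimal sketch: run one of the static GGUF quants with llama-cpp-python.
# Repo id and filename glob are assumed for illustration.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="mradermacher/Olethros-8B-GGUF",  # assumed repo id
    filename="*Q4_K_M.gguf",                  # assumed quant filename pattern
    n_ctx=8192,
    n_gpu_layers=-1,  # offload all layers to GPU if available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a short story about Olethros."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```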