Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24-4Bit-GPTQ
- Original Model: Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24
Quantization
- This model was quantized with the AutoGPTQ library using a calibration dataset of English and Russian Wikipedia articles. It achieves lower perplexity on Russian data than other GPTQ models.
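The exact quantization script is not included in this card; below is a minimal sketch of the procedure with AutoGPTQ, assuming a 4-bit `BaseQuantizeConfig` with a group size of 128 and placeholder calibration texts standing in for the actual Wikipedia samples.

```python
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
from transformers import AutoTokenizer

base_id = "Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24"
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Assumed settings: 4-bit weights; group size and act-order are not stated in the card.
quantize_config = BaseQuantizeConfig(bits=4, group_size=128, desc_act=False)

# Calibration data: English and Russian Wikipedia passages (placeholders here).
calibration_texts = [
    "Wikipedia is a free online encyclopedia maintained by volunteers.",
    "Википедия — свободная интернет-энциклопедия, поддерживаемая добровольцами.",
]
examples = [tokenizer(text, return_tensors="pt") for text in calibration_texts]

# Load the full-precision model, run GPTQ calibration, and save the 4-bit checkpoint.
model = AutoGPTQForCausalLM.from_pretrained(base_id, quantize_config)
model.quantize(examples)
model.save_quantized("Vikhr-Nemo-12B-Instruct-R-21-09-24-4Bit-GPTQ")
tokenizer.save_pretrained("Vikhr-Nemo-12B-Instruct-R-21-09-24-4Bit-GPTQ")
```

The resulting checkpoint can then be loaded directly with `transformers` (with `optimum` and `auto-gptq` installed) via `AutoModelForCausalLM.from_pretrained(..., device_map="auto")`.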
Model tree for qilowoq/Vikhr-Nemo-12B-Instruct-R-21-09-24-4Bit-GPTQ
- Base model: mistralai/Mistral-Nemo-Base-2407
- Finetuned: mistralai/Mistral-Nemo-Instruct-2407