---
library_name: transformers
inference:
  parameters:
    temperature: 1
    top_p: 0.95
    top_k: 40
    repetition_penalty: 1.2
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---
# QuantFactory/Ministral-3b-instruct-GGUF
This is a quantized (GGUF) version of ministral/Ministral-3b-instruct, created using llama.cpp.
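
The GGUF files can be run with llama.cpp or any of its bindings. Below is a minimal sketch using llama-cpp-python with the sampling parameters declared in the metadata above; the GGUF filename is illustrative, not the exact artifact name in this repo, so substitute the quantization variant you actually download.

```python
# Minimal sketch, assuming llama-cpp-python is installed and a GGUF file from
# this repo has been downloaded locally. The filename is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="./Ministral-3b-instruct.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=4096,                                        # context window to allocate
)

# Sampling settings mirror the inference parameters from the model card metadata.
output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF quantization in one sentence."}],
    temperature=1.0,
    top_p=0.95,
    top_k=40,
    repeat_penalty=1.2,   # llama.cpp's name for repetition_penalty
    max_tokens=128,
)
print(output["choices"][0]["message"]["content"])
```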
## Original Model Card

### Model Description
Ministral is a series of language models built with the same architecture as the well-known Mistral models, but at a smaller size.
- Model type: A 3B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- Language(s) (NLP): Primarily English
- License: Apache 2.0
- Finetuned from model: mistralai/Mistral-7B-v0.1
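
The original (unquantized) checkpoint can be used with the standard transformers text-generation pipeline. The snippet below is a minimal sketch wiring in the sampling parameters from the metadata above; the prompt, dtype, and device settings are assumptions to adjust for your hardware.

```python
# Minimal sketch for the original, unquantized checkpoint; device_map="auto"
# assumes the accelerate package is installed.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="ministral/Ministral-3b-instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Generation settings mirror the inference parameters from the model card metadata.
result = pipe(
    "Write a short note on why smaller language models can be useful.",
    max_new_tokens=128,
    do_sample=True,
    temperature=1.0,
    top_p=0.95,
    top_k=40,
    repetition_penalty=1.2,
)
print(result[0]["generated_text"])
```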