---
license: apache-2.0
language:
- it
---
# Mistral-Ita-7B GGUF
## How to Use
The example below loads the quantized model with the `ctransformers` library and runs Italian text generation.
```python
from ctransformers import AutoModelForCausalLM

# Set gpu_layers to the number of layers to offload to the GPU; set it to 0 if no GPU acceleration is available on your system.
llm = AutoModelForCausalLM.from_pretrained("DeepMount00/Mistral-Ita-7b-GGUF", model_file="mistral_ita-7b-Q4_K_M.gguf", model_type="mistral", context_length=4096, max_new_tokens=1000, gpu_layers=20)

# Prompt (Italian): "Find the solution to this problem: if Mario has 12 apples and sells 4 of them at 8 euros and the rest at 3 euros, how much does Mario earn?"
prompt = "Trova la soluzione a questo problema: se Mario ha 12 mele e ne vende 4 a 8 euro e le restanti a 3 euro, quanto guadagna Mario?"
print(llm(prompt))
```
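If you prefer to see the output as it is produced rather than waiting for the full completion, `ctransformers` also supports streaming generation. A minimal sketch, reusing the `llm` and `prompt` objects from the example above:

```python
# Stream the answer chunk by chunk instead of waiting for the full completion.
for chunk in llm(prompt, stream=True):
    print(chunk, end="", flush=True)
print()
```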