---
license: apache-2.0
pipeline_tag: text-generation
language:
- sv
- en
tags:
- pretrained
widget:
- text: "Jag tycker att det är roligt med"
---
# Model Card for Mistral-7B-v0.1-flashback
Mistral-7B-v0.1-flashback is a continued pretraining of the base Mistral-7B-v0.1 model, using around 40 GB of forum threads from the Swedish website flashback.org.
Training was done with the QLoRA method, updating about 88 million parameters over a single epoch.
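The ~88M trainable parameters are consistent with a low-rank adapter over Mistral-7B-v0.1's linear projections. The exact QLoRA configuration is not stated in this card, so the rank and target modules below are illustrative assumptions; only the base model's layer shapes are taken as given.

```python
# Sketch: estimating the LoRA trainable-parameter count for Mistral-7B-v0.1.
# The rank and target modules are ASSUMPTIONS for illustration; the card does
# not publish the actual adapter config. Layer shapes match Mistral-7B-v0.1.

MISTRAL_7B = {
    # (in_features, out_features) of each linear projection per decoder layer
    "q_proj": (4096, 4096),
    "k_proj": (4096, 1024),   # 8 KV heads x head_dim 128 (grouped-query attention)
    "v_proj": (4096, 1024),
    "o_proj": (4096, 4096),
    "gate_proj": (4096, 14336),
    "up_proj": (4096, 14336),
    "down_proj": (14336, 4096),
}
NUM_LAYERS = 32

def lora_param_count(rank: int, targets: dict) -> int:
    """Each adapted matrix W (in x out) adds A (in x rank) and B (rank x out)."""
    per_layer = sum(rank * (fin + fout) for fin, fout in targets.values())
    return per_layer * NUM_LAYERS

# Rank 32 on every linear projection gives ~84M adapter parameters,
# in the same ballpark as the ~88M reported above.
print(lora_param_count(32, MISTRAL_7B))
```

With rank 32 applied to all linear projections the count lands near 84M; the reported 88M suggests a slightly different rank or target-module set, which the card does not specify.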