This model is part of the State-of-the-art Danish Models collection, which gathers state-of-the-art models for Danish within their respective domains.
The Munin 7B Alpha Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters, based on Mistral-7B-v0.1.
It has been continually pretrained on the Danish Gigaword corpus.
For full details of this model, please read our release blog post. The codebase can be found in our Git repo.
Note: This model is an alpha release, and we do not recommend using it in production. If you do use the model, please let us know.
Munin 7B Alpha is, like Mistral 7B, a pretrained base model and therefore does not have any moderation mechanisms.
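Since Munin 7B Alpha is a standard causal language model built on the Mistral architecture, it can be used with the Hugging Face transformers library like any other pretrained LM. The snippet below is a minimal sketch, assuming the model is hosted on the Hugging Face Hub under the repo id danish-foundation-models/munin-7b-alpha (check the release blog post or Git repo for the exact identifier). Because this is a base model, it is prompted with plain text to be continued, not with a chat template.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repo id; verify against the release blog post / Git repo.
model_id = "danish-foundation-models/munin-7b-alpha"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 7B parameters: use a GPU or a lower-precision/quantized load
    device_map="auto",           # requires the accelerate package
)

# Base model: give it Danish text to continue. There is no chat template
# and no moderation mechanism, so apply your own filtering downstream.
prompt = "Danmark er et skandinavisk land, der"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```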
The model is developed by the Danish Foundation Models Team.
Base model: mistralai/Mistral-7B-v0.1