# Danish-Swedish Merged Model

This model (merge-crew/da-sv-slerp) is a merge of the following models, both based on mistralai/Mistral-7B-v0.1:

- danish-foundation-models/munin-7b-alpha, continually pretrained on Danish data;
- timpal0l/Mistral-7B-v0.1-flashback-v2, continually pretrained on Swedish data.
## Model Details

- Merged by: Dan Saattrup Nielsen
- Model type: Decoder model, based on mistralai/Mistral-7B-v0.1
- Language(s): Danish and Swedish
- License: CC-BY-4.0
- Merge configuration:
```python
dict(
    models=[
        dict(
            model="danish-foundation-models/munin-7b-alpha",
            parameters=dict(weight=1.0),
        ),
        dict(
            model="timpal0l/Mistral-7B-v0.1-flashback-v2",
            parameters=dict(weight=1.0),
        ),
    ],
    merge_method="slerp",
    base_model="danish-foundation-models/munin-7b-alpha",
    parameters=dict(
        int8_mask=True,
        normalize=True,
        t=0.5,
    ),
    dtype="bfloat16",
)
```
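The configuration above follows mergekit's schema, so the merge can in principle be reproduced with mergekit's Python API. The following is a minimal sketch, not the exact script used for this model: `MergeConfiguration.model_validate` and `run_merge` are mergekit's documented entry points, while the output path and `MergeOptions` settings are illustrative assumptions.

```python
# Minimal sketch of reproducing the merge with mergekit
# (https://github.com/arcee-ai/mergekit). Output path and options
# are example values, not the ones used for this model.
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

merge_config = MergeConfiguration.model_validate(
    dict(
        models=[
            dict(
                model="danish-foundation-models/munin-7b-alpha",
                parameters=dict(weight=1.0),
            ),
            dict(
                model="timpal0l/Mistral-7B-v0.1-flashback-v2",
                parameters=dict(weight=1.0),
            ),
        ],
        merge_method="slerp",  # spherical linear interpolation of weights
        base_model="danish-foundation-models/munin-7b-alpha",
        # t=0.5 places the merged weights at the midpoint between the models
        parameters=dict(int8_mask=True, normalize=True, t=0.5),
        dtype="bfloat16",
    )
)

run_merge(
    merge_config,
    out_path="./da-sv-slerp",  # hypothetical output directory
    options=MergeOptions(copy_tokenizer=True),
)
```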
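Once the merged weights are available on the Hub, the model loads like any other Mistral-style causal LM. A minimal generation sketch, assuming the weights are hosted at merge-crew/da-sv-slerp and using an arbitrary Danish prompt:

```python
# Smoke-test generation sketch, assuming the merged weights live at
# "merge-crew/da-sv-slerp" on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "merge-crew/da-sv-slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Arbitrary Danish prompt, just to check that generation runs.
prompt = "Danmark er et land i Skandinavien, som"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```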