# Danish-Swedish Merged Model
This is a merge of the following models, both based on mistralai/Mistral-7B-v0.1:

- danish-foundation-models/munin-7b-alpha, continually pretrained on Danish data
- timpal0l/Mistral-7B-v0.1-flashback-v2, continually pretrained on Swedish data
## Model Details
- Merged by: Dan Saattrup Nielsen
- Model type: Decoder model, based on mistralai/Mistral-7B-v0.1
- Language(s): Danish and Swedish
- License: CC-BY-4.0
- Merge configuration:

```python
dict(
    models=[
        dict(
            model="danish-foundation-models/munin-7b-alpha",
            parameters=dict(
                weight=1.0,
                density=0.6,
            ),
        ),
        dict(
            model="timpal0l/Mistral-7B-v0.1-flashback-v2",
            parameters=dict(
                weight=1.0,
                density=0.6,
            ),
        ),
    ],
    merge_method="dare_ties",
    random_seed=4242,
    base_model="mistralai/Mistral-7B-v0.1",
    parameters=dict(
        int8_mask=True,
        normalize=True,
    ),
    dtype="bfloat16",
)
```
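For intuition, the sketch below shows roughly what a `dare_ties` merge computes, on flat toy tensors rather than real checkpoints. This is illustrative only and not mergekit's API: each task vector (fine-tuned minus base weights) is randomly pruned to `density=0.6` and rescaled (DARE), a per-parameter sign is elected across models (TIES), and only sign-agreeing entries are averaged back onto the base. The real implementation additionally applies the `int8_mask` and `normalize` options from the configuration above.

```python
import torch

def dare_ties(base, deltas, weights, density, seed=4242):
    """Toy DARE-TIES on flat tensors (illustrative, not mergekit's code)."""
    torch.manual_seed(seed)
    pruned = []
    for delta in deltas:
        keep = torch.rand_like(delta) < density  # DARE: randomly keep ~density of entries
        pruned.append(delta * keep / density)    # rescale survivors to preserve expectation
    stacked = torch.stack([w * d for w, d in zip(weights, pruned)])
    elected = stacked.sum(dim=0).sign()          # TIES: elect a per-parameter sign
    agree = stacked.sign() == elected            # keep only entries matching that sign
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged

# Two toy "task vectors" standing in for the munin and flashback deltas
base = torch.zeros(8)
deltas = [torch.randn(8), torch.randn(8)]
print(dare_ties(base, deltas, weights=[1.0, 1.0], density=0.6))
```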
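Once merged, the model can be loaded like any other causal LM with transformers. A minimal sketch, assuming the merged weights are published under the repository name merge-crew/da-sv-dare-ties-density-0.6; the Danish prompt is purely illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "merge-crew/da-sv-dare-ties-density-0.6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Danish prompt; the merge should handle Swedish equally well.
inputs = tokenizer("København er", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```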