An RP model based on Solar. Higher density combined with LimaRP led to better performance. Use the Alpaca or Vicuna prompt format.
This is a merge of pre-trained language models created using mergekit.
This model was merged using the DARE TIES merge method using Sao10K/Fimbulvetr-11B-v2 as a base.
The following models were included in the merge:

- Himitsui/MedMitsu-Instruct-11B
- Himitsui/Kaiju-11B
- migtissera/Synthia-v3.0-11B + jeiku/Re-Host_Limarp_Mistral
- TheDrummer/Moistral-11B-v3
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Himitsui/MedMitsu-Instruct-11B
    parameters:
      weight: 0.13
      density: 0.60
  - model: Himitsui/Kaiju-11B
    parameters:
      weight: 0.22
      density: 0.73
  - model: migtissera/Synthia-v3.0-11B+jeiku/Re-Host_Limarp_Mistral
    parameters:
      weight: 0.28
      density: 0.80
  - model: TheDrummer/Moistral-11B-v3
    parameters:
      weight: 0.37
      density: 0.85
merge_method: dare_ties
base_model: Sao10K/Fimbulvetr-11B-v2
parameters:
  int8_mask: true
dtype: bfloat16
```
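mergekit handles the `dare_ties` merge internally, but the core idea behind the `weight` and `density` parameters above can be sketched in plain NumPy. This is a minimal illustration, not mergekit's actual implementation; the function names (`dare`, `ties_merge`) are hypothetical:

```python
import numpy as np

def dare(delta, density, rng):
    # DARE: randomly keep each delta parameter with probability `density`,
    # then rescale survivors by 1/density so the expected sum is unchanged.
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def ties_merge(base, models, weights, densities, rng):
    # Task vectors: each fine-tune's difference from the base model,
    # sparsified by DARE at the per-model density.
    deltas = [dare(m - base, d, rng) for m, d in zip(models, densities)]
    weighted = [w * d for w, d in zip(weights, deltas)]
    # TIES sign election: per parameter, keep only contributions whose
    # sign agrees with the weight-summed majority sign.
    sign = np.sign(sum(weighted))
    merged = sum(np.where(np.sign(d) == sign, d, 0.0) for d in weighted)
    return base + merged
```

Higher `density` means fewer delta parameters are dropped, which the model notes above credit for better performance; `weight` scales how strongly each model's task vector contributes before the sign election.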