---
base_model:
- ycros/BagelMIsteryTour-v2-8x7B
- NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
- smelborp/MixtralOrochi8x7B
tags:
- mergekit
- merge
---
# maid-yuzu-v3
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

This model was created to explore how the `density` and `weight` values of the `dare_ties` merge method affect the base model.
## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with smelborp/MixtralOrochi8x7B as the base.
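Roughly, `dare_ties` prunes each fine-tuned model's delta from the base at random and rescales the survivors (DARE, controlled by `density`), then resolves sign conflicts between models (TIES) before adding the weighted, combined delta back to the base. A minimal toy sketch on flat NumPy vectors (the helper `dare_ties_merge` is hypothetical and for illustration only, not mergekit's actual implementation, which works tensor-by-tensor over full checkpoints):

```python
import numpy as np

rng = np.random.default_rng(0)

def dare_ties_merge(base, tuned_models, densities, weights):
    """Toy DARE-TIES merge on flat parameter vectors (illustrative only)."""
    pruned = []
    for model, density, weight in zip(tuned_models, densities, weights):
        delta = model - base
        # DARE: keep each delta entry with probability `density`,
        # rescaling by 1/density so the expected delta is unchanged.
        mask = rng.random(delta.shape) < density
        pruned.append(weight * mask * delta / density)
    stacked = np.stack(pruned)
    # TIES: elect a sign per parameter from the summed weighted deltas,
    # then keep only the contributions that agree with that sign.
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    return base + merged_delta

base = rng.normal(size=8)
m1 = base + rng.normal(scale=0.1, size=8)
m2 = base + rng.normal(scale=0.1, size=8)
merged = dare_ties_merge(base, [m1, m2], densities=[0.6, 0.4], weights=[0.25, 0.1])
print(merged)
```

With low `density`, most delta entries are dropped and the merged model stays close to the base; the `weight` values then scale how strongly each surviving delta pulls the result toward its source model.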
### Models Merged

The following models were included in the merge:
* ycros/BagelMIsteryTour-v2-8x7B
* NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
### Configuration

The following YAML configuration was used to produce this model:
```yaml
base_model:
  model:
    path: smelborp/MixtralOrochi8x7B
dtype: bfloat16
merge_method: dare_ties
slices:
- sources:
  - layer_range: [0, 32]
    model:
      model:
        path: ycros/BagelMIsteryTour-v2-8x7B
    parameters:
      density: 0.6
      weight: 0.25
  - layer_range: [0, 32]
    model:
      model:
        path: NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
    parameters:
      density: 0.4
      weight: 0.1
  - layer_range: [0, 32]
    model:
      model:
        path: smelborp/MixtralOrochi8x7B
```