---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# final_merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged with the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, using ./storage3/input_models/Mistral-7B-v0.1_8133861 as the base.
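In brief, DARE sparsifies each fine-tuned model's task vector (its parameter delta from the base) by randomly dropping entries and rescaling the survivors, and TIES then elects a per-parameter majority sign and discards conflicting contributions before summing. Below is a minimal single-tensor sketch of that idea, assuming PyTorch; the function name is hypothetical, and this simplifies mergekit's actual implementation (including its handling of the negative weight used in the configuration below).

```python
import torch

def dare_ties_tensor(base: torch.Tensor,
                     tuned: list[torch.Tensor],
                     densities: list[float],
                     weights: list[float]) -> torch.Tensor:
    """Merge one parameter tensor via DARE-TIES (illustrative sketch only)."""
    deltas = []
    for t, density in zip(tuned, densities):
        delta = t - base                                  # task vector
        keep = torch.rand_like(delta) < density           # DARE: random drop
        deltas.append(torch.where(keep, delta / density,  # rescale survivors
                                  torch.zeros_like(delta)))
    stacked = torch.stack(deltas)                         # (k, *base.shape)
    w = torch.tensor(weights, dtype=base.dtype).view(-1, *([1] * base.dim()))
    weighted = w * stacked
    # TIES: elect the dominant sign per parameter, keep only agreeing deltas
    elected = torch.sign(weighted.sum(dim=0))
    agree = (torch.sign(weighted) == elected) & (weighted != 0)
    numer = (weighted * agree).sum(dim=0)
    denom = (w.abs() * agree).sum(dim=0).clamp(min=1e-8)  # normalize: 1.0
    return base + numer / denom
```

Here `densities` plays the role of the per-model `density` values in the configuration (the fraction of delta parameters retained), and `weights` the per-model scaling factors.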
### Models Merged

The following models were included in the merge:

* ./storage3/input_models/WizardMath-7B-V1.1_2027605156
* ./storage3/input_models/shisa-gamma-7b-v1_4025154171
* ./storage3/input_models/Abel-7B-002_121690448

### Configuration

The following YAML configuration was used to produce this model:
```yaml
base_model: ./storage3/input_models/Mistral-7B-v0.1_8133861
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 32]
    model: ./storage3/input_models/shisa-gamma-7b-v1_4025154171
    parameters:
      density: 1.0
      weight: -0.0378726672672588
  - layer_range: [0, 32]
    model: ./storage3/input_models/WizardMath-7B-V1.1_2027605156
    parameters:
      density: 0.7433311818361178
      weight: 1.5192904356611323
  - layer_range: [0, 32]
    model: ./storage3/input_models/Abel-7B-002_121690448
    parameters:
      density: 0.47833652897680473
      weight: 1.0403117323704718
  - layer_range: [0, 32]
    model: ./storage3/input_models/Mistral-7B-v0.1_8133861
```
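To reproduce a merge like this one, save the configuration as a YAML file and point the mergekit CLI at it, e.g. `mergekit-yaml config.yaml <output-path>` (the filename and output path are placeholders, and the referenced input-model paths must exist locally). In the global `parameters` block, `normalize: 1.0` roughly rescales the combined task vectors by the sum of the participating weights, and `int8_mask: 1.0` stores intermediate sparsification masks as int8 to reduce memory use.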