---
base_model:
- TheDrummer/Cydonia-22B-v1.2
- anthracite-org/magnum-v4-22b
library_name: transformers
tags:
- mergekit
- merge
license: other
license_name: mrl
inference: false
license_link: https://mistral.ai/licenses/MRL-0.1.md
---
![Too Horny](Magnum-v4-Cydonia-v1.2-22B.png)
# Magnum? More like Deagle (dies in cringe)
[Cydonia-v1.2-Magnum-v4-22B](https://huggingface.co/knifeayumu/Cydonia-v1.2-Magnum-v4-22B) but with the merge order inverted. Some prefer [anthracite-org/magnum-v4-22b](https://huggingface.co/anthracite-org/magnum-v4-22b) over [TheDrummer/Cydonia-22B-v1.2](https://huggingface.co/TheDrummer/Cydonia-22B-v1.2), so this merge was born.
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
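
SLERP (spherical linear interpolation) blends two weight tensors along the shortest arc on a hypersphere rather than averaging them linearly, which better preserves the magnitude and direction of the weights. The sketch below is a minimal illustration of the idea in plain PyTorch; it is not mergekit's actual implementation, which works layer by layer, applies the `t` schedule from the config, and handles edge cases more carefully.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Toy spherical linear interpolation between two weight tensors of the same shape."""
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two (normalized) weight vectors.
    omega = torch.arccos(torch.clamp(a_unit @ b_unit, -1.0, 1.0))
    if omega.abs() < eps:
        # Nearly parallel tensors: fall back to ordinary linear interpolation.
        return (1 - t) * a + t * b
    so = torch.sin(omega)
    return (torch.sin((1 - t) * omega) / so) * a + (torch.sin(t * omega) / so) * b
```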
### Models Merged
The following models were included in the merge:
* [TheDrummer/Cydonia-22B-v1.2](https://huggingface.co/TheDrummer/Cydonia-22B-v1.2)
* [anthracite-org/magnum-v4-22b](https://huggingface.co/anthracite-org/magnum-v4-22b)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: anthracite-org/magnum-v4-22b
  - model: TheDrummer/Cydonia-22B-v1.2
merge_method: slerp
base_model: anthracite-org/magnum-v4-22b
parameters:
  t: [0.1, 0.3, 0.6, 0.3, 0.1]
dtype: bfloat16
```
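
The config above can be passed to mergekit's `mergekit-yaml` entry point to reproduce the merge. The `t` schedule defines a gradient across layer depth: the middle of the stack leans furthest toward Cydonia (0.6), while the earliest and latest layers stay closest to the magnum-v4 base (0.1). The snippet below is a minimal sketch of loading the result with 🤗 Transformers; the repo id is inferred from this card and the prompt template is an assumption, so check the base models' cards for their preferred formats.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id for this merge (inferred from the card); adjust if you merged locally.
repo_id = "knifeayumu/Magnum-v4-Cydonia-v1.2-22B"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype in the config
    device_map="auto",
)

# Prompt format is an assumption -- see the base models' cards for their templates.
prompt = "[INST] Write a short scene aboard a derelict starship. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```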