---
base_model:
- automerger/YamshadowExperiment28-7B
- CultriX/NeuralTrix-bf16
- CultriX/MonaTrix-v4
- CultriX/MonaCeption-7B-SLERP-DPO
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [automerger/YamshadowExperiment28-7B](https://huggingface.co/automerger/YamshadowExperiment28-7B) as the base (a brief sketch of the method's update rule follows the configuration below).

### Models Merged

The following models were included in the merge:
* [CultriX/NeuralTrix-bf16](https://huggingface.co/CultriX/NeuralTrix-bf16)
* [CultriX/MonaTrix-v4](https://huggingface.co/CultriX/MonaTrix-v4)
* [CultriX/MonaCeption-7B-SLERP-DPO](https://huggingface.co/CultriX/MonaCeption-7B-SLERP-DPO)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: CultriX/MonaCeption-7B-SLERP-DPO
  - model: CultriX/NeuralTrix-bf16
  - model: CultriX/MonaTrix-v4
merge_method: model_stock
base_model: automerger/YamshadowExperiment28-7B
dtype: bfloat16
```
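For intuition, Model Stock averages the k fine-tuned models and then interpolates that average back toward the base model, using a ratio `t = k*cos(theta) / (1 + (k - 1)*cos(theta))` derived from the angle theta between the models' task vectors (their weight deltas from the base). The per-tensor helper below is an illustrative sketch of that formula only, not mergekit's actual implementation, which may differ in details such as how angles are aggregated across layers:

```python
# Illustrative per-tensor Model Stock update; not mergekit's actual code.
import itertools

import torch
import torch.nn.functional as F


def model_stock_merge(base: torch.Tensor, finetuned: list[torch.Tensor]) -> torch.Tensor:
    """Interpolate the average of k fine-tuned tensors toward the base tensor."""
    k = len(finetuned)
    # Task vectors: each fine-tuned model's delta from the base weights.
    deltas = [w - base for w in finetuned]
    # Average pairwise cosine similarity between flattened task vectors.
    cos_theta = torch.stack([
        F.cosine_similarity(a.flatten(), b.flatten(), dim=0)
        for a, b in itertools.combinations(deltas, 2)
    ]).mean()
    # Interpolation ratio from the Model Stock paper.
    t = k * cos_theta / (1 + (k - 1) * cos_theta)
    w_avg = torch.stack(finetuned).mean(dim=0)
    # Move from the base toward the fine-tuned average by ratio t.
    return t * w_avg + (1 - t) * base
```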
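### Reproducing the merge

This card does not record the exact command used, but a configuration like the one above can be run either with mergekit's `mergekit-yaml` CLI or through its Python API. Below is a minimal sketch using the Python API; the output path `./merged` is a placeholder, and the module paths and option names follow recent mergekit releases:

```python
# Minimal sketch: run the merge config above through mergekit's Python API.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YAML = """
models:
  - model: CultriX/MonaCeption-7B-SLERP-DPO
  - model: CultriX/NeuralTrix-bf16
  - model: CultriX/MonaTrix-v4
merge_method: model_stock
base_model: automerger/YamshadowExperiment28-7B
dtype: bfloat16
"""

merge_config = MergeConfiguration.model_validate(yaml.safe_load(CONFIG_YAML))

# Write the merged model to ./merged (placeholder output directory).
run_merge(
    merge_config,
    out_path="./merged",
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)
```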
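### Usage

Since the card declares `library_name: transformers`, the merged model should load with the standard `transformers` API. A minimal sketch; `MODEL_ID` is a placeholder, since this card does not state the final repository id:

```python
# Minimal sketch: load and sample from the merged model with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "path/to/merged-model"  # placeholder: replace with the actual repo id or local path

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```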