---
base_model:
- jeiku/RocketHermesZephyrBoros_3B
- jeiku/Erotica_StableLM
- jeiku/RocketHermesZephyrBoros_3B
- jeiku/No_Robots_Alpaca_StableLM
- jeiku/RocketHermesZephyrBoros_3B
- jeiku/Toxic_DPO_StableLM
- jeiku/RocketHermesZephyrBoros_3B
- jeiku/Everything_v3_128_StableLM
- jeiku/RocketHermesZephyrBoros_3B
- jeiku/Bluemoon_cleaned_StableLM
- jeiku/RocketHermesZephyrBoros_3B
- jeiku/RocketHermesZephyrBoros_3B
- jeiku/Gnosis_StableLM
- jeiku/RocketHermesZephyrBoros_3B
- jeiku/Theory_of_Mind_RP_128_StableLM
tags:
- mergekit
- merge
license: other
datasets:
- AdamCodd/no_robots-alpaca
- diffnamehard/toxic-dpo-v0.1-NoWarning-alpaca
- totally-not-an-llm/EverythingLM-data-V3
- Squish42/bluemoon-fandom-1-1-rp-cleaned
- FriezaForce/unranked_theory_of_mind_roleplay
language:
- en
---

# Mixed

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [jeiku/RocketHermesZephyrBoros_3B](https://huggingface.co/jeiku/RocketHermesZephyrBoros_3B) as the base model.
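Conceptually, DARE randomly drops a fraction of each fine-tuned model's parameter deltas (task vectors) and rescales the survivors to preserve the expected value, and TIES then elects a per-parameter sign and discards deltas that disagree before adding the result back to the base weights. The following NumPy sketch illustrates that idea for a single tensor; it is a simplification for intuition, not mergekit's actual implementation, and all function and variable names are illustrative:

```python
import numpy as np

def dare_ties_merge(base, deltas, weights, density, seed=0):
    """Illustrative DARE-TIES merge of task vectors onto one base tensor.

    deltas:  list of (finetuned - base) arrays, one per model
    weights: per-model merge weight
    density: per-model fraction of delta entries to retain
    """
    rng = np.random.default_rng(seed)
    pruned = []
    for d, dens in zip(deltas, density):
        # DARE step: randomly drop (1 - density) of the delta entries...
        mask = rng.random(d.shape) < dens
        # ...and rescale the survivors by 1/density to keep the expectation
        pruned.append(np.where(mask, d / dens, 0.0))
    weighted = [w * d for w, d in zip(weights, pruned)]
    # TIES sign election: dominant sign of the summed weighted deltas
    elected = np.sign(sum(weighted))
    # keep only delta entries that agree with the elected sign, then sum
    agreeing = [np.where(np.sign(d) == elected, d, 0.0) for d in weighted]
    return base + sum(agreeing)
```

With `density` below 1.0 the result is stochastic; the `weight` and `density` values in the configuration below map onto the same two knobs per source model.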
### Models Merged

The following models were included in the merge:
* [jeiku/RocketHermesZephyrBoros_3B](https://huggingface.co/jeiku/RocketHermesZephyrBoros_3B) + [jeiku/Erotica_StableLM](https://huggingface.co/jeiku/Erotica_StableLM)
* [jeiku/RocketHermesZephyrBoros_3B](https://huggingface.co/jeiku/RocketHermesZephyrBoros_3B) + [jeiku/No_Robots_Alpaca_StableLM](https://huggingface.co/jeiku/No_Robots_Alpaca_StableLM)
* [jeiku/RocketHermesZephyrBoros_3B](https://huggingface.co/jeiku/RocketHermesZephyrBoros_3B) + [jeiku/Toxic_DPO_StableLM](https://huggingface.co/jeiku/Toxic_DPO_StableLM)
* [jeiku/RocketHermesZephyrBoros_3B](https://huggingface.co/jeiku/RocketHermesZephyrBoros_3B) + [jeiku/Everything_v3_128_StableLM](https://huggingface.co/jeiku/Everything_v3_128_StableLM)
* [jeiku/RocketHermesZephyrBoros_3B](https://huggingface.co/jeiku/RocketHermesZephyrBoros_3B) + [jeiku/Bluemoon_cleaned_StableLM](https://huggingface.co/jeiku/Bluemoon_cleaned_StableLM)
* [jeiku/RocketHermesZephyrBoros_3B](https://huggingface.co/jeiku/RocketHermesZephyrBoros_3B) + [jeiku/Gnosis_StableLM](https://huggingface.co/jeiku/Gnosis_StableLM)
* [jeiku/RocketHermesZephyrBoros_3B](https://huggingface.co/jeiku/RocketHermesZephyrBoros_3B) + [jeiku/Theory_of_Mind_RP_128_StableLM](https://huggingface.co/jeiku/Theory_of_Mind_RP_128_StableLM)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: jeiku/RocketHermesZephyrBoros_3B+jeiku/Bluemoon_cleaned_StableLM
    parameters:
      weight: 0.30
      density: 0.25
  - model: jeiku/RocketHermesZephyrBoros_3B+jeiku/Toxic_DPO_StableLM
    parameters:
      weight: 0.25
      density: 0.25
  - model: jeiku/RocketHermesZephyrBoros_3B+jeiku/Theory_of_Mind_RP_128_StableLM
    parameters:
      weight: 0.35
      density: 0.25
  - model: jeiku/RocketHermesZephyrBoros_3B+jeiku/No_Robots_Alpaca_StableLM
    parameters:
      weight: 0.25
      density: 0.25
  - model: jeiku/RocketHermesZephyrBoros_3B+jeiku/Everything_v3_128_StableLM
    parameters:
      weight: 0.5
      density: 0.5
  - model: jeiku/RocketHermesZephyrBoros_3B+jeiku/Gnosis_StableLM
    parameters:
      weight: 0.4
      density: 0.4
  - model: jeiku/RocketHermesZephyrBoros_3B+jeiku/Erotica_StableLM
    parameters:
      weight: 0.20
      density: 0.20
merge_method: dare_ties
base_model: jeiku/RocketHermesZephyrBoros_3B
parameters:
dtype: bfloat16
```
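To reproduce a merge from a configuration like this, the YAML is saved to a file and passed to mergekit's `mergekit-yaml` entry point. The commands below are a hedged sketch: the file and output-directory names are placeholders, not part of this model card.

```shell
# Install mergekit (provides the mergekit-yaml command)
pip install mergekit

# Save the YAML above as config.yml, then run the merge;
# ./merged is a placeholder output directory
mergekit-yaml config.yml ./merged
```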