---
base_model:
- ajibawa-2023/General-Stories-Mistral-7B
- OmnicromsBrain/Eros_Scribe-7b
- NousResearch/Yarn-Mistral-7b-64k
- MrRobotoAI/Thoth-6delta
- Local-Novel-LLM-project/Ninja-v1-NSFW
- BlueNipples/Apocrypha-7b
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [NousResearch/Yarn-Mistral-7b-64k](https://huggingface.co/NousResearch/Yarn-Mistral-7b-64k) as the base model.

### Models Merged

The following models were included in the merge:

* [ajibawa-2023/General-Stories-Mistral-7B](https://huggingface.co/ajibawa-2023/General-Stories-Mistral-7B)
* [OmnicromsBrain/Eros_Scribe-7b](https://huggingface.co/OmnicromsBrain/Eros_Scribe-7b)
* [MrRobotoAI/Thoth-6delta](https://huggingface.co/MrRobotoAI/Thoth-6delta)
* [Local-Novel-LLM-project/Ninja-v1-NSFW](https://huggingface.co/Local-Novel-LLM-project/Ninja-v1-NSFW)
* [BlueNipples/Apocrypha-7b](https://huggingface.co/BlueNipples/Apocrypha-7b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: NousResearch/Yarn-Mistral-7b-64k
    parameters:
      weight: 0.1
      density: 0.8
  - model: MrRobotoAI/Thoth-6delta
    parameters:
      weight: 0.6
      density: 0.8
  - model: Local-Novel-LLM-project/Ninja-v1-NSFW
    parameters:
      weight: 0.05
      density: 0.8
  - model: OmnicromsBrain/Eros_Scribe-7b
    parameters:
      weight: 0.15
      density: 0.8
  - model: ajibawa-2023/General-Stories-Mistral-7B
    parameters:
      weight: 0.05
      density: 0.8
  - model: BlueNipples/Apocrypha-7b
    parameters:
      weight: 0.05
      density: 0.8
merge_method: ties
base_model: NousResearch/Yarn-Mistral-7b-64k
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
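
### Usage

A minimal sketch of loading the merged model with 🤗 `transformers`. The repository id below is a placeholder (this card does not state the final published name), and the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load the merged model and generate text with transformers.
# NOTE: "your-username/merged-model" is a placeholder repo id, not the actual
# published name of this merge; substitute the real repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/merged-model"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's float16 dtype
    device_map="auto",
)

prompt = "Write the opening paragraph of a short story set in a coastal town."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```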