---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- LiteAI/Hare-1.1B-Chat
- cognitivecomputations/TinyDolphin-2.8-1.1b
- D1rtyB1rd/Dirty-Alice-Tiny-1.1B-v1
---

# Dirty-Hairy-Llolphin-1.1B-v2

Dirty-Hairy-Llolphin-1.1B-v2 is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):

* [LiteAI/Hare-1.1B-Chat](https://huggingface.co/LiteAI/Hare-1.1B-Chat)
* [cognitivecomputations/TinyDolphin-2.8-1.1b](https://huggingface.co/cognitivecomputations/TinyDolphin-2.8-1.1b)
* [D1rtyB1rd/Dirty-Alice-Tiny-1.1B-v1](https://huggingface.co/D1rtyB1rd/Dirty-Alice-Tiny-1.1B-v1)

## 🧩 Configuration

```yaml
models:
  - model: LiteAI/Hare-1.1B-Chat
    parameters:
      density: 0.5
      weight: 0.5
  - model: cognitivecomputations/TinyDolphin-2.8-1.1b
    parameters:
      density: 0.5
      weight: 0.5
  - model: D1rtyB1rd/Dirty-Alice-Tiny-1.1B-v1
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-480k-1T
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
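To give an intuition for what the `density`, `weight`, and `normalize` parameters in the config above do, here is a toy NumPy sketch of the TIES procedure (trim each model's task vector to the top-`density` entries by magnitude, elect a sign per entry, then combine the agreeing deltas with normalized weights). This is an illustration only, not mergekit's actual implementation, which operates on full model checkpoints; the function `ties_merge` and its exact tie-breaking choices are assumptions made for the example.

```python
import numpy as np

def ties_merge(base, models, density=0.5, weights=None, normalize=True):
    """Simplified TIES merge of 1-D parameter vectors `models` into `base`."""
    if weights is None:
        weights = [1.0] * len(models)
    if normalize:
        # `normalize: true` in the config rescales weights to sum to 1.
        total = sum(weights)
        weights = [w / total for w in weights]

    deltas = []
    for m, w in zip(models, weights):
        d = m - base  # task vector: what this model changed relative to base
        # Trim: keep only the top-`density` fraction of entries by magnitude.
        k = max(1, int(round(density * d.size)))
        cutoff = np.sort(np.abs(d))[-k]
        d = np.where(np.abs(d) >= cutoff, d, 0.0)
        deltas.append(w * d)

    stacked = np.stack(deltas)
    # Elect sign: per entry, the sign of the summed weighted deltas wins.
    sign = np.sign(stacked.sum(axis=0))
    # Merge: combine only nonzero deltas that agree with the elected sign.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    merged = (stacked * agree).sum(axis=0)
    return base + merged

# Two toy "models" diverging from a zero base:
base = np.zeros(4)
m1 = np.array([1.0, -1.0, 0.2, 0.0])
m2 = np.array([1.0, 1.0, 0.0, 0.3])
# Entry 0: both agree (+), kept. Entry 1: signs conflict and cancel, dropped.
# Entries 2-3: trimmed away by density=0.5.
out = ties_merge(base, [m1, m2], density=0.5)
print(out)  # [1. 0. 0. 0.]
```

With equal weights of 0.5 per model, as in the config, normalization leaves the relative contributions unchanged; it matters when the listed weights do not already sum to 1.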