---
base_model:
- Saxo/Linkbricks-Horizon-AI-Korean-llama3-sft-dpo-8b-base
- tesser-ai/Tesser-Llama-3-Ko-8B
- NousResearch/Hermes-3-Llama-3.1-8B
- meta-llama/Meta-Llama-3.1-8B
- maum-ai/Llama-3-MAAL-8B-Instruct-v0.1
- Saxo/Linkbricks-Horizon-AI-Korean-llama-3.1-sft-dpo-8B
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) merge method, with [Saxo/Linkbricks-Horizon-AI-Korean-llama-3.1-sft-dpo-8B](https://huggingface.co/Saxo/Linkbricks-Horizon-AI-Korean-llama-3.1-sft-dpo-8B) as the base model.

### Models Merged

The following models were included in the merge:

* [Saxo/Linkbricks-Horizon-AI-Korean-llama3-sft-dpo-8b-base](https://huggingface.co/Saxo/Linkbricks-Horizon-AI-Korean-llama3-sft-dpo-8b-base)
* [tesser-ai/Tesser-Llama-3-Ko-8B](https://huggingface.co/tesser-ai/Tesser-Llama-3-Ko-8B)
* [NousResearch/Hermes-3-Llama-3.1-8B](https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B)
* [meta-llama/Meta-Llama-3.1-8B](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B)
* [maum-ai/Llama-3-MAAL-8B-Instruct-v0.1](https://huggingface.co/maum-ai/Llama-3-MAAL-8B-Instruct-v0.1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: tesser-ai/Tesser-Llama-3-Ko-8B
    layer_range: [0, 32]
    parameters:
      density: 0.5
      weight: 0.45
  - model: maum-ai/Llama-3-MAAL-8B-Instruct-v0.1
    layer_range: [0, 32]
    parameters:
      density: 0.5
      weight: 0.45
  - model: meta-llama/Meta-Llama-3.1-8B
    layer_range: [0, 32]
    parameters:
      density: 0.5
      weight: 0.45
  - model: NousResearch/Hermes-3-Llama-3.1-8B
    layer_range: [0, 32]
    parameters:
      density: 0.5
      weight: 0.45
  - model: Saxo/Linkbricks-Horizon-AI-Korean-llama3-sft-dpo-8b-base
    layer_range: [0, 32]
    parameters:
      density: 0.5
      weight: 0.45
  - model: Saxo/Linkbricks-Horizon-AI-Korean-llama-3.1-sft-dpo-8B
    layer_range: [0, 32]
    parameters:
      density: 0.5
      weight: 0.45
merge_method: dare_ties
base_model: Saxo/Linkbricks-Horizon-AI-Korean-llama-3.1-sft-dpo-8B
dtype: bfloat16
```
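In this configuration, each model contributes its delta (task vector) relative to the base at `weight: 0.45`, and `density: 0.5` is the fraction of each delta's entries randomly retained before rescaling. The sketch below is illustrative only: a single-tensor view of the DARE drop-and-rescale step followed by TIES-style sign election. The function name `dare_ties_merge` is hypothetical, and mergekit's actual implementation operates over all model tensors and differs in detail (e.g. normalization options).

```python
# Illustrative sketch of dare_ties on one parameter tensor; not mergekit's code.
import torch


def dare_ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor],
                    density: float = 0.5, weight: float = 0.45) -> torch.Tensor:
    """Merge one parameter tensor from several fine-tunes into `base`."""
    deltas = []
    for ft in finetuned:
        delta = ft - base                         # task vector relative to the base
        mask = torch.rand_like(delta) < density   # DARE: keep each entry with prob `density`
        delta = delta * mask / density            # rescale survivors to preserve expectation
        deltas.append(weight * delta)             # per-model weight from the YAML config
    stacked = torch.stack(deltas)
    # TIES sign election: keep only contributions whose sign agrees with the
    # aggregate sign in each coordinate, then sum the agreeing deltas.
    sign = torch.sign(stacked.sum(dim=0))
    agreed = torch.where(torch.sign(stacked) == sign, stacked, torch.zeros_like(stacked))
    return base + agreed.sum(dim=0)
```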
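The merge itself is typically reproduced by saving the YAML above to a file and running the `mergekit-yaml` CLI. A rough Python equivalent, assuming mergekit's documented `run_merge` API (check the mergekit README for the current surface; the file and output paths here are placeholders):

```python
# Sketch of reproducing this merge via mergekit's Python API.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# "dare_ties_config.yaml" is a placeholder for the YAML configuration above.
with open("dare_ties_config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged-model",                # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),       # use a GPU if one is available
        copy_tokenizer=True,                  # copy the base model's tokenizer
    ),
)
```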
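Since the card declares `library_name: transformers`, the merged checkpoint can be loaded like any other Llama 3.1 model. A minimal usage sketch, where `"your-namespace/merged-model"` is a placeholder for wherever the merged weights are stored:

```python
# Load and sample from the merged model with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/merged-model"  # placeholder repo or local path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's `dtype: bfloat16`
    device_map="auto",
)

prompt = "Hello, please introduce yourself."  # a Korean prompt also fits this merge
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```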