---
base_model:
- Delta-Vector/Odin-9B
- anthracite-org/magnum-v3-9b-chatml
library_name: transformers
pipeline_tag: text-generation
tags:
- mergekit
- merge
license: gemma
---

# Magnolia-v1-Gemma2-8k-9B

This repo contains a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

Text generation length is variable. Creativity is on the wild side. Attention could use improvement, and model safety is questionable. I'm uploading it anyway, as this model has potential value as a merge contribution.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.

### Models Merged

The following models were included in the merge:
* [Delta-Vector/Odin-9B](https://huggingface.co/Delta-Vector/Odin-9B)
* [anthracite-org/magnum-v3-9b-chatml](https://huggingface.co/anthracite-org/magnum-v3-9b-chatml)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Delta-Vector/Odin-9B
  - model: anthracite-org/magnum-v3-9b-chatml
merge_method: slerp
base_model: Delta-Vector/Odin-9B
parameters:
  t:
    - value: 0.4
dtype: bfloat16
```
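
As a rough illustration of what the `slerp` method computes per tensor, the sketch below interpolates two weight tensors along the arc between them at `t = 0.4` (the value used in the configuration above). This is a simplified, hypothetical reconstruction for intuition only, not mergekit's actual implementation, which also handles per-layer `t` schedules, dtype management, and other edge cases.

```python
# Illustrative sketch of spherical linear interpolation (SLERP) between two
# weight tensors; not mergekit's actual code.
import torch

def slerp(a: torch.Tensor, b: torch.Tensor, t: float = 0.4, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors of the same shape."""
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two weight vectors.
    omega = torch.arccos(torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0))
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        mixed = (1.0 - t) * a_flat + t * b_flat
    else:
        mixed = (torch.sin((1.0 - t) * omega) / sin_omega) * a_flat \
              + (torch.sin(t * omega) / sin_omega) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)
```

## Usage

A minimal generation sketch with `transformers` follows. The repo id is a placeholder for wherever this merge is hosted, the prompt and sampling settings are arbitrary, and whether the bundled tokenizer ships a chat template inherited from either parent model is not guaranteed.

```python
# Minimal generation sketch; the repo id below is a placeholder, not the
# actual Hugging Face path of this upload.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<your-namespace>/Magnolia-v1-Gemma2-8k-9B"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short scene set in a lighthouse."}]
# Assumes the tokenizer config includes a chat template.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```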