---
license: cc-by-nc-4.0
tags:
- merge
- conversational
- multi-task
pipeline_tag: text-generation
---

# Winter Garden 7B - δ - "Charming"

It was mentioned that we are in the open AI dark winter, so I thought I would make myself a nice winter garden.

## An experiment

I performed the same type of merge as in the previous model, but with a different set of models. I took the following models:

* Mistral-7B-v0.1

and merged in

* KuNoichi-DPO-v2-7B
* Datura_7B
* AlphaMonarch-7B
* LemonadeRP-4.5.3
* Prima-LelantaclesV6-7b
* FuseChat-7B-VaRM
* Capricorn-7B-DPO
* eros-7b-test
* NeuralMarcoro14-7B
* StrangeMerges_6-7B-dare_ties
* Multi-Verse-RP-7B
* WestLake-7B-v2-laser-truthy-dpo
* Noromaid-7B-0.4-DPO
* Thespis-Balanced-7b-v1
* InfinityRP-v1-7B
* winter-garden-7b-gamma

in an iterative DARE-TIES tree merge, ordering the merges by tensor-relative cosine similarity until the branches resolve to a single model. A rough sketch of this ordering appears at the end of this card.

## Chat Template

These models were selected because they follow my chat template, which is '' ended turns. A lot of models follow this template by default because they were trained with end padding, so it is a natural choice for chat and should be highly compatible with ST (SillyTavern).

```
Tom: Hello, how are you?
Jane: I am fine, thank you.
```

A small prompt-formatting sketch also appears at the end of this card.

## Why?

The purpose of all of these models is to act as a base for me to train on. So far this one has the best multi-turn conversational ability, and it should get very good at following long-form conversations after a bit of tweaking.
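
Below is a minimal, self-contained sketch of the similarity-ordered tree merge described under "An experiment". It is not the script used for this model: `mean_tensor_cosine`, `simple_average`, and `tree_merge` are illustrative names, and a plain weight average stands in for the actual DARE-TIES step, which would normally be done with a dedicated merge tool.

```python
# Conceptual sketch: repeatedly pick the pair of checkpoints whose weights are
# most similar (mean per-tensor cosine similarity) and merge them, until only
# one model remains. State dicts are assumed to share identical keys/shapes.
from itertools import combinations
import torch


def mean_tensor_cosine(a: dict, b: dict) -> float:
    """Average cosine similarity over matching tensors in two state dicts."""
    sims = [
        torch.nn.functional.cosine_similarity(
            a[k].flatten().float(), b[k].flatten().float(), dim=0
        ).item()
        for k in a.keys() & b.keys()
    ]
    return sum(sims) / len(sims)


def simple_average(a: dict, b: dict) -> dict:
    """Placeholder merge step: plain averaging instead of a true DARE-TIES merge."""
    return {k: (a[k] + b[k]) / 2 for k in a.keys() & b.keys()}


def tree_merge(models: list[dict]) -> dict:
    """Greedy, similarity-ordered tree merge that resolves to a single model."""
    models = list(models)
    while len(models) > 1:
        # Find the most similar pair among the remaining branches.
        i, j = max(
            combinations(range(len(models)), 2),
            key=lambda ij: mean_tensor_cosine(models[ij[0]], models[ij[1]]),
        )
        merged = simple_average(models[i], models[j])
        # Replace the two parents with their merged child.
        models = [m for k, m in enumerate(models) if k not in (i, j)] + [merged]
    return models[0]
```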
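
And a minimal sketch of assembling a prompt in the "Name: message" style shown in the Chat Template section. `format_chat` is an illustrative helper; the end-of-turn terminator is left as a parameter (defaulting to a newline purely for display), since the template's actual terminator is not spelled out above.

```python
# Join (speaker, message) pairs into a single prompt string in the
# "Name: message" style shown in the Chat Template example.
def format_chat(turns: list[tuple[str, str]], end_of_turn: str = "\n") -> str:
    """Format a conversation as consecutive 'Name: message' turns."""
    return "".join(f"{name}: {message}{end_of_turn}" for name, message in turns)


prompt = format_chat([
    ("Tom", "Hello, how are you?"),
    ("Jane", "I am fine, thank you."),
])
```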