---
license: cc-by-nc-4.0
tags:
- merge
- conversational
- multi-task
pipeline_tag: text-generation
---

# Winter Garden 7B - δ - "Charming"

It was mentioned that we are in the open-AI dark winter, so I thought I would make myself a nice winter garden.

## An experiment

I performed the same type of merge as in the previous model, but with a different set of models. I took the following base model:

* Mistral-7B-v0.1

and merged in

* KuNoichi-DPO-v2-7B
* Datura_7B
* AlphaMonarch-7B
* LemonadeRP-4.5.3
* Prima-LelantaclesV6-7b
* FuseChat-7B-VaRM
* Capricorn-7B-DPO
* eros-7b-test
* NeuralMarcoro14-7B
* StrangeMerges_6-7B-dare_ties
* Multi-Verse-RP-7B
* WestLake-7B-v2-laser-truthy-dpo
* Noromaid-7B-0.4-DPO
* Thespis-Balanced-7b-v1
* InfinityRP-v1-7B
* winter-garden-7b-gamma

in an iterative DARE-TIES tree merge, ordering the merges by tensor-wise cosine similarity until the merge branches resolve to a single model.
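
For illustration, here is a minimal Python sketch of that ordering logic. It assumes checkpoints are loaded as plain state dicts and that a hypothetical `merge_pair` helper performs a single DARE-TIES step (in practice a tool such as mergekit would do this); merging the *most* similar pair first is one plausible reading of the ordering described above.

```python
import torch
import torch.nn.functional as F

def state_dict_similarity(a: dict, b: dict) -> float:
    """Mean cosine similarity across the tensors two checkpoints share."""
    sims = [
        F.cosine_similarity(a[k].flatten().float(), b[k].flatten().float(), dim=0).item()
        for k in a.keys() & b.keys()
    ]
    return sum(sims) / len(sims)

def tree_merge(models: list[dict], merge_pair) -> dict:
    """Greedily merge the most similar pair of checkpoints until one remains."""
    models = list(models)
    while len(models) > 1:
        # Score every remaining pair and pick the closest one.
        i, j = max(
            ((i, j) for i in range(len(models)) for j in range(i + 1, len(models))),
            key=lambda p: state_dict_similarity(models[p[0]], models[p[1]]),
        )
        merged = merge_pair(models[i], models[j])  # one DARE-TIES step (hypothetical helper)
        models = [m for k, m in enumerate(models) if k not in (i, j)]
        models.append(merged)
    return models[0]
```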
 
## Chat Template

These models were selected because they follow my chat template, which terminates each turn with '</s>'. Many models follow this template by default because they were trained with end-of-sequence padding, so it is a natural choice for chat and should be highly compatible with SillyTavern (ST).

```
Tom: Hello, how are you?</s>
Jane: I am fine, thank you.</s>
```
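
For concreteness, a tiny helper that produces prompts in this format might look like the following (the `render_chat` name is mine, not part of any library):

```python
def render_chat(turns: list[tuple[str, str]]) -> str:
    """Join (speaker, message) pairs into '</s>'-terminated turns."""
    return "\n".join(f"{speaker}: {message}</s>" for speaker, message in turns)

print(render_chat([
    ("Tom", "Hello, how are you?"),
    ("Jane", "I am fine, thank you."),
]))
# Tom: Hello, how are you?</s>
# Jane: I am fine, thank you.</s>
```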

## Why?

The purpose of all of these models is to act as a base for me to train on. So far, this one has the best multi-turn conversational ability, and it should become very good at following long-form conversations after a bit of tweaking.