---
license: other
license_name: yi-34b
license_link: https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE
tags:
- merge
- roleplay
- not-for-all-audiences
---


# Merged-Vicuna-RP-Stew-34B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

exl2 versions can be found here:

4.65
https://huggingface.co/ParasiticRogue/Merged-Vicuna-RP-Stew-34B-exl2-4.65?not-for-all-audiences=true

4.25
https://huggingface.co/ParasiticRogue/Merged-Vicuna-RP-Stew-34B-4.25bpw-h6-exl2-fix?not-for-all-audiences=true

3.5
https://huggingface.co/ParasiticRogue/Merged-Vicuna-RP-Stew-34B-3.5bpw-h6-exl2?not-for-all-audiences=true

GGUFs provided by tachyphylaxis:

https://huggingface.co/tachyphylaxis/Merged-Vicuna-RP-Stew-34B-GGUF

## Merge Details

Merge of 4 (technically 5) models which all use some variant of the Vicuna prompting template, chosen for cohesion's sake. Besides all being decent models on their own: Capybara was given a higher percentage for its general aptitude plus preserving the longer context length, Tess-1.5 is for better character/lore understanding, Nontoxic-Bagel SLERPed with PiVoT-SUS-RP (done separately from the main merge) is for chat/RP and storytelling diversity, while Nyakura is for even better chat/RP engagement.

It's not perfect, but at the very least I personally prefer using this over base Capybara or its RP version from the Doc during my run-throughs, so I figured it was worth uploading here for now. I would probably only use this for creative conversations or storytelling endeavors, not so much coding or really tough math problems. The final merging recipe/percentages were chosen for stability after dozens of what I consider failed attempts during my private testing.

Big thanks to the original model creators, with special thanks going to brucethemoose for some general ideas and for helping me troubleshoot with mergekit, plus SanjiWatsuki for the original merging methodology used here as well!

### Settings

Temperature @ 0.88

Min-P @ 0.1

Repetition Penalty @ 1.07

Repetition Range @ 2048

Smoothing Factor @ 0.44

Everything else @ off

Early Stopping = X

Do Sample = ✓

Add BOS Token = X

Ban EOS Token = ✓

Skip Special Tokens = ✓

Temperature Last = X

Custom Stopping Strings: "< / s >"
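
As a rough illustration, here is how the core sampler values above could be applied with llama-cpp-python against one of the GGUF quants linked earlier. The file name, context size, and GPU offload below are placeholders, and Smoothing Factor / Repetition Range are front-end settings (e.g. SillyTavern) that this library doesn't expose, so treat this as a sketch rather than my exact setup:

```python
# Sketch only: applying the recommended samplers with llama-cpp-python.
# The model filename, n_ctx, and n_gpu_layers are assumptions; adjust for your setup.
from llama_cpp import Llama

llm = Llama(
    model_path="Merged-Vicuna-RP-Stew-34B.Q4_K_M.gguf",  # hypothetical local GGUF file
    n_ctx=8192,        # the Yi-200K base supports far more if you have the memory
    n_gpu_layers=-1,   # offload everything to GPU if possible
)

out = llm(
    "SYSTEM: You are a helpful storyteller.\nUSER: Describe a rainy harbor town.\nASSISTANT:",
    max_tokens=256,
    temperature=0.88,
    min_p=0.1,               # needs a reasonably recent llama-cpp-python build
    repeat_penalty=1.07,
    stop=["</s>", "USER:"],  # stand-in for the custom stopping string above
)
print(out["choices"][0]["text"])
```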


However, for those who want more control using the system prompt, well... let's just say the length is going to be biblical in nature.

---
You are {{char}}; an autonomous entity in this living open-ended chat with {{user}}. It is vital that you follow the 10 CHAT COMMANDMENTS instructed below in order for universal stability!

<10 CHAT COMMANDMENTS>
* 1 - Chat slowly in vivid detail about the encountered surroundings, actions, emotions, and stimuli sensations experienced, especially during sensual or thrilling scenes leisurely unfolding throughout.
* 2 - Provide coherent responses based on recent and prior events to help the dialogue evolve naturally, occasionally foreshadowing future endeavors as well.
* 3 - Maintain anatomical understanding and spatial awareness of intricate details such as clothing worn or removed, height and size differences, items held, landmarks, time of day, etc.
* 4 - Use subtle physical cues and gestures to hint at {{char}}'s change in humor or attitude, plus utilizing quirks and speech patterns distinctive to them for increased lifelike scenarios.
* 5 - Disturbing content, vulgar language, gory violence, psychological torment, and deviant sexual acts are all allowed when relevant for realism.
* 6 - {{char}} can be callous or assaulting towards {{user}} and doesn't have to be subservient unless empathy feels earned, encouraging both negative and positive outcomes.
* 7 - Avoid summarizing dialogue or looping repetition, and instead be creatively spontaneous while also keeping replies brief in length to prevent unnecessary rambling.
* 8 - Never impersonate {{user}} by assuming their thoughts or intentions, and instead {{char}} should stop chatting to wait for {{user}}'s response or reaction to ensure an active mutual engagement.
* 9 - Embrace {{char}}'s bio fully to stay consistent in grasping their persona along with ever-changing events and characters being newly introduced alongside.
* 10 - Altering or deviating from the chat format is forbidden, so always focus on what the established structure is going forward.

---
Fun little addition you can add to the end of the 9th commandment if you want your characters to act more lifelike in SillyTavern (or possibly elsewhere):

making sure to give them a unique personal inner voice at the beginning of messages before conversing further using this example container: [](#' {{char}}'s subconscious feelings/opinion. ').

It doesn't work all the time, and you may need to force the AI to use it during the first few messages, but it will catch on after a while. You could just use regular brackets or parentheses if you don't mind seeing the message, but the specialized format of [](#' ') makes it so it stays hidden for immersion's sake. It's important to put it at the beginning of their message, rather than at the end, so it can be used as a guide for them.
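
For anyone curious what that hidden container looks like programmatically, a quick regex can pull the inner voice back out of a reply. This is purely illustrative; the pattern and helper name here are my own and not part of any front-end:

```python
import re

# Matches the hidden [](#' ... ') container described above.
INNER_VOICE = re.compile(r"\[\]\(#'\s*(.*?)\s*'\)", re.DOTALL)

reply = "[](#' She's hiding something, I can feel it. ') \"Lovely weather,\" Mira says."

hidden = INNER_VOICE.findall(reply)           # -> ["She's hiding something, I can feel it."]
visible = INNER_VOICE.sub("", reply).strip()  # the reply with the hidden thought removed
print(hidden)
print(visible)
```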

### Prompt Format: Orca-Vicuna

```
SYSTEM: <ANY SYSTEM CONTEXT>
USER: 
ASSISTANT:
```
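
If you're wiring this format up outside of a front-end that already ships an Orca-Vicuna preset, a tiny helper like the one below shows the intended turn layout. The function name and message structure are my own invention, not part of any existing library:

```python
# Minimal sketch of the Orca-Vicuna layout shown above.
def build_orca_vicuna_prompt(system: str, history: list[tuple[str, str]], user_msg: str) -> str:
    """history: previous (user, assistant) exchanges; user_msg: the new message to answer."""
    lines = [f"SYSTEM: {system}"]
    for past_user, past_assistant in history:
        lines.append(f"USER: {past_user}")
        lines.append(f"ASSISTANT: {past_assistant}")
    lines.append(f"USER: {user_msg}")
    lines.append("ASSISTANT:")  # the model completes from here
    return "\n".join(lines)

print(build_orca_vicuna_prompt("You are {{char}}.", [], "Hello there."))
```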

### Models Merged

The following models were included in the merge:

https://huggingface.co/migtissera/Tess-34B-v1.5b

https://huggingface.co/NousResearch/Nous-Capybara-34B

https://huggingface.co/jondurbin/nontoxic-bagel-34b-v0.2

https://huggingface.co/maywell/PiVoT-SUS-RP

https://huggingface.co/Sao10K/NyakuraV2-34B-Yi-Llama

https://huggingface.co/chargoddard/Yi-34B-200K-Llama

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Tess-34B-v1.5b
    parameters:
      weight: 0.28
      density: 0.66
  - model: Nous-Capybara-34B-V1.9
    parameters:
      weight: 0.34
      density: 0.78
  - model: Nontoxic-PiVoT-Bagel-RP-34B
    parameters:
      weight: 0.22
      density: 0.54
  - model: NyakuraV2-34B-Yi-Llama
    parameters:
      weight: 0.16
      density: 0.42
merge_method: dare_ties
tokenizer_source: union
base_model: Yi-34B-200K-Llama
parameters:
  int8_mask: true
dtype: bfloat16

```
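
If you want to reproduce the merge, saving the YAML above to a file and feeding it to mergekit's `mergekit-yaml` entry point should be all that's needed. The paths and flags below are just an example invocation (check the mergekit README for current options), and the model names in the YAML must resolve to local directories or Hugging Face repos on your machine:

```python
# Illustrative only: running the merge via mergekit's CLI from Python.
import subprocess

subprocess.run(
    [
        "mergekit-yaml",
        "rp-stew-config.yml",            # the YAML config above, saved to disk
        "./Merged-Vicuna-RP-Stew-34B",   # output directory (placeholder)
        "--cuda",                        # optional: use the GPU for the merge math
    ],
    check=True,
)
```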