sophosympatheia committed
Commit 309f8e5 • Parent: 03e2f6e

Update README.md

Renaming the model to avoid any confusion about its relationship (or lack thereof) to the zephyr-7b models.
README.md
CHANGED
@@ -9,8 +9,6 @@ language:
 
 ### Overview
 
-*This model is in no way associated with the zephyr-7b models, the teams involved with those models, or the datasets used to train those models. I just liked this model name.*
-
 This model is a frankenmerge of two custom 70b merges I made in November 2023 that were inspired by or descended from
 my [xwin-stellarbright-erp-70b-v2 model](https://huggingface.co/sophosympatheia/xwin-stellarbright-erp-70b-v2). It features 120 layers and should weigh in at 103b parameters.
 
@@ -111,7 +109,7 @@ If you save this as a .json file, you can import it directly into Silly Tavern.
 "first_output_sequence": "",
 "last_output_sequence": "ASSISTANT(follow all narrative instructions; consider all available story information before replying so that all the details remain consistent; only write text as {{char}}):",
 "activation_regex": "",
-"name": "
+"name": "Rogue Rose"
 }
 ```
 ### Quantizations