MarsupialAI committed 6555c41 (parent: 528ff5c): Update README.md

README.md CHANGED

@@ -11,7 +11,7 @@ tags:
 # Kitchen Sink 103b
 
 ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/65a531bc7ec6af0f95c707b1/QFmPxADHAqMf3Wb_Xt1ry.jpeg)
 
-This model is a rotating-stack merge of three 70b models in a 103b (120 layer) configuration inspired by Venus 103b. The result is a large model that contains a little bit of everything - including the kitchen sink.
+This model is a rotating-stack merge of three 70b models in a 103b (120 layer) configuration inspired by Venus 103b. The result of this "frankenmerge" is a large model that contains a little bit of everything - including the kitchen sink. RP, chat, storywriting, and instruct are all well supported. It may or may not code well - I lack the expertise to test it in that capacity.
 
 Component models for the rotating stack are
 - royallab/Aetheria-L2-70B
@@ -33,4 +33,4 @@ Seems to have the strongest affinity for Alpaca prompts, but Vicuna works as well
 
 
 # WTF is a rotating-stack merge?
-Inspired by Undi's experiments with stacked merges, Jeb Carter found that output quality and model initiative could be significantly improved by reversing the model order in the stack, and then doing a linear merge between the original and reversed stacks. That is what I did here
+Inspired by Undi's experiments with stacked merges, Jeb Carter found that output quality and model initiative could be significantly improved by reversing the model order in the stack, and then doing a linear merge between the original and reversed stacks. That is what I did here. I created three passthrough stacked merges using the three source models (rotating the model order in each stack), then did a linear merge of all three stacks. The exact merge configs can be found in the recipe.txt file.
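
As a rough illustration of that final paragraph, here is a minimal sketch of what one of the three passthrough stacks could look like, assuming a mergekit-style YAML config - the tool, the layer ranges, and the two unnamed component models below are assumptions, and the actual configs are in recipe.txt. A 70b Llama-2 model has 80 layers, so three overlapping 40-layer slices yield the 120-layer, roughly 103b stack described above.

```yaml
# Hypothetical config for ONE of the three passthrough stacks.
# The model order rotates between the three stacks, e.g.:
#   stack A: model1 -> model2 -> model3
#   stack B: model2 -> model3 -> model1
#   stack C: model3 -> model1 -> model2
slices:
  - sources:
      - model: royallab/Aetheria-L2-70B
        layer_range: [0, 40]            # illustrative range, not the actual recipe
  - sources:
      - model: second-70b-component     # placeholder - see the model card's full list
        layer_range: [20, 60]
  - sources:
      - model: third-70b-component      # placeholder
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
```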
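
The final step would then be a plain linear merge of the three stacks. Again a hedged sketch: the paths are hypothetical and the equal weights are an assumption; the true values are whatever recipe.txt specifies.

```yaml
# Hypothetical final merge: average the three rotated stacks.
# mergekit normalizes linear weights by default, so 1.0/1.0/1.0 is an equal blend.
models:
  - model: ./stack-a   # hypothetical local paths to the three passthrough stacks
    parameters:
      weight: 1.0
  - model: ./stack-b
    parameters:
      weight: 1.0
  - model: ./stack-c
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16
```

Because each stack contains the same three source models in a rotated order, the linear merge averages every layer position across three different components, which is what separates a rotating-stack merge from a single passthrough frankenmerge.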