MarsupialAI committed on
Commit 09d625e
1 Parent(s): f5776b7

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -26,4 +26,4 @@ Components of those models include Nous-Hermes-Llama2-70b, Xwin-LM-7B-V0.1, Myth
 
 # WTF is a rotating-stack merge?
 
-Inspired by Undi's work with stacked merges, Jeb Carter found that performance and creativity could be significantly improved by reversing the model order in the stack, and then doing a linear merge between the original and reversed stacks. That is what I did here, creating three stacked merges with the three source models, and then doing a 1:1:1 linear merge of all three stacks. The exact merge configs can be found in the recipe.txt file.
+Inspired by Undi's experiments with stacked merges, Jeb Carter found that output quality and model initiative could be significantly improved by reversing the model order in the stack, and then doing a linear merge between the original and reversed stacks. That is what I did here, creating three stacked merges with the three source models, and then doing a 1:1:1 linear merge of all three stacks. The exact merge configs can be found in the recipe.txt file.
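For readers unfamiliar with the mechanics, the sketch below shows roughly what a rotating-stack recipe can look like, assuming the merges were done with mergekit-style YAML configs (the usual tooling for this kind of merge). It is illustrative only and is not the contents of recipe.txt: the model names (model_a, model_b, model_c), stack paths, layer ranges, and dtype are all placeholders.

```yaml
# Illustrative sketch only -- not the actual recipe.txt.
# Config 1 of 4: one passthrough stack of the three source models.
# Stacks 2 and 3 would repeat this structure with the model order rotated.
slices:
  - sources:
      - model: model_a           # placeholder for the first source model
        layer_range: [0, 32]     # placeholder layer range
  - sources:
      - model: model_b           # placeholder for the second source model
        layer_range: [8, 32]     # placeholder layer range
  - sources:
      - model: model_c           # placeholder for the third source model
        layer_range: [8, 32]     # placeholder layer range
merge_method: passthrough
dtype: float16
```

```yaml
# Final config: 1:1:1 linear merge of the three stacked merges.
models:
  - model: stack_1               # placeholder path to the first stack
    parameters:
      weight: 1.0
  - model: stack_2               # placeholder path to the second stack
    parameters:
      weight: 1.0
  - model: stack_3               # placeholder path to the third stack
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16
```

Each config would be run with a command along the lines of `mergekit-yaml config.yml ./output-dir`; the actual settings used for this model are in recipe.txt.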