---
license: llama2
language:
- en
pipeline_tag: conversational
tags:
- frankenmerge
- merge
- 124b
---
# BigWeave v7.1 124b

The BigWeave models aim to experimentally identify merge settings that increase model performance. The version number merely tracks various attempts; it is not a quality indicator. Only merges that demonstrate good performance are retained and shared.

# Prompting Format

Vicuna and Alpaca (both formats are sketched in the example after the merge configuration).

# Merge process

This is a merge of Xwin-LM/Xwin-LM-70B-V0.1 and Sao10K/Euryale-1.3-L2-70B. It uses the same layer-interleaving pattern as alpindale/goliath-120b, but with the layer ranges corrected, since goliath omits some layers (see [this thread](https://huggingface.co/ChuckMcSneed/WinterGoliath-123b/discussions/2#65d324693ecda975d83089d3)).

Merge configuration:
```
slices:
  - sources:
    - model: Xwin-LM/Xwin-LM-70B-V0.1
      layer_range: [0, 16]
  - sources:
    - model: Sao10K/Euryale-1.3-L2-70B
      layer_range: [8, 24]
  - sources:
    - model: Xwin-LM/Xwin-LM-70B-V0.1
      layer_range: [16, 32]
  - sources:
    - model: Sao10K/Euryale-1.3-L2-70B
      layer_range: [24, 40]
  - sources:
    - model: Xwin-LM/Xwin-LM-70B-V0.1
      layer_range: [32, 48]
  - sources:
    - model: Sao10K/Euryale-1.3-L2-70B
      layer_range: [40, 56]
  - sources:
    - model: Xwin-LM/Xwin-LM-70B-V0.1
      layer_range: [48, 64]
  - sources:
    - model: Sao10K/Euryale-1.3-L2-70B
      layer_range: [56, 72]
  - sources:
    - model: Xwin-LM/Xwin-LM-70B-V0.1
      layer_range: [64, 80]
merge_method: passthrough
dtype: float16
```
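To reproduce the merge, the configuration above can be fed to [mergekit](https://github.com/arcee-ai/mergekit). Below is a minimal sketch using mergekit's Python API; the config file name and output path are placeholder assumptions, so consult mergekit's README for current usage:
```
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Assumes the YAML configuration above was saved as bigweave-v7.1.yml.
with open("bigweave-v7.1.yml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    out_path="./BigWeave-v7.1-124b",   # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU for tensor copies if available
        copy_tokenizer=True,             # carry a tokenizer over into the output
        lazy_unpickle=True,              # reduce peak RAM while loading shards
    ),
)
```
The same merge can also be run from the command line with mergekit's `mergekit-yaml` entry point, e.g. `mergekit-yaml bigweave-v7.1.yml ./BigWeave-v7.1-124b --cuda`.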
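For reference, here is a sketch of the two prompt formats mentioned under "Prompting Format", using the common Vicuna v1.1 and Alpaca templates; the repository id below is a placeholder for wherever the merged model is published, and the system lines can be adjusted to taste:
```
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "llmixer/BigWeave-v7.1-124b"  # placeholder repo id

# Vicuna v1.1-style prompt.
vicuna_prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions. "
    "USER: Write a haiku about merging models. ASSISTANT:"
)

# Alpaca-style prompt; either format should work with this merge.
alpaca_prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a haiku about merging models.\n\n### Response:\n"
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

inputs = tokenizer(vicuna_prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```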