Merge recipe

Stack A
- model: MiquDPO    layer_range: [0, 20]
- model: MediumRare layer_range: [10, 30]
- model: Senku      layer_range: [25, 45]
- model: MediumRare layer_range: [35, 55]
- model: Senku      layer_range: [50, 70]
- model: MiquDPO    layer_range: [60, 80]

Stack B
- model: Senku      layer_range: [0, 20]
- model: MiquDPO    layer_range: [10, 30]
- model: MediumRare layer_range: [25, 45]
- model: MiquDPO    layer_range: [35, 55]
- model: MediumRare layer_range: [50, 70]
- model: Senku      layer_range: [60, 80]

Stack C
- model: MediumRare layer_range: [0, 20]
- model: Senku      layer_range: [10, 30]
- model: MiquDPO    layer_range: [25, 45]
- model: Senku      layer_range: [35, 55]
- model: MiquDPO    layer_range: [50, 70]
- model: MediumRare layer_range: [60, 80]

Linear merge
- model: StackA  weight: 1.0
- model: StackB  weight: 1.0
- model: StackC  weight: 1.0
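For reference, here is a minimal sketch of how one interleaved stack and the final equal-weight merge could be written as mergekit configs, assuming the `passthrough` and `linear` merge methods. The model names (`MiquDPO`, `MediumRare`, `Senku`) and file names are placeholders for the actual repository paths, and `dtype` is an assumed setting, not taken from the original recipe.

```yaml
# stack-a.yml (hypothetical file name)
# Builds Stack A by concatenating overlapping layer slices from the three
# source models; each slice keeps 20 transformer layers, so the result has
# 6 x 20 = 120 layers.
merge_method: passthrough
dtype: float16
slices:
  - sources:
      - model: MiquDPO      # placeholder for the actual repo path
        layer_range: [0, 20]
  - sources:
      - model: MediumRare
        layer_range: [10, 30]
  - sources:
      - model: Senku
        layer_range: [25, 45]
  - sources:
      - model: MediumRare
        layer_range: [35, 55]
  - sources:
      - model: Senku
        layer_range: [50, 70]
  - sources:
      - model: MiquDPO
        layer_range: [60, 80]
```

Stacks B and C follow the same pattern with the source order rotated. Because all three stacks end up with the same 120-layer shape, they can be averaged with a plain linear merge:

```yaml
# final-linear.yml (hypothetical file name)
# Equal-weight average of the three 120-layer stacks.
merge_method: linear
dtype: float16
models:
  - model: StackA   # placeholder path to the Stack A output
    parameters:
      weight: 1.0
  - model: StackB
    parameters:
      weight: 1.0
  - model: StackC
    parameters:
      weight: 1.0
```

Each config would be run with mergekit's CLI, e.g. `mergekit-yaml stack-a.yml ./StackA`, and the three stack outputs then fed into the final linear config.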