magstral-123b / mergekit_config.yml
slices:
  - sources:
      - model: mistral-large
        layer_range: [0, 88]
      - model: magnum-v2-123b
        layer_range: [0, 88]
merge_method: slerp
base_model: mistral-large
parameters:
  t:
    - value: 0.5
dtype: bfloat16
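
For reference, below is a minimal sketch of what the slerp merge method does per weight tensor under this config: the corresponding tensors from mistral-large and magnum-v2-123b (layers 0-87) are spherically interpolated at t = 0.5 and cast to bfloat16. The slerp helper and tensor shapes here are illustrative, not mergekit's internal API; the actual merge would be produced by running mergekit's mergekit-yaml command on this file (e.g. mergekit-yaml mergekit_config.yml ./magstral-123b).

# Illustrative SLERP between two same-shaped weight tensors, as in merge_method: slerp with t = 0.5.
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # Flatten and normalize to measure the angle between the two weight vectors.
    v0_f, v1_f = v0.flatten().float(), v1.flatten().float()
    v0_n = v0_f / (v0_f.norm() + eps)
    v1_n = v1_f / (v1_f.norm() + eps)
    dot = torch.clamp(torch.dot(v0_n, v1_n), -1.0, 1.0)
    omega = torch.arccos(dot)
    # Nearly parallel tensors: fall back to plain linear interpolation.
    if omega.abs() < eps:
        return ((1 - t) * v0_f + t * v1_f).reshape(v0.shape).to(v0.dtype)
    sin_omega = torch.sin(omega)
    coeff0 = torch.sin((1 - t) * omega) / sin_omega
    coeff1 = torch.sin(t * omega) / sin_omega
    return (coeff0 * v0_f + coeff1 * v1_f).reshape(v0.shape).to(v0.dtype)

# Example: merge one pair of tensors at t = 0.5, matching dtype: bfloat16.
a = torch.randn(64, 64, dtype=torch.bfloat16)
b = torch.randn(64, 64, dtype=torch.bfloat16)
merged = slerp(0.5, a, b)

With a single t value of 0.5 the merge sits midway between the two parents for every layer; mergekit also accepts per-layer or per-parameter t schedules, but this config applies the same weight everywhere.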