zephyr-beta-wizardLM-2-merge-7B / mergekit_config.yml
slices:
  - sources:
      - model: HuggingFaceH4/zephyr-7b-beta
        layer_range: [0, 32]
      - model: lucyknada/microsoft_WizardLM-2-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: HuggingFaceH4/zephyr-7b-beta
parameters:
  t:
    - value: 0.5
dtype: bfloat16
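# Usage sketch (not part of the original config): this SLERP merge blends the
# two models' weights across layers 0-32, with t = 0.5 giving an even
# interpolation between the base model and the other source. With mergekit
# installed, a config like this is typically applied via the mergekit-yaml CLI,
# for example:
#
#   mergekit-yaml mergekit_config.yml ./zephyr-beta-wizardLM-2-merge-7B
#
# The output directory name above is illustrative.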