gemma-2-9b-ko-quant / mergekit_config.yml
models:
  - model: gemma-2-9b-it-lora-merge
    parameters:
      density: 0.7
      weight: 0.7
  - model: google/gemma-2-9b-it
    parameters:
      density: 0.7
      weight: 0.7
merge_method: dare_ties
base_model: google/gemma-2-9b
parameters:
  int8_mask: true
  normalize: true
  weight: 0.7
  density: 0.7
dtype: bfloat16
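
# Usage: a minimal sketch, assuming mergekit is installed and this config is
# saved as mergekit_config.yml; the output directory name is an assumption.
#   pip install mergekit
#   mergekit-yaml mergekit_config.yml ./gemma-2-9b-ko-merged --cuda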