Soaring-70b-lora / adapter_config.json
{"r": 16, "lora_alpha": 32, "target_modules": ["q_proj", "k_proj", "v_proj"], "peft_type": "LORA"}