File size: 98 Bytes
{"r": 16, "lora_alpha": 32, "target_modules": ["q_proj", "k_proj", "v_proj"], "peft_type": "LORA"}