{
  "adapter_type": "lora",
  "lora_alpha": 16,
  "lora_dropout": 0.1,
  "lora_r": 8,
  "target_modules": ["q_proj", "v_proj"]
}
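
This is a LoRA adapter configuration: rank-8 low-rank updates with a scaling alpha of 16 and 10% dropout, attached to the attention query and value projections. A minimal sketch of how these values would be applied, assuming the consumer is Hugging Face's peft library (whose LoraConfig uses the parameter names r, lora_alpha, lora_dropout, and target_modules rather than this file's adapter_type/lora_r keys); the base model name below is a hypothetical placeholder:

```python
# Minimal sketch, assuming Hugging Face's peft and transformers libraries.
# "facebook/opt-350m" is a hypothetical example model, not named in the config.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

config = LoraConfig(
    r=8,                                  # "lora_r": rank of the low-rank update matrices
    lora_alpha=16,                        # scaling factor; effective scale is lora_alpha / r
    lora_dropout=0.1,                     # dropout on the LoRA input activations
    target_modules=["q_proj", "v_proj"],  # adapt the attention query/value projections
)

base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")
model = get_peft_model(base_model, config)
model.print_trainable_parameters()  # only the LoRA matrices are trainable
```

With r=8 and lora_alpha=16, the LoRA update is scaled by alpha/r = 2, and restricting the adapters to q_proj and v_proj matches the defaults used in the original LoRA paper's main experiments.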