aisuko committed on
Commit
386957e
1 Parent(s): 545b2bd

fine-tuning-Phi2-with-webglm-qa-with-lora

Files changed (3)
  1. README.md +2 -2
  2. adapter_config.json +5 -5
  3. adapter_model.safetensors +1 -1
README.md CHANGED
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.5317
+- Loss: 0.5311
 
 ## Model description
 
@@ -51,7 +51,7 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| 2.4146 | 0.5 | 100 | 0.5317 |
+| 2.4092 | 0.5 | 100 | 0.5311 |
 
 
 ### Framework versions
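The README change only updates the reported evaluation loss (0.5317 to 0.5311) to match the retrained adapter. As a minimal sketch (not part of this commit), the adapter can typically be loaded on top of the base model with PEFT roughly as follows; the adapter repo id below is a hypothetical placeholder:

```python
# Sketch: load the LoRA adapter from this repo on top of microsoft/phi-2.
# The adapter repo id is a placeholder, not confirmed by this commit.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "microsoft/phi-2"
adapter_id = "aisuko/fine-tuning-Phi2-with-webglm-qa-with-lora"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base_model, adapter_id)  # applies adapter_model.safetensors

prompt = "Question: What does LoRA fine-tuning change?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```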
adapter_config.json CHANGED
@@ -19,12 +19,12 @@
     "rank_pattern": {},
     "revision": null,
     "target_modules": [
-        "k_proj",
-        "q_proj",
-        "fc1",
         "dense",
-        "fc2",
-        "v_proj"
+        "fc1",
+        "q_proj",
+        "v_proj",
+        "k_proj",
+        "fc2"
     ],
     "task_type": "CAUSAL_LM"
 }
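This adapter_config.json change only reorders the target_modules list; PEFT matches target modules by name, so the reordering does not change which Phi-2 layers receive LoRA weights. For reference, a minimal LoraConfig covering the same modules could be written roughly as below; the rank, alpha, and dropout values are placeholders, not values taken from this commit:

```python
# Sketch of a PEFT LoraConfig targeting the Phi-2 modules listed in
# adapter_config.json. r, lora_alpha and lora_dropout are placeholder values.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,              # placeholder rank
    lora_alpha=32,     # placeholder scaling factor
    lora_dropout=0.05, # placeholder dropout
    target_modules=["q_proj", "k_proj", "v_proj", "dense", "fc1", "fc2"],
    task_type="CAUSAL_LM",
)
```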
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:147636a4c84a9f07ce58fb1cf8d23e0ff1c1de43c6481558f064779e66dcbd24
+oid sha256:e596aa8aac7400f21d45616ccff707acdf51e1881bf2272b94cd76781abe15e1
 size 94422368