# sft-microsoft-phi2-on-5w1h_dialoges
This model is a fine-tuned version of microsoft/phi-2 (the training dataset is not otherwise documented in this card). It achieves the following results on the evaluation set:
- Loss: 1.0265
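A minimal usage sketch, assuming the checkpoint is a PEFT (LoRA) adapter published as ghost613/sft-microsoft-phi2-on-5w1h_dialoges on top of microsoft/phi-2; the prompt below is illustrative, since the training prompt template is not documented here:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "microsoft/phi-2"
adapter_id = "ghost613/sft-microsoft-phi2-on-5w1h_dialoges"

# Load the base model, then attach the fine-tuned adapter weights on top of it.
tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
base = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,  # may be required for phi-2 on some transformers versions
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

# Illustrative 5W1H-style prompt; the actual training format is not documented in this card.
prompt = "Who attended the meeting, and when and where was it held?"
inputs = tokenizer(prompt, return_tensors="pt").to(base.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```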
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows this list):
- learning_rate: 0.0005
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 5
- total_train_batch_size: 10
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_steps: 50
- training_steps: 1000
- mixed_precision_training: Native AMP
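The card does not document the dataset, the PEFT/LoRA configuration, or the trainer wiring, so the sketch below only shows how the hyperparameters listed above could map onto a Transformers `TrainingArguments` object; `output_dir` and the logging cadence are assumptions, with the evaluation interval taken from the 20-step cadence in the results table.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; names not in the card are assumptions.
training_args = TrainingArguments(
    output_dir="sft-microsoft-phi2-on-5w1h_dialoges",  # assumed output directory
    learning_rate=5e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=5,   # effective train batch size of 10
    max_steps=1000,
    lr_scheduler_type="constant",
    warmup_steps=50,
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
    optim="adamw_torch",             # Adam-style optimizer, betas=(0.9, 0.999), eps=1e-8
    evaluation_strategy="steps",
    eval_steps=20,                   # matches the 20-step evaluation interval below
    logging_steps=20,
)
```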
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
1.3814 | 0.04 | 20 | 1.0983 |
1.0977 | 0.08 | 40 | 1.0658 |
1.0743 | 0.13 | 60 | 1.0564 |
1.0782 | 0.17 | 80 | 1.0473 |
1.1027 | 0.21 | 100 | 1.0441 |
1.0761 | 0.25 | 120 | 1.0426 |
1.0741 | 0.29 | 140 | 1.0334 |
1.0251 | 0.33 | 160 | 1.0318 |
1.0733 | 0.38 | 180 | 1.0302 |
1.0358 | 0.42 | 200 | 1.0279 |
1.0738 | 0.46 | 220 | 1.0247 |
1.0178 | 0.5 | 240 | 1.0216 |
1.0539 | 0.54 | 260 | 1.0206 |
1.0411 | 0.59 | 280 | 1.0191 |
1.0932 | 0.63 | 300 | 1.0157 |
1.0703 | 0.67 | 320 | 1.0169 |
1.0338 | 0.71 | 340 | 1.0147 |
1.062 | 0.75 | 360 | 1.0141 |
1.0425 | 0.79 | 380 | 1.0123 |
1.0573 | 0.84 | 400 | 1.0133 |
0.9972 | 0.88 | 420 | 1.0119 |
1.0762 | 0.92 | 440 | 1.0125 |
1.0677 | 0.96 | 460 | 1.0090 |
1.0387 | 1.0 | 480 | 1.0069 |
0.9682 | 1.05 | 500 | 1.0138 |
1.0124 | 1.09 | 520 | 1.0107 |
0.9533 | 1.13 | 540 | 1.0122 |
0.9743 | 1.17 | 560 | 1.0115 |
0.95 | 1.21 | 580 | 1.0144 |
0.9816 | 1.25 | 600 | 1.0137 |
1.0166 | 1.3 | 620 | 1.0134 |
0.9776 | 1.34 | 640 | 1.0144 |
0.9634 | 1.38 | 660 | 1.0171 |
0.9573 | 1.42 | 680 | 1.0172 |
0.9698 | 1.46 | 700 | 1.0072 |
0.964 | 1.51 | 720 | 1.0073 |
0.9791 | 1.55 | 740 | 1.0065 |
0.9432 | 1.59 | 760 | 1.0077 |
0.9453 | 1.63 | 780 | 1.0057 |
0.9807 | 1.67 | 800 | 1.0048 |
0.9592 | 1.71 | 820 | 1.0070 |
0.9972 | 1.76 | 840 | 1.0044 |
0.9546 | 1.8 | 860 | 0.9995 |
0.9769 | 1.84 | 880 | 1.0032 |
0.9531 | 1.88 | 900 | 1.0028 |
1.0124 | 1.92 | 920 | 1.0130 |
0.96 | 1.96 | 940 | 1.0041 |
0.9912 | 2.01 | 960 | 1.0047 |
0.8583 | 2.05 | 980 | 1.0173 |
0.8038 | 2.09 | 1000 | 1.0265 |
### Framework versions
- PEFT 0.7.1
- Transformers 4.36.2
- Pytorch 2.1.2
- Datasets 2.15.0
- Tokenizers 0.15.1