---
license: mit
base_model: microsoft/deberta-v3-small
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: copilot_relex_v1_with_context
    results: []
---

copilot_relex_v1_with_context

This model is a fine-tuned version of microsoft/deberta-v3-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0299
  • Accuracy: 0.0075
  • F1: 0.0127
  • Precision: 0.0064
  • Recall: 0.8358
  • Learning Rate: 0.0
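
A minimal loading-and-inference sketch is shown below. The repository id `bobbyw/copilot_relex_v1_with_context` and the sequence-classification head are assumptions inferred from this card rather than documented facts, and the expected input format for the relation-extraction task is not described here.

```python
# Minimal inference sketch. Assumptions (not documented in this card):
# the checkpoint lives at "bobbyw/copilot_relex_v1_with_context" and uses
# a standard sequence-classification head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "bobbyw/copilot_relex_v1_with_context"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "Example input text."  # placeholder; the training input format is undocumented
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label[pred_id])
```

Given the high recall but very low precision reported above, predictions from this checkpoint are probably best treated as high-sensitivity candidates rather than final labels.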

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
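
For reference, a sketch of how these hyperparameters map onto Hugging Face `TrainingArguments` is given below; the dataset and preprocessing are not documented in this card, so the `Trainer` and dataset wiring is omitted.

```python
# Sketch of the training configuration implied by the hyperparameters above.
# The Trainer/dataset wiring is omitted because the training data is undocumented.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="copilot_relex_v1_with_context",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",  # evaluation ran once per epoch, per the table below
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default
# optimizer settings, so no explicit optimizer override is needed.
```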

Training results

Training Loss  Epoch  Step  Validation Loss  Accuracy  F1  Precision  Recall  Learning Rate
No log 1.0 26 0.5156 0.0531 0.0154 0.0078 0.9701 0.0000
No log 2.0 52 0.3270 0.0077 0.0152 0.0077 1.0 0.0000
No log 3.0 78 0.1951 0.0077 0.0152 0.0077 1.0 0.0000
No log 4.0 104 0.1153 0.0077 0.0152 0.0077 1.0 0.0000
No log 5.0 130 0.0759 0.0077 0.0152 0.0077 1.0 0.0000
No log 6.0 156 0.0584 0.0077 0.0152 0.0077 1.0 0.0000
No log 7.0 182 0.0503 0.0077 0.0152 0.0077 1.0 0.0000
No log 8.0 208 0.0462 0.0077 0.0152 0.0077 1.0 0.0000
No log 9.0 234 0.0440 0.0077 0.0152 0.0077 1.0 0.0000
No log 10.0 260 0.0427 0.0077 0.0152 0.0077 1.0 0.0000
No log 11.0 286 0.0419 0.0077 0.0152 0.0077 1.0 0.0000
No log 12.0 312 0.0413 0.0077 0.0152 0.0077 1.0 0.0000
No log 13.0 338 0.0410 0.0077 0.0152 0.0077 1.0 0.0000
No log 14.0 364 0.0407 0.0077 0.0152 0.0077 1.0 0.0000
No log 15.0 390 0.0405 0.0077 0.0152 0.0077 1.0 0.0000
No log 16.0 416 0.0403 0.0077 0.0152 0.0077 1.0 0.0000
No log 17.0 442 0.0402 0.0077 0.0152 0.0077 1.0 0.0000
No log 18.0 468 0.0400 0.0077 0.0152 0.0077 1.0 0.0000
No log 19.0 494 0.0399 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 20.0 520 0.0397 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 21.0 546 0.0388 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 22.0 572 0.0388 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 23.0 598 0.0387 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 24.0 624 0.0375 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 25.0 650 0.0376 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 26.0 676 0.0369 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 27.0 702 0.0367 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 28.0 728 0.0373 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 29.0 754 0.0362 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 30.0 780 0.0361 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 31.0 806 0.0358 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 32.0 832 0.0355 0.0077 0.0152 0.0077 1.0 0.0000
0.1144 33.0 858 0.0329 0.0073 0.0145 0.0073 0.9552 0.0000
0.1144 34.0 884 0.0327 0.0078 0.0152 0.0077 1.0 0.0000
0.1144 35.0 910 0.0328 0.0074 0.0147 0.0074 0.9701 0.0000
0.1144 36.0 936 0.0324 0.0075 0.0147 0.0074 0.9701 0.0000
0.1144 37.0 962 0.0316 0.0075 0.0147 0.0074 0.9701 0.0000
0.1144 38.0 988 0.0326 0.0075 0.0145 0.0073 0.9552 0.0000
0.029 39.0 1014 0.0312 0.0074 0.0145 0.0073 0.9552 0.0000
0.029 40.0 1040 0.0313 0.0072 0.0141 0.0071 0.9254 0.0000
0.029 41.0 1066 0.0320 0.0073 0.0143 0.0072 0.9403 0.0000
0.029 42.0 1092 0.0316 0.0074 0.0145 0.0073 0.9552 0.0000
0.029 43.0 1118 0.0310 0.0072 0.0136 0.0069 0.8955 0.0000
0.029 44.0 1144 0.0311 0.0072 0.0141 0.0071 0.9254 0.0000
0.029 45.0 1170 0.0310 0.0072 0.0127 0.0064 0.8358 0.0000
0.029 46.0 1196 0.0312 0.0071 0.0134 0.0067 0.8806 0.0000
0.029 47.0 1222 0.0308 0.0071 0.0134 0.0067 0.8806 0.0000
0.029 48.0 1248 0.0312 0.0072 0.0136 0.0069 0.8955 0.0000
0.029 49.0 1274 0.0309 0.0073 0.0136 0.0069 0.8955 0.0000
0.029 50.0 1300 0.0307 0.0070 0.0129 0.0065 0.8507 1e-05
0.029 51.0 1326 0.0303 0.0071 0.0134 0.0067 0.8806 0.0000
0.029 52.0 1352 0.0307 0.0073 0.0134 0.0067 0.8806 0.0000
0.029 53.0 1378 0.0309 0.0073 0.0134 0.0067 0.8806 0.0000
0.029 54.0 1404 0.0312 0.0072 0.0136 0.0069 0.8955 0.0000
0.029 55.0 1430 0.0303 0.0073 0.0136 0.0069 0.8955 9e-06
0.029 56.0 1456 0.0300 0.0071 0.0132 0.0066 0.8657 0.0000
0.029 57.0 1482 0.0301 0.0069 0.0125 0.0063 0.8209 0.0000
0.0205 58.0 1508 0.0302 0.0072 0.0132 0.0066 0.8657 0.0000
0.0205 59.0 1534 0.0303 0.0071 0.0129 0.0065 0.8507 0.0000
0.0205 60.0 1560 0.0308 0.0073 0.0132 0.0066 0.8657 0.0000
0.0205 61.0 1586 0.0309 0.0074 0.0136 0.0069 0.8955 0.0000
0.0205 62.0 1612 0.0306 0.0078 0.0130 0.0065 0.8507 0.0000
0.0205 63.0 1638 0.0308 0.0077 0.0130 0.0065 0.8507 0.0000
0.0205 64.0 1664 0.0303 0.0071 0.0127 0.0064 0.8358 0.0000
0.0205 65.0 1690 0.0312 0.0077 0.0132 0.0066 0.8657 7e-06
0.0205 66.0 1716 0.0304 0.0073 0.0132 0.0066 0.8657 0.0000
0.0205 67.0 1742 0.0305 0.0073 0.0132 0.0066 0.8657 0.0000
0.0205 68.0 1768 0.0304 0.0074 0.0132 0.0066 0.8657 0.0000
0.0205 69.0 1794 0.0306 0.0072 0.0129 0.0065 0.8507 0.0000
0.0205 70.0 1820 0.0314 0.0080 0.0134 0.0068 0.8806 6e-06
0.0205 71.0 1846 0.0314 0.0075 0.0132 0.0066 0.8657 0.0000
0.0205 72.0 1872 0.0307 0.0075 0.0132 0.0066 0.8657 0.0000
0.0205 73.0 1898 0.0300 0.0075 0.0127 0.0064 0.8358 0.0000
0.0205 74.0 1924 0.0301 0.0072 0.0127 0.0064 0.8358 0.0000
0.0205 75.0 1950 0.0297 0.0075 0.0132 0.0066 0.8657 5e-06
0.0205 76.0 1976 0.0306 0.0075 0.0130 0.0065 0.8507 0.0000
0.016 77.0 2002 0.0299 0.0073 0.0125 0.0063 0.8209 0.0000
0.016 78.0 2028 0.0301 0.0074 0.0125 0.0063 0.8209 0.0000
0.016 79.0 2054 0.0301 0.0078 0.0127 0.0064 0.8358 0.0000
0.016 80.0 2080 0.0306 0.0078 0.0130 0.0065 0.8507 0.0000
0.016 81.0 2106 0.0302 0.0073 0.0125 0.0063 0.8209 0.0000
0.016 82.0 2132 0.0305 0.0073 0.0129 0.0065 0.8507 0.0000
0.016 83.0 2158 0.0303 0.0073 0.0127 0.0064 0.8358 0.0000
0.016 84.0 2184 0.0302 0.0072 0.0129 0.0065 0.8507 0.0000
0.016 85.0 2210 0.0302 0.0072 0.0127 0.0064 0.8358 3e-06
0.016 86.0 2236 0.0299 0.0072 0.0125 0.0063 0.8209 0.0000
0.016 87.0 2262 0.0296 0.0069 0.0125 0.0063 0.8209 0.0000
0.016 88.0 2288 0.0299 0.0073 0.0127 0.0064 0.8358 0.0000
0.016 89.0 2314 0.0297 0.0072 0.0125 0.0063 0.8209 0.0000
0.016 90.0 2340 0.0296 0.0073 0.0125 0.0063 0.8209 0.0000
0.016 91.0 2366 0.0299 0.0071 0.0125 0.0063 0.8209 0.0000
0.016 92.0 2392 0.0293 0.0071 0.0125 0.0063 0.8209 0.0000
0.016 93.0 2418 0.0301 0.0073 0.0127 0.0064 0.8358 0.0000
0.016 94.0 2444 0.0294 0.0071 0.0125 0.0063 0.8209 0.0000
0.016 95.0 2470 0.0296 0.0072 0.0125 0.0063 0.8209 0.0000
0.016 96.0 2496 0.0298 0.0074 0.0125 0.0063 0.8209 0.0000
0.0136 97.0 2522 0.0299 0.0073 0.0127 0.0064 0.8358 0.0000
0.0136 98.0 2548 0.0298 0.0074 0.0125 0.0063 0.8209 0.0000
0.0136 99.0 2574 0.0299 0.0075 0.0127 0.0064 0.8358 0.0000
0.0136 100.0 2600 0.0299 0.0075 0.0127 0.0064 0.8358 0.0

Framework versions

  • Transformers 4.40.1
  • PyTorch 2.2.1+cu121
  • Datasets 2.19.0
  • Tokenizers 0.19.1
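
To check that a local environment matches these pins, a small sketch:

```python
# Print installed versions to compare against the pins listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.40.1
print("PyTorch:", torch.__version__)              # expected 2.2.1+cu121
print("Datasets:", datasets.__version__)          # expected 2.19.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.19.1
```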