---
license: apache-2.0
base_model: bert-base-uncased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: best_model-sst-2-32-87
    results: []
---

# best_model-sst-2-32-87

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 1.0406
- Accuracy: 0.8438
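
A minimal inference sketch is below; the checkpoint path is a placeholder (substitute a local path or Hub repo id), and the output labels depend on the saved config:

```python
from transformers import pipeline

# Placeholder checkpoint location; substitute a local path or Hub repo id.
classifier = pipeline("text-classification", model="best_model-sst-2-32-87")

# Output labels (e.g. LABEL_0 / LABEL_1) come from the saved config and may
# not be remapped to human-readable class names.
print(classifier("a gripping, beautifully shot film"))
```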

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 150
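
As a rough guide, these settings map onto `transformers.TrainingArguments` as sketched below; the `output_dir` and the per-epoch evaluation strategy are assumptions, and the original training script is not part of this card:

```python
from transformers import TrainingArguments

# Adam betas/epsilon are left at the transformers defaults, (0.9, 0.999) and
# 1e-08, which match the values reported above.
training_args = TrainingArguments(
    output_dir="best_model-sst-2-32-87",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=150,
    evaluation_strategy="epoch",  # assumption: the table below logs one eval per epoch
)
```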

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 1.2928 | 0.8438 |
| No log | 2.0 | 4 | 1.2923 | 0.8438 |
| No log | 3.0 | 6 | 1.2917 | 0.8438 |
| No log | 4.0 | 8 | 1.2902 | 0.8438 |
| 0.7235 | 5.0 | 10 | 1.2884 | 0.8438 |
| 0.7235 | 6.0 | 12 | 1.2856 | 0.8438 |
| 0.7235 | 7.0 | 14 | 1.2829 | 0.8438 |
| 0.7235 | 8.0 | 16 | 1.2800 | 0.8281 |
| 0.7235 | 9.0 | 18 | 1.2769 | 0.8281 |
| 0.5899 | 10.0 | 20 | 1.2742 | 0.8281 |
| 0.5899 | 11.0 | 22 | 1.2710 | 0.8281 |
| 0.5899 | 12.0 | 24 | 1.2662 | 0.8281 |
| 0.5899 | 13.0 | 26 | 1.2590 | 0.8281 |
| 0.5899 | 14.0 | 28 | 1.2466 | 0.8281 |
| 0.6318 | 15.0 | 30 | 1.2287 | 0.8281 |
| 0.6318 | 16.0 | 32 | 1.2138 | 0.8281 |
| 0.6318 | 17.0 | 34 | 1.2024 | 0.8281 |
| 0.6318 | 18.0 | 36 | 1.1924 | 0.8281 |
| 0.6318 | 19.0 | 38 | 1.1838 | 0.8281 |
| 0.4743 | 20.0 | 40 | 1.1729 | 0.8281 |
| 0.4743 | 21.0 | 42 | 1.1591 | 0.8281 |
| 0.4743 | 22.0 | 44 | 1.1527 | 0.8281 |
| 0.4743 | 23.0 | 46 | 1.1459 | 0.8281 |
| 0.4743 | 24.0 | 48 | 1.1407 | 0.8281 |
| 0.3414 | 25.0 | 50 | 1.1351 | 0.8281 |
| 0.3414 | 26.0 | 52 | 1.1305 | 0.8281 |
| 0.3414 | 27.0 | 54 | 1.1230 | 0.8281 |
| 0.3414 | 28.0 | 56 | 1.1087 | 0.8281 |
| 0.3414 | 29.0 | 58 | 1.0831 | 0.8281 |
| 0.3141 | 30.0 | 60 | 1.0555 | 0.8281 |
| 0.3141 | 31.0 | 62 | 1.0313 | 0.8438 |
| 0.3141 | 32.0 | 64 | 1.0141 | 0.8594 |
| 0.3141 | 33.0 | 66 | 1.0063 | 0.8438 |
| 0.3141 | 34.0 | 68 | 0.9990 | 0.8438 |
| 0.1594 | 35.0 | 70 | 0.9916 | 0.8438 |
| 0.1594 | 36.0 | 72 | 0.9884 | 0.8438 |
| 0.1594 | 37.0 | 74 | 0.9922 | 0.8438 |
| 0.1594 | 38.0 | 76 | 1.0013 | 0.8281 |
| 0.1594 | 39.0 | 78 | 1.0097 | 0.8281 |
| 0.1018 | 40.0 | 80 | 1.0209 | 0.8281 |
| 0.1018 | 41.0 | 82 | 1.0341 | 0.8281 |
| 0.1018 | 42.0 | 84 | 1.0352 | 0.8281 |
| 0.1018 | 43.0 | 86 | 1.0284 | 0.8281 |
| 0.1018 | 44.0 | 88 | 1.0236 | 0.8281 |
| 0.0404 | 45.0 | 90 | 1.0214 | 0.8438 |
| 0.0404 | 46.0 | 92 | 1.0237 | 0.8594 |
| 0.0404 | 47.0 | 94 | 1.0233 | 0.875 |
| 0.0404 | 48.0 | 96 | 1.0223 | 0.875 |
| 0.0404 | 49.0 | 98 | 1.0187 | 0.875 |
| 0.0052 | 50.0 | 100 | 1.0160 | 0.8594 |
| 0.0052 | 51.0 | 102 | 1.0134 | 0.8594 |
| 0.0052 | 52.0 | 104 | 1.0107 | 0.8438 |
| 0.0052 | 53.0 | 106 | 1.0083 | 0.8438 |
| 0.0052 | 54.0 | 108 | 1.0061 | 0.8438 |
| 0.0003 | 55.0 | 110 | 1.0043 | 0.8438 |
| 0.0003 | 56.0 | 112 | 1.0016 | 0.8438 |
| 0.0003 | 57.0 | 114 | 0.9994 | 0.8438 |
| 0.0003 | 58.0 | 116 | 0.9955 | 0.8438 |
| 0.0003 | 59.0 | 118 | 0.9902 | 0.8438 |
| 0.0003 | 60.0 | 120 | 0.9852 | 0.8438 |
| 0.0003 | 61.0 | 122 | 0.9806 | 0.8438 |
| 0.0003 | 62.0 | 124 | 0.9791 | 0.8438 |
| 0.0003 | 63.0 | 126 | 0.9794 | 0.8438 |
| 0.0003 | 64.0 | 128 | 0.9802 | 0.8438 |
| 0.0003 | 65.0 | 130 | 0.9809 | 0.8438 |
| 0.0003 | 66.0 | 132 | 0.9816 | 0.8438 |
| 0.0003 | 67.0 | 134 | 0.9821 | 0.8438 |
| 0.0003 | 68.0 | 136 | 0.9779 | 0.8438 |
| 0.0003 | 69.0 | 138 | 0.9746 | 0.8281 |
| 0.0003 | 70.0 | 140 | 0.9719 | 0.8281 |
| 0.0003 | 71.0 | 142 | 0.9699 | 0.8281 |
| 0.0003 | 72.0 | 144 | 0.9684 | 0.8438 |
| 0.0003 | 73.0 | 146 | 0.9673 | 0.8438 |
| 0.0003 | 74.0 | 148 | 0.9665 | 0.8438 |
| 0.0002 | 75.0 | 150 | 0.9660 | 0.8438 |
| 0.0002 | 76.0 | 152 | 0.9657 | 0.8438 |
| 0.0002 | 77.0 | 154 | 0.9605 | 0.8438 |
| 0.0002 | 78.0 | 156 | 0.9545 | 0.8438 |
| 0.0002 | 79.0 | 158 | 0.9485 | 0.8438 |
| 0.0004 | 80.0 | 160 | 0.9431 | 0.8438 |
| 0.0004 | 81.0 | 162 | 0.9384 | 0.8438 |
| 0.0004 | 82.0 | 164 | 0.9349 | 0.8438 |
| 0.0004 | 83.0 | 166 | 0.9324 | 0.8438 |
| 0.0004 | 84.0 | 168 | 0.9309 | 0.8438 |
| 0.0002 | 85.0 | 170 | 0.9309 | 0.8438 |
| 0.0002 | 86.0 | 172 | 0.9313 | 0.8438 |
| 0.0002 | 87.0 | 174 | 0.9331 | 0.8438 |
| 0.0002 | 88.0 | 176 | 0.9357 | 0.8438 |
| 0.0002 | 89.0 | 178 | 0.9380 | 0.8438 |
| 0.0002 | 90.0 | 180 | 0.9404 | 0.8438 |
| 0.0002 | 91.0 | 182 | 0.9428 | 0.8438 |
| 0.0002 | 92.0 | 184 | 0.9449 | 0.8438 |
| 0.0002 | 93.0 | 186 | 0.9472 | 0.8438 |
| 0.0002 | 94.0 | 188 | 0.9495 | 0.8438 |
| 0.0002 | 95.0 | 190 | 0.9521 | 0.8438 |
| 0.0002 | 96.0 | 192 | 0.9545 | 0.8438 |
| 0.0002 | 97.0 | 194 | 0.9576 | 0.8438 |
| 0.0002 | 98.0 | 196 | 0.9619 | 0.8438 |
| 0.0002 | 99.0 | 198 | 0.9658 | 0.8438 |
| 0.0002 | 100.0 | 200 | 0.9692 | 0.8438 |
| 0.0002 | 101.0 | 202 | 0.9723 | 0.8438 |
| 0.0002 | 102.0 | 204 | 0.9748 | 0.8438 |
| 0.0002 | 103.0 | 206 | 0.9781 | 0.8438 |
| 0.0002 | 104.0 | 208 | 0.9808 | 0.8438 |
| 0.0001 | 105.0 | 210 | 0.9832 | 0.8438 |
| 0.0001 | 106.0 | 212 | 0.9856 | 0.8438 |
| 0.0001 | 107.0 | 214 | 0.9884 | 0.8438 |
| 0.0001 | 108.0 | 216 | 0.9906 | 0.8438 |
| 0.0001 | 109.0 | 218 | 0.9903 | 0.8438 |
| 0.0002 | 110.0 | 220 | 0.9888 | 0.8438 |
| 0.0002 | 111.0 | 222 | 0.9874 | 0.8438 |
| 0.0002 | 112.0 | 224 | 0.9863 | 0.8438 |
| 0.0002 | 113.0 | 226 | 0.9854 | 0.8438 |
| 0.0002 | 114.0 | 228 | 0.9848 | 0.8438 |
| 0.0001 | 115.0 | 230 | 0.9878 | 0.8438 |
| 0.0001 | 116.0 | 232 | 0.9905 | 0.8438 |
| 0.0001 | 117.0 | 234 | 0.9926 | 0.8438 |
| 0.0001 | 118.0 | 236 | 0.9952 | 0.8438 |
| 0.0001 | 119.0 | 238 | 1.0010 | 0.8438 |
| 0.0001 | 120.0 | 240 | 1.0054 | 0.8438 |
| 0.0001 | 121.0 | 242 | 1.0086 | 0.8438 |
| 0.0001 | 122.0 | 244 | 1.0124 | 0.8438 |
| 0.0001 | 123.0 | 246 | 1.0155 | 0.8438 |
| 0.0001 | 124.0 | 248 | 1.0180 | 0.8438 |
| 0.0001 | 125.0 | 250 | 1.0201 | 0.8438 |
| 0.0001 | 126.0 | 252 | 1.0219 | 0.8438 |
| 0.0001 | 127.0 | 254 | 1.0235 | 0.8438 |
| 0.0001 | 128.0 | 256 | 1.0249 | 0.8438 |
| 0.0001 | 129.0 | 258 | 1.0261 | 0.8438 |
| 0.0001 | 130.0 | 260 | 1.0271 | 0.8438 |
| 0.0001 | 131.0 | 262 | 1.0279 | 0.8438 |
| 0.0001 | 132.0 | 264 | 1.0287 | 0.8438 |
| 0.0001 | 133.0 | 266 | 1.0293 | 0.8438 |
| 0.0001 | 134.0 | 268 | 1.0297 | 0.8438 |
| 0.0001 | 135.0 | 270 | 1.0301 | 0.8438 |
| 0.0001 | 136.0 | 272 | 1.0305 | 0.8438 |
| 0.0001 | 137.0 | 274 | 1.0309 | 0.8438 |
| 0.0001 | 138.0 | 276 | 1.0314 | 0.8438 |
| 0.0001 | 139.0 | 278 | 1.0324 | 0.8438 |
| 0.0001 | 140.0 | 280 | 1.0339 | 0.8438 |
| 0.0001 | 141.0 | 282 | 1.0352 | 0.8438 |
| 0.0001 | 142.0 | 284 | 1.0364 | 0.8438 |
| 0.0001 | 143.0 | 286 | 1.0373 | 0.8438 |
| 0.0001 | 144.0 | 288 | 1.0381 | 0.8438 |
| 0.0001 | 145.0 | 290 | 1.0388 | 0.8438 |
| 0.0001 | 146.0 | 292 | 1.0394 | 0.8438 |
| 0.0001 | 147.0 | 294 | 1.0401 | 0.8438 |
| 0.0001 | 148.0 | 296 | 1.0404 | 0.8438 |
| 0.0001 | 149.0 | 298 | 1.0404 | 0.8438 |
| 0.0001 | 150.0 | 300 | 1.0406 | 0.8438 |
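
The reported accuracies are all multiples of 1/64 (e.g. 0.8438 ≈ 54/64), which suggests a 64-example evaluation set; that set is not published with the card. As an illustration only, the sketch below runs a single evaluation pass with `Trainer`; both the dataset choice (the card itself says "unknown dataset", though the model name suggests SST-2) and the checkpoint path are assumptions:

```python
import numpy as np
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer, Trainer

# Placeholder checkpoint path; substitute wherever this model is stored.
checkpoint = "best_model-sst-2-32-87"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# Assumption: evaluate on the GLUE SST-2 validation split; the card's actual
# evaluation set is unknown.
dataset = load_dataset("glue", "sst2", split="validation")
encoded = dataset.map(
    lambda batch: tokenizer(batch["sentence"], truncation=True),
    batched=True,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": float((np.argmax(logits, axis=-1) == labels).mean())}

# With a tokenizer supplied, Trainer pads each batch dynamically by default.
trainer = Trainer(model=model, tokenizer=tokenizer, compute_metrics=compute_metrics)
print(trainer.evaluate(eval_dataset=encoded))
```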

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.4.0
- Tokenizers 0.13.3