---
license: apache-2.0
base_model: bert-base-uncased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: best_model-sst-2-32-21
    results: []
---

# best_model-sst-2-32-21

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:

- Loss: 1.1726
- Accuracy: 0.8281

## Model description

More information needed

## Intended uses & limitations

More information needed
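
Pending more details from the author, below is a minimal inference sketch. The repo id `simonycl/best_model-sst-2-32-21` is an assumption based on the card's title, and the output label names are not documented here.

```python
# Minimal inference sketch -- the repo id and label names are assumptions.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="simonycl/best_model-sst-2-32-21",  # assumed repo id
)

print(classifier("a gripping, beautifully shot film"))
# The returned labels (e.g. LABEL_0 / LABEL_1) depend on the fine-tuning
# config, which this card does not document.
```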

## Training and evaluation data

More information needed
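
The model name hints at GLUE SST-2, possibly subsampled to 32 examples with seed 21, but the card itself says the dataset is unknown. A hedged sketch of loading that data with the `datasets` library, purely as an assumption:

```python
# Hedged sketch: GLUE SST-2 and the 32-example / seed-21 reading of the
# model name are assumptions; the card itself says "unknown dataset".
from datasets import load_dataset

sst2 = load_dataset("glue", "sst2")
few_shot = sst2["train"].shuffle(seed=21).select(range(32))  # hypothetical subsample
print(few_shot[0]["sentence"], few_shot[0]["label"])
```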

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 150
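
These settings map onto the standard transformers `TrainingArguments` roughly as sketched below; the original training script is not published, so treat this as an approximation rather than the author's exact configuration.

```python
# Approximate TrainingArguments reconstructed from the list above;
# the author's actual training script is not published.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="best_model-sst-2-32-21",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=150,
    evaluation_strategy="epoch",  # assumed from the per-epoch results below
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default.
)
```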

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 1.3058 | 0.7812 |
| No log | 2.0 | 4 | 1.3056 | 0.7812 |
| No log | 3.0 | 6 | 1.3051 | 0.7812 |
| No log | 4.0 | 8 | 1.3041 | 0.7812 |
| 0.6356 | 5.0 | 10 | 1.3027 | 0.7812 |
| 0.6356 | 6.0 | 12 | 1.3014 | 0.7812 |
| 0.6356 | 7.0 | 14 | 1.3002 | 0.7812 |
| 0.6356 | 8.0 | 16 | 1.2983 | 0.7812 |
| 0.6356 | 9.0 | 18 | 1.2961 | 0.7812 |
| 0.4734 | 10.0 | 20 | 1.2940 | 0.7812 |
| 0.4734 | 11.0 | 22 | 1.2916 | 0.7812 |
| 0.4734 | 12.0 | 24 | 1.2888 | 0.7812 |
| 0.4734 | 13.0 | 26 | 1.2848 | 0.7812 |
| 0.4734 | 14.0 | 28 | 1.2796 | 0.7812 |
| 0.3467 | 15.0 | 30 | 1.2731 | 0.7812 |
| 0.3467 | 16.0 | 32 | 1.2665 | 0.7812 |
| 0.3467 | 17.0 | 34 | 1.2594 | 0.7812 |
| 0.3467 | 18.0 | 36 | 1.2530 | 0.7812 |
| 0.3467 | 19.0 | 38 | 1.2462 | 0.7969 |
| 0.2773 | 20.0 | 40 | 1.2393 | 0.7969 |
| 0.2773 | 21.0 | 42 | 1.2324 | 0.7969 |
| 0.2773 | 22.0 | 44 | 1.2262 | 0.7969 |
| 0.2773 | 23.0 | 46 | 1.2197 | 0.7969 |
| 0.2773 | 24.0 | 48 | 1.2104 | 0.7969 |
| 0.1546 | 25.0 | 50 | 1.1969 | 0.7969 |
| 0.1546 | 26.0 | 52 | 1.1807 | 0.7969 |
| 0.1546 | 27.0 | 54 | 1.1697 | 0.7969 |
| 0.1546 | 28.0 | 56 | 1.1534 | 0.7969 |
| 0.1546 | 29.0 | 58 | 1.1462 | 0.7969 |
| 0.1236 | 30.0 | 60 | 1.1457 | 0.7969 |
| 0.1236 | 31.0 | 62 | 1.1458 | 0.7969 |
| 0.1236 | 32.0 | 64 | 1.1435 | 0.7969 |
| 0.1236 | 33.0 | 66 | 1.1347 | 0.8125 |
| 0.1236 | 34.0 | 68 | 1.1284 | 0.8125 |
| 0.0234 | 35.0 | 70 | 1.1252 | 0.8125 |
| 0.0234 | 36.0 | 72 | 1.1232 | 0.8125 |
| 0.0234 | 37.0 | 74 | 1.1203 | 0.8125 |
| 0.0234 | 38.0 | 76 | 1.1165 | 0.8125 |
| 0.0234 | 39.0 | 78 | 1.1136 | 0.8125 |
| 0.0266 | 40.0 | 80 | 1.1115 | 0.8125 |
| 0.0266 | 41.0 | 82 | 1.1144 | 0.8125 |
| 0.0266 | 42.0 | 84 | 1.1234 | 0.8125 |
| 0.0266 | 43.0 | 86 | 1.1296 | 0.7969 |
| 0.0266 | 44.0 | 88 | 1.1241 | 0.8125 |
| 0.0068 | 45.0 | 90 | 1.1150 | 0.8125 |
| 0.0068 | 46.0 | 92 | 1.1060 | 0.8125 |
| 0.0068 | 47.0 | 94 | 1.0985 | 0.8125 |
| 0.0068 | 48.0 | 96 | 1.0928 | 0.8125 |
| 0.0068 | 49.0 | 98 | 1.0899 | 0.8125 |
| 0.0016 | 50.0 | 100 | 1.0867 | 0.8125 |
| 0.0016 | 51.0 | 102 | 1.0921 | 0.8125 |
| 0.0016 | 52.0 | 104 | 1.0992 | 0.8125 |
| 0.0016 | 53.0 | 106 | 1.1055 | 0.8125 |
| 0.0016 | 54.0 | 108 | 1.1111 | 0.8281 |
| 0.0014 | 55.0 | 110 | 1.1121 | 0.8125 |
| 0.0014 | 56.0 | 112 | 1.1059 | 0.8125 |
| 0.0014 | 57.0 | 114 | 1.0984 | 0.8125 |
| 0.0014 | 58.0 | 116 | 1.0892 | 0.8125 |
| 0.0014 | 59.0 | 118 | 1.0797 | 0.8125 |
| 0.0024 | 60.0 | 120 | 1.0721 | 0.8125 |
| 0.0024 | 61.0 | 122 | 1.0664 | 0.8125 |
| 0.0024 | 62.0 | 124 | 1.0645 | 0.8125 |
| 0.0024 | 63.0 | 126 | 1.0661 | 0.8125 |
| 0.0024 | 64.0 | 128 | 1.0679 | 0.8125 |
| 0.0009 | 65.0 | 130 | 1.0696 | 0.8125 |
| 0.0009 | 66.0 | 132 | 1.0718 | 0.8125 |
| 0.0009 | 67.0 | 134 | 1.0739 | 0.8125 |
| 0.0009 | 68.0 | 136 | 1.0782 | 0.8125 |
| 0.0009 | 69.0 | 138 | 1.0833 | 0.8125 |
| 0.001 | 70.0 | 140 | 1.0910 | 0.8125 |
| 0.001 | 71.0 | 142 | 1.1017 | 0.8125 |
| 0.001 | 72.0 | 144 | 1.1116 | 0.8125 |
| 0.001 | 73.0 | 146 | 1.1187 | 0.8125 |
| 0.001 | 74.0 | 148 | 1.1261 | 0.8281 |
| 0.0019 | 75.0 | 150 | 1.1337 | 0.8281 |
| 0.0019 | 76.0 | 152 | 1.1408 | 0.8281 |
| 0.0019 | 77.0 | 154 | 1.1447 | 0.8281 |
| 0.0019 | 78.0 | 156 | 1.1460 | 0.8281 |
| 0.0019 | 79.0 | 158 | 1.1471 | 0.8281 |
| 0.0007 | 80.0 | 160 | 1.1476 | 0.8281 |
| 0.0007 | 81.0 | 162 | 1.1378 | 0.8281 |
| 0.0007 | 82.0 | 164 | 1.1287 | 0.8281 |
| 0.0007 | 83.0 | 166 | 1.1212 | 0.8281 |
| 0.0007 | 84.0 | 168 | 1.1147 | 0.8281 |
| 0.0007 | 85.0 | 170 | 1.1090 | 0.8125 |
| 0.0007 | 86.0 | 172 | 1.1027 | 0.8125 |
| 0.0007 | 87.0 | 174 | 1.0971 | 0.8125 |
| 0.0007 | 88.0 | 176 | 1.0927 | 0.8125 |
| 0.0007 | 89.0 | 178 | 1.0898 | 0.8125 |
| 0.0006 | 90.0 | 180 | 1.0874 | 0.8125 |
| 0.0006 | 91.0 | 182 | 1.0852 | 0.8125 |
| 0.0006 | 92.0 | 184 | 1.0842 | 0.8125 |
| 0.0006 | 93.0 | 186 | 1.0864 | 0.8125 |
| 0.0006 | 94.0 | 188 | 1.0884 | 0.8125 |
| 0.0006 | 95.0 | 190 | 1.0907 | 0.8125 |
| 0.0006 | 96.0 | 192 | 1.0915 | 0.8125 |
| 0.0006 | 97.0 | 194 | 1.1069 | 0.8125 |
| 0.0006 | 98.0 | 196 | 1.1108 | 0.8125 |
| 0.0006 | 99.0 | 198 | 1.1150 | 0.8281 |
| 0.0025 | 100.0 | 200 | 1.1188 | 0.8281 |
| 0.0025 | 101.0 | 202 | 1.1223 | 0.8281 |
| 0.0025 | 102.0 | 204 | 1.1256 | 0.8281 |
| 0.0025 | 103.0 | 206 | 1.1305 | 0.8281 |
| 0.0025 | 104.0 | 208 | 1.1371 | 0.8281 |
| 0.0005 | 105.0 | 210 | 1.1437 | 0.8281 |
| 0.0005 | 106.0 | 212 | 1.1506 | 0.8281 |
| 0.0005 | 107.0 | 214 | 1.1325 | 0.8281 |
| 0.0005 | 108.0 | 216 | 1.1170 | 0.8281 |
| 0.0005 | 109.0 | 218 | 1.1045 | 0.8281 |
| 0.0004 | 110.0 | 220 | 1.0948 | 0.8281 |
| 0.0004 | 111.0 | 222 | 1.0876 | 0.8125 |
| 0.0004 | 112.0 | 224 | 1.0833 | 0.8281 |
| 0.0004 | 113.0 | 226 | 1.0805 | 0.8281 |
| 0.0004 | 114.0 | 228 | 1.0788 | 0.8281 |
| 0.0004 | 115.0 | 230 | 1.0779 | 0.8281 |
| 0.0004 | 116.0 | 232 | 1.0800 | 0.8281 |
| 0.0004 | 117.0 | 234 | 1.0818 | 0.8281 |
| 0.0004 | 118.0 | 236 | 1.0837 | 0.8281 |
| 0.0004 | 119.0 | 238 | 1.0866 | 0.8281 |
| 0.0004 | 120.0 | 240 | 1.0899 | 0.8281 |
| 0.0004 | 121.0 | 242 | 1.0929 | 0.8281 |
| 0.0004 | 122.0 | 244 | 1.0960 | 0.8281 |
| 0.0004 | 123.0 | 246 | 1.1016 | 0.8281 |
| 0.0004 | 124.0 | 248 | 1.1090 | 0.8281 |
| 0.0003 | 125.0 | 250 | 1.1159 | 0.8281 |
| 0.0003 | 126.0 | 252 | 1.1218 | 0.8281 |
| 0.0003 | 127.0 | 254 | 1.1273 | 0.8281 |
| 0.0003 | 128.0 | 256 | 1.1320 | 0.8281 |
| 0.0003 | 129.0 | 258 | 1.1378 | 0.8281 |
| 0.0003 | 130.0 | 260 | 1.1421 | 0.8281 |
| 0.0003 | 131.0 | 262 | 1.1441 | 0.8281 |
| 0.0003 | 132.0 | 264 | 1.1447 | 0.8281 |
| 0.0003 | 133.0 | 266 | 1.1452 | 0.8281 |
| 0.0003 | 134.0 | 268 | 1.1456 | 0.8281 |
| 0.0003 | 135.0 | 270 | 1.1460 | 0.8281 |
| 0.0003 | 136.0 | 272 | 1.1463 | 0.8281 |
| 0.0003 | 137.0 | 274 | 1.1459 | 0.8281 |
| 0.0003 | 138.0 | 276 | 1.1459 | 0.8281 |
| 0.0003 | 139.0 | 278 | 1.1460 | 0.8281 |
| 0.0003 | 140.0 | 280 | 1.1461 | 0.8281 |
| 0.0003 | 141.0 | 282 | 1.1465 | 0.8281 |
| 0.0003 | 142.0 | 284 | 1.1475 | 0.8281 |
| 0.0003 | 143.0 | 286 | 1.1488 | 0.8281 |
| 0.0003 | 144.0 | 288 | 1.1501 | 0.8281 |
| 0.0002 | 145.0 | 290 | 1.1512 | 0.8281 |
| 0.0002 | 146.0 | 292 | 1.1525 | 0.8281 |
| 0.0002 | 147.0 | 294 | 1.1580 | 0.8281 |
| 0.0002 | 148.0 | 296 | 1.1631 | 0.8281 |
| 0.0002 | 149.0 | 298 | 1.1680 | 0.8281 |
| 0.0002 | 150.0 | 300 | 1.1726 | 0.8281 |

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.4.0
- Tokenizers 0.13.3