---
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: best_model-sst-2-32-42
  results: []
---

# best_model-sst-2-32-42

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset (the repository name suggests a small SST-2 subset, but the card does not say). It achieves the following results on the evaluation set:

- Loss: 1.2575
- Accuracy: 0.8281
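
As a quick start, here is a minimal inference sketch using the `transformers` pipeline API. The repo id `simonycl/best_model-sst-2-32-42` is assumed from this card's owner and name, and the example sentences are illustrative, not from the card:

```python
from transformers import pipeline

# Repo id assumed from this card's owner and name; adjust if the
# checkpoint lives elsewhere.
classifier = pipeline(
    "text-classification",
    model="simonycl/best_model-sst-2-32-42",
)

# SST-2-style binary sentiment inputs. Labels may appear as
# LABEL_0/LABEL_1 unless id2label was set in the model config.
print(classifier("a gripping, beautifully shot film"))
print(classifier("the plot never rises above tedious"))
```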

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 150
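
For reference, a minimal sketch of how the list above maps onto `transformers.TrainingArguments`; the output directory is a placeholder, and the dataset and `Trainer` wiring are not specified by this card:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above; pass this
# to a Trainer together with the model, tokenizer, and datasets.
training_args = TrainingArguments(
    output_dir="best_model-sst-2-32-42",  # placeholder path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=150,
)
```

Note that, per the results table below, each epoch is 2 optimization steps, so the full run is 300 steps. That is fewer than the 500 warmup steps, so the learning rate ramps linearly for the entire run and never reaches the configured peak of 1e-05.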

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 1.2950 | 0.8281 |
| No log | 2.0 | 4 | 1.2965 | 0.8281 |
| No log | 3.0 | 6 | 1.2971 | 0.8281 |
| No log | 4.0 | 8 | 1.2972 | 0.8281 |
| 0.3346 | 5.0 | 10 | 1.2994 | 0.8281 |
| 0.3346 | 6.0 | 12 | 1.3037 | 0.8281 |
| 0.3346 | 7.0 | 14 | 1.3082 | 0.8281 |
| 0.3346 | 8.0 | 16 | 1.3140 | 0.8281 |
| 0.3346 | 9.0 | 18 | 1.3212 | 0.8281 |
| 0.2586 | 10.0 | 20 | 1.3285 | 0.8281 |
| 0.2586 | 11.0 | 22 | 1.3346 | 0.8281 |
| 0.2586 | 12.0 | 24 | 1.3404 | 0.8281 |
| 0.2586 | 13.0 | 26 | 1.3443 | 0.8281 |
| 0.2586 | 14.0 | 28 | 1.3499 | 0.8281 |
| 0.2171 | 15.0 | 30 | 1.3534 | 0.8281 |
| 0.2171 | 16.0 | 32 | 1.3551 | 0.8281 |
| 0.2171 | 17.0 | 34 | 1.3544 | 0.8281 |
| 0.2171 | 18.0 | 36 | 1.3531 | 0.8281 |
| 0.2171 | 19.0 | 38 | 1.3516 | 0.8281 |
| 0.1549 | 20.0 | 40 | 1.3494 | 0.8281 |
| 0.1549 | 21.0 | 42 | 1.3471 | 0.8281 |
| 0.1549 | 22.0 | 44 | 1.3446 | 0.8281 |
| 0.1549 | 23.0 | 46 | 1.3414 | 0.8281 |
| 0.1549 | 24.0 | 48 | 1.3351 | 0.8281 |
| 0.0613 | 25.0 | 50 | 1.3277 | 0.8281 |
| 0.0613 | 26.0 | 52 | 1.3201 | 0.8281 |
| 0.0613 | 27.0 | 54 | 1.3110 | 0.8281 |
| 0.0613 | 28.0 | 56 | 1.2974 | 0.8281 |
| 0.0613 | 29.0 | 58 | 1.2847 | 0.8281 |
| 0.0094 | 30.0 | 60 | 1.2767 | 0.8281 |
| 0.0094 | 31.0 | 62 | 1.2697 | 0.8281 |
| 0.0094 | 32.0 | 64 | 1.2638 | 0.8281 |
| 0.0094 | 33.0 | 66 | 1.2625 | 0.8281 |
| 0.0094 | 34.0 | 68 | 1.2633 | 0.8281 |
| 0.0004 | 35.0 | 70 | 1.2642 | 0.8281 |
| 0.0004 | 36.0 | 72 | 1.2757 | 0.8281 |
| 0.0004 | 37.0 | 74 | 1.2783 | 0.8281 |
| 0.0004 | 38.0 | 76 | 1.2813 | 0.8281 |
| 0.0004 | 39.0 | 78 | 1.2892 | 0.8281 |
| 0.0074 | 40.0 | 80 | 1.2990 | 0.8281 |
| 0.0074 | 41.0 | 82 | 1.3111 | 0.8281 |
| 0.0074 | 42.0 | 84 | 1.3233 | 0.8281 |
| 0.0074 | 43.0 | 86 | 1.3317 | 0.8281 |
| 0.0074 | 44.0 | 88 | 1.3371 | 0.8281 |
| 0.0004 | 45.0 | 90 | 1.3410 | 0.8281 |
| 0.0004 | 46.0 | 92 | 1.3436 | 0.8281 |
| 0.0004 | 47.0 | 94 | 1.3456 | 0.8281 |
| 0.0004 | 48.0 | 96 | 1.3471 | 0.8281 |
| 0.0004 | 49.0 | 98 | 1.3489 | 0.8281 |
| 0.0005 | 50.0 | 100 | 1.3488 | 0.8281 |
| 0.0005 | 51.0 | 102 | 1.3429 | 0.8281 |
| 0.0005 | 52.0 | 104 | 1.3365 | 0.8281 |
| 0.0005 | 53.0 | 106 | 1.3305 | 0.8281 |
| 0.0005 | 54.0 | 108 | 1.3247 | 0.8281 |
| 0.0003 | 55.0 | 110 | 1.3195 | 0.8281 |
| 0.0003 | 56.0 | 112 | 1.3151 | 0.8281 |
| 0.0003 | 57.0 | 114 | 1.2921 | 0.8281 |
| 0.0003 | 58.0 | 116 | 1.2717 | 0.8281 |
| 0.0003 | 59.0 | 118 | 1.2551 | 0.8281 |
| 0.0166 | 60.0 | 120 | 1.2421 | 0.8281 |
| 0.0166 | 61.0 | 122 | 1.2590 | 0.8281 |
| 0.0166 | 62.0 | 124 | 1.2739 | 0.8281 |
| 0.0166 | 63.0 | 126 | 1.2861 | 0.8281 |
| 0.0166 | 64.0 | 128 | 1.2958 | 0.8281 |
| 0.0003 | 65.0 | 130 | 1.3039 | 0.8281 |
| 0.0003 | 66.0 | 132 | 1.3103 | 0.8281 |
| 0.0003 | 67.0 | 134 | 1.3126 | 0.8281 |
| 0.0003 | 68.0 | 136 | 1.3125 | 0.8281 |
| 0.0003 | 69.0 | 138 | 1.3125 | 0.8281 |
| 0.0002 | 70.0 | 140 | 1.3128 | 0.8281 |
| 0.0002 | 71.0 | 142 | 1.3131 | 0.8281 |
| 0.0002 | 72.0 | 144 | 1.3135 | 0.8281 |
| 0.0002 | 73.0 | 146 | 1.3141 | 0.8281 |
| 0.0002 | 74.0 | 148 | 1.3147 | 0.8281 |
| 0.0004 | 75.0 | 150 | 1.3289 | 0.8281 |
| 0.0004 | 76.0 | 152 | 1.3274 | 0.8281 |
| 0.0004 | 77.0 | 154 | 1.3260 | 0.8281 |
| 0.0004 | 78.0 | 156 | 1.3251 | 0.8281 |
| 0.0004 | 79.0 | 158 | 1.3523 | 0.8281 |
| 0.0008 | 80.0 | 160 | 1.3691 | 0.8281 |
| 0.0008 | 81.0 | 162 | 1.3789 | 0.8281 |
| 0.0008 | 82.0 | 164 | 1.3844 | 0.8281 |
| 0.0008 | 83.0 | 166 | 1.3873 | 0.8281 |
| 0.0008 | 84.0 | 168 | 1.3885 | 0.8281 |
| 0.0002 | 85.0 | 170 | 1.3889 | 0.8281 |
| 0.0002 | 86.0 | 172 | 1.3889 | 0.8281 |
| 0.0002 | 87.0 | 174 | 1.3888 | 0.8281 |
| 0.0002 | 88.0 | 176 | 1.3888 | 0.8281 |
| 0.0002 | 89.0 | 178 | 1.3890 | 0.8281 |
| 0.0002 | 90.0 | 180 | 1.3893 | 0.8281 |
| 0.0002 | 91.0 | 182 | 1.3898 | 0.8281 |
| 0.0002 | 92.0 | 184 | 1.3905 | 0.8281 |
| 0.0002 | 93.0 | 186 | 1.3913 | 0.8281 |
| 0.0002 | 94.0 | 188 | 1.3927 | 0.8281 |
| 0.0002 | 95.0 | 190 | 1.3938 | 0.8281 |
| 0.0002 | 96.0 | 192 | 1.3947 | 0.8281 |
| 0.0002 | 97.0 | 194 | 1.3954 | 0.8281 |
| 0.0002 | 98.0 | 196 | 1.3960 | 0.8281 |
| 0.0002 | 99.0 | 198 | 1.3967 | 0.8281 |
| 0.0002 | 100.0 | 200 | 1.3975 | 0.8281 |
| 0.0002 | 101.0 | 202 | 1.3984 | 0.8281 |
| 0.0002 | 102.0 | 204 | 1.3993 | 0.8281 |
| 0.0002 | 103.0 | 206 | 1.4001 | 0.8281 |
| 0.0002 | 104.0 | 208 | 1.4008 | 0.8281 |
| 0.0001 | 105.0 | 210 | 1.4014 | 0.8281 |
| 0.0001 | 106.0 | 212 | 1.4020 | 0.8281 |
| 0.0001 | 107.0 | 214 | 1.4026 | 0.8281 |
| 0.0001 | 108.0 | 216 | 1.4030 | 0.8281 |
| 0.0001 | 109.0 | 218 | 1.4035 | 0.8281 |
| 0.0001 | 110.0 | 220 | 1.4040 | 0.8281 |
| 0.0001 | 111.0 | 222 | 1.4046 | 0.8281 |
| 0.0001 | 112.0 | 224 | 1.4051 | 0.8281 |
| 0.0001 | 113.0 | 226 | 1.4057 | 0.8281 |
| 0.0001 | 114.0 | 228 | 1.4064 | 0.8281 |
| 0.0001 | 115.0 | 230 | 1.4071 | 0.8281 |
| 0.0001 | 116.0 | 232 | 1.4078 | 0.8281 |
| 0.0001 | 117.0 | 234 | 1.4085 | 0.8281 |
| 0.0001 | 118.0 | 236 | 1.4092 | 0.8281 |
| 0.0001 | 119.0 | 238 | 1.4099 | 0.8281 |
| 0.0001 | 120.0 | 240 | 1.4106 | 0.8281 |
| 0.0001 | 121.0 | 242 | 1.4108 | 0.8281 |
| 0.0001 | 122.0 | 244 | 1.4081 | 0.8281 |
| 0.0001 | 123.0 | 246 | 1.4055 | 0.8281 |
| 0.0001 | 124.0 | 248 | 1.4032 | 0.8281 |
| 0.0001 | 125.0 | 250 | 1.4011 | 0.8281 |
| 0.0001 | 126.0 | 252 | 1.3995 | 0.8281 |
| 0.0001 | 127.0 | 254 | 1.3982 | 0.8281 |
| 0.0001 | 128.0 | 256 | 1.3973 | 0.8281 |
| 0.0001 | 129.0 | 258 | 1.3967 | 0.8281 |
| 0.0001 | 130.0 | 260 | 1.3963 | 0.8281 |
| 0.0001 | 131.0 | 262 | 1.3962 | 0.8281 |
| 0.0001 | 132.0 | 264 | 1.3962 | 0.8281 |
| 0.0001 | 133.0 | 266 | 1.3965 | 0.8281 |
| 0.0001 | 134.0 | 268 | 1.3970 | 0.8281 |
| 0.0001 | 135.0 | 270 | 1.3989 | 0.8281 |
| 0.0001 | 136.0 | 272 | 1.4012 | 0.8281 |
| 0.0001 | 137.0 | 274 | 1.4035 | 0.8281 |
| 0.0001 | 138.0 | 276 | 1.4052 | 0.8281 |
| 0.0001 | 139.0 | 278 | 1.4064 | 0.8281 |
| 0.0002 | 140.0 | 280 | 1.3703 | 0.8281 |
| 0.0002 | 141.0 | 282 | 1.2995 | 0.8438 |
| 0.0002 | 142.0 | 284 | 1.2572 | 0.8281 |
| 0.0002 | 143.0 | 286 | 1.2224 | 0.8281 |
| 0.0002 | 144.0 | 288 | 1.2120 | 0.8438 |
| 0.0001 | 145.0 | 290 | 1.2242 | 0.8281 |
| 0.0001 | 146.0 | 292 | 1.2377 | 0.8281 |
| 0.0001 | 147.0 | 294 | 1.2477 | 0.8281 |
| 0.0001 | 148.0 | 296 | 1.2542 | 0.8281 |
| 0.0001 | 149.0 | 298 | 1.2575 | 0.8281 |
| 0.0002 | 150.0 | 300 | 1.2575 | 0.8281 |

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.4.0
- Tokenizers 0.13.3