---
license: apache-2.0
base_model: google-bert/bert-base-cased
tags:
  - generated_from_trainer
model-index:
  - name: bert_baseline_language_task3_fold1
    results: []
---

# bert_baseline_language_task3_fold1

This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.3495
- Qwk: 0.7465 (quadratic weighted kappa; a computation sketch follows this list)
- Mse: 0.3495
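
Qwk here denotes quadratic weighted kappa, the standard agreement metric for ordinal scoring tasks. The card does not document how labels were binned, so the following is only a minimal sketch of how such a value can be computed with scikit-learn; the scores shown are hypothetical.

```python
# Minimal sketch: quadratic weighted kappa (QWK) via scikit-learn.
# The gold and predicted scores below are hypothetical; the actual
# label binning used for this model is not documented in the card.
from sklearn.metrics import cohen_kappa_score

y_true = [0, 1, 2, 2, 3]  # hypothetical gold ordinal scores
y_pred = [0, 1, 1, 2, 3]  # hypothetical predictions, rounded to integers

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
print(f"QWK: {qwk:.4f}")
```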

## Model description

More information needed

## Intended uses & limitations

More information needed
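
Until the author fills this section in, the snippet below is only a minimal inference sketch, not documented usage. The repo id `salbatarni/bert_baseline_language_task3_fold1` and the single-logit regression head (suggested by the MSE/QWK metrics) are assumptions.

```python
# Minimal inference sketch; the repo id and the assumption of a
# single-logit regression head are NOT confirmed by the model card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "salbatarni/bert_baseline_language_task3_fold1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("Example text to score.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # assumed shape (1, 1): a single score
print(logits.squeeze().item())
```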

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
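
For reference, the list above maps onto Hugging Face `TrainingArguments` roughly as follows. This is a sketch, not the author's training script: `output_dir` and every setting not listed above are assumptions, and the `Trainer` default optimizer is AdamW, which these auto-generated cards report as "Adam".

```python
# Sketch of the reported hyperparameters as TrainingArguments.
# output_dir and any setting not listed in the card are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_baseline_language_task3_fold1",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    # Optimizer: Trainer's default AdamW with betas=(0.9, 0.999), eps=1e-08.
)
```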

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk    | Mse    |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| No log        | 0.0308 | 2    | 0.9241          | 0.0973 | 0.9241 |
| No log        | 0.0615 | 4    | 0.7692          | 0.0172 | 0.7692 |
| No log        | 0.0923 | 6    | 0.6906          | 0.0    | 0.6906 |
| No log        | 0.1231 | 8    | 0.6645          | 0.0    | 0.6645 |
| No log        | 0.1538 | 10   | 0.6742          | 0.0    | 0.6742 |
| No log        | 0.1846 | 12   | 0.6600          | 0.0097 | 0.6600 |
| No log        | 0.2154 | 14   | 0.6105          | 0.1283 | 0.6105 |
| No log        | 0.2462 | 16   | 0.5742          | 0.0989 | 0.5742 |
| No log        | 0.2769 | 18   | 0.5379          | 0.2188 | 0.5379 |
| No log        | 0.3077 | 20   | 0.5496          | 0.3968 | 0.5496 |
| No log        | 0.3385 | 22   | 0.5728          | 0.4810 | 0.5728 |
| No log        | 0.3692 | 24   | 0.4980          | 0.5887 | 0.4980 |
| No log        | 0.4    | 26   | 0.4483          | 0.5874 | 0.4483 |
| No log        | 0.4308 | 28   | 0.4423          | 0.5669 | 0.4423 |
| No log        | 0.4615 | 30   | 0.4260          | 0.5824 | 0.4260 |
| No log        | 0.4923 | 32   | 0.4285          | 0.6024 | 0.4285 |
| No log        | 0.5231 | 34   | 0.4691          | 0.6564 | 0.4691 |
| No log        | 0.5538 | 36   | 0.4531          | 0.6363 | 0.4531 |
| No log        | 0.5846 | 38   | 0.4242          | 0.5254 | 0.4242 |
| No log        | 0.6154 | 40   | 0.4281          | 0.3450 | 0.4281 |
| No log        | 0.6462 | 42   | 0.4759          | 0.3197 | 0.4759 |
| No log        | 0.6769 | 44   | 0.5171          | 0.3390 | 0.5171 |
| No log        | 0.7077 | 46   | 0.4740          | 0.3374 | 0.4740 |
| No log        | 0.7385 | 48   | 0.4149          | 0.3145 | 0.4149 |
| No log        | 0.7692 | 50   | 0.4579          | 0.6005 | 0.4579 |
| No log        | 0.8    | 52   | 0.5321          | 0.6699 | 0.5321 |
| No log        | 0.8308 | 54   | 0.5278          | 0.6732 | 0.5278 |
| No log        | 0.8615 | 56   | 0.4365          | 0.6723 | 0.4365 |
| No log        | 0.8923 | 58   | 0.3708          | 0.5577 | 0.3708 |
| No log        | 0.9231 | 60   | 0.3647          | 0.4893 | 0.3647 |
| No log        | 0.9538 | 62   | 0.3586          | 0.5214 | 0.3586 |
| No log        | 0.9846 | 64   | 0.3454          | 0.5466 | 0.3454 |
| No log        | 1.0154 | 66   | 0.3349          | 0.6349 | 0.3349 |
| No log        | 1.0462 | 68   | 0.3784          | 0.6895 | 0.3784 |
| No log        | 1.0769 | 70   | 0.3880          | 0.6963 | 0.3880 |
| No log        | 1.1077 | 72   | 0.3685          | 0.6724 | 0.3685 |
| No log        | 1.1385 | 74   | 0.3318          | 0.6633 | 0.3318 |
| No log        | 1.1692 | 76   | 0.3129          | 0.5935 | 0.3129 |
| No log        | 1.2    | 78   | 0.3051          | 0.6429 | 0.3051 |
| No log        | 1.2308 | 80   | 0.3128          | 0.6913 | 0.3128 |
| No log        | 1.2615 | 82   | 0.4495          | 0.7299 | 0.4495 |
| No log        | 1.2923 | 84   | 0.6028          | 0.7010 | 0.6028 |
| No log        | 1.3231 | 86   | 0.4935          | 0.7148 | 0.4935 |
| No log        | 1.3538 | 88   | 0.3399          | 0.6230 | 0.3399 |
| No log        | 1.3846 | 90   | 0.3273          | 0.5044 | 0.3273 |
| No log        | 1.4154 | 92   | 0.3266          | 0.4930 | 0.3266 |
| No log        | 1.4462 | 94   | 0.3575          | 0.6291 | 0.3575 |
| No log        | 1.4769 | 96   | 0.4728          | 0.7079 | 0.4728 |
| No log        | 1.5077 | 98   | 0.5130          | 0.7288 | 0.5130 |
| No log        | 1.5385 | 100  | 0.4588          | 0.7313 | 0.4588 |
| No log        | 1.5692 | 102  | 0.3353          | 0.7112 | 0.3353 |
| No log        | 1.6    | 104  | 0.2876          | 0.5570 | 0.2876 |
| No log        | 1.6308 | 106  | 0.3135          | 0.4982 | 0.3135 |
| No log        | 1.6615 | 108  | 0.2884          | 0.5642 | 0.2884 |
| No log        | 1.6923 | 110  | 0.3044          | 0.7035 | 0.3044 |
| No log        | 1.7231 | 112  | 0.3390          | 0.7189 | 0.3390 |
| No log        | 1.7538 | 114  | 0.3387          | 0.7189 | 0.3387 |
| No log        | 1.7846 | 116  | 0.3060          | 0.7181 | 0.3060 |
| No log        | 1.8154 | 118  | 0.2881          | 0.6756 | 0.2881 |
| No log        | 1.8462 | 120  | 0.2881          | 0.6663 | 0.2881 |
| No log        | 1.8769 | 122  | 0.3084          | 0.6906 | 0.3084 |
| No log        | 1.9077 | 124  | 0.3555          | 0.7381 | 0.3555 |
| No log        | 1.9385 | 126  | 0.3414          | 0.7463 | 0.3414 |
| No log        | 1.9692 | 128  | 0.2947          | 0.6779 | 0.2947 |
| No log        | 2.0    | 130  | 0.2831          | 0.5537 | 0.2831 |
| No log        | 2.0308 | 132  | 0.2853          | 0.5515 | 0.2853 |
| No log        | 2.0615 | 134  | 0.2808          | 0.6274 | 0.2808 |
| No log        | 2.0923 | 136  | 0.2888          | 0.6663 | 0.2888 |
| No log        | 2.1231 | 138  | 0.2990          | 0.6887 | 0.2990 |
| No log        | 2.1538 | 140  | 0.3005          | 0.6804 | 0.3005 |
| No log        | 2.1846 | 142  | 0.2969          | 0.6593 | 0.2969 |
| No log        | 2.2154 | 144  | 0.3092          | 0.6859 | 0.3092 |
| No log        | 2.2462 | 146  | 0.3575          | 0.7172 | 0.3575 |
| No log        | 2.2769 | 148  | 0.5047          | 0.7428 | 0.5047 |
| No log        | 2.3077 | 150  | 0.5975          | 0.7277 | 0.5975 |
| No log        | 2.3385 | 152  | 0.5210          | 0.7478 | 0.5210 |
| No log        | 2.3692 | 154  | 0.3886          | 0.7241 | 0.3886 |
| No log        | 2.4    | 156  | 0.3019          | 0.6758 | 0.3019 |
| No log        | 2.4308 | 158  | 0.2917          | 0.6260 | 0.2917 |
| No log        | 2.4615 | 160  | 0.3031          | 0.6687 | 0.3031 |
| No log        | 2.4923 | 162  | 0.3197          | 0.6870 | 0.3197 |
| No log        | 2.5231 | 164  | 0.3623          | 0.7384 | 0.3623 |
| No log        | 2.5538 | 166  | 0.3509          | 0.7409 | 0.3509 |
| No log        | 2.5846 | 168  | 0.3104          | 0.7006 | 0.3104 |
| No log        | 2.6154 | 170  | 0.2990          | 0.6717 | 0.2990 |
| No log        | 2.6462 | 172  | 0.3004          | 0.6883 | 0.3004 |
| No log        | 2.6769 | 174  | 0.3242          | 0.7166 | 0.3242 |
| No log        | 2.7077 | 176  | 0.3344          | 0.7123 | 0.3344 |
| No log        | 2.7385 | 178  | 0.3443          | 0.7146 | 0.3443 |
| No log        | 2.7692 | 180  | 0.3725          | 0.7165 | 0.3725 |
| No log        | 2.8    | 182  | 0.3582          | 0.7229 | 0.3582 |
| No log        | 2.8308 | 184  | 0.3007          | 0.6950 | 0.3007 |
| No log        | 2.8615 | 186  | 0.2832          | 0.6327 | 0.2832 |
| No log        | 2.8923 | 188  | 0.2833          | 0.6634 | 0.2833 |
| No log        | 2.9231 | 190  | 0.3170          | 0.7161 | 0.3170 |
| No log        | 2.9538 | 192  | 0.3658          | 0.7354 | 0.3658 |
| No log        | 2.9846 | 194  | 0.4421          | 0.7460 | 0.4421 |
| No log        | 3.0154 | 196  | 0.4628          | 0.7556 | 0.4628 |
| No log        | 3.0462 | 198  | 0.4077          | 0.7401 | 0.4077 |
| No log        | 3.0769 | 200  | 0.3241          | 0.7265 | 0.3241 |
| No log        | 3.1077 | 202  | 0.2805          | 0.6529 | 0.2805 |
| No log        | 3.1385 | 204  | 0.2775          | 0.6296 | 0.2775 |
| No log        | 3.1692 | 206  | 0.2868          | 0.6576 | 0.2868 |
| No log        | 3.2    | 208  | 0.3327          | 0.7343 | 0.3327 |
| No log        | 3.2308 | 210  | 0.3808          | 0.7450 | 0.3808 |
| No log        | 3.2615 | 212  | 0.3902          | 0.7446 | 0.3902 |
| No log        | 3.2923 | 214  | 0.3556          | 0.7431 | 0.3556 |
| No log        | 3.3231 | 216  | 0.3444          | 0.7378 | 0.3444 |
| No log        | 3.3538 | 218  | 0.3067          | 0.7044 | 0.3067 |
| No log        | 3.3846 | 220  | 0.2869          | 0.6498 | 0.2869 |
| No log        | 3.4154 | 222  | 0.2910          | 0.6745 | 0.2910 |
| No log        | 3.4462 | 224  | 0.3184          | 0.7196 | 0.3184 |
| No log        | 3.4769 | 226  | 0.3408          | 0.7263 | 0.3408 |
| No log        | 3.5077 | 228  | 0.3229          | 0.7265 | 0.3229 |
| No log        | 3.5385 | 230  | 0.3237          | 0.7234 | 0.3237 |
| No log        | 3.5692 | 232  | 0.3386          | 0.7248 | 0.3386 |
| No log        | 3.6    | 234  | 0.3175          | 0.7088 | 0.3175 |
| No log        | 3.6308 | 236  | 0.3032          | 0.6863 | 0.3032 |
| No log        | 3.6615 | 238  | 0.3059          | 0.6853 | 0.3059 |
| No log        | 3.6923 | 240  | 0.3283          | 0.7173 | 0.3283 |
| No log        | 3.7231 | 242  | 0.3330          | 0.7226 | 0.3330 |
| No log        | 3.7538 | 244  | 0.3505          | 0.7378 | 0.3505 |
| No log        | 3.7846 | 246  | 0.3617          | 0.7467 | 0.3617 |
| No log        | 3.8154 | 248  | 0.3791          | 0.7474 | 0.3791 |
| No log        | 3.8462 | 250  | 0.3584          | 0.7476 | 0.3584 |
| No log        | 3.8769 | 252  | 0.3236          | 0.7291 | 0.3236 |
| No log        | 3.9077 | 254  | 0.3140          | 0.7169 | 0.3140 |
| No log        | 3.9385 | 256  | 0.3216          | 0.7300 | 0.3216 |
| No log        | 3.9692 | 258  | 0.3348          | 0.7316 | 0.3348 |
| No log        | 4.0    | 260  | 0.3488          | 0.7524 | 0.3488 |
| No log        | 4.0308 | 262  | 0.3508          | 0.7556 | 0.3508 |
| No log        | 4.0615 | 264  | 0.3401          | 0.7409 | 0.3401 |
| No log        | 4.0923 | 266  | 0.3341          | 0.7408 | 0.3341 |
| No log        | 4.1231 | 268  | 0.3301          | 0.7397 | 0.3301 |
| No log        | 4.1538 | 270  | 0.3245          | 0.7386 | 0.3245 |
| No log        | 4.1846 | 272  | 0.3197          | 0.7314 | 0.3197 |
| No log        | 4.2154 | 274  | 0.3080          | 0.7152 | 0.3080 |
| No log        | 4.2462 | 276  | 0.2973          | 0.6889 | 0.2973 |
| No log        | 4.2769 | 278  | 0.2955          | 0.6828 | 0.2955 |
| No log        | 4.3077 | 280  | 0.3032          | 0.7075 | 0.3032 |
| No log        | 4.3385 | 282  | 0.3200          | 0.7297 | 0.3200 |
| No log        | 4.3692 | 284  | 0.3547          | 0.7527 | 0.3547 |
| No log        | 4.4    | 286  | 0.3780          | 0.7524 | 0.3780 |
| No log        | 4.4308 | 288  | 0.3775          | 0.7532 | 0.3775 |
| No log        | 4.4615 | 290  | 0.3563          | 0.7478 | 0.3563 |
| No log        | 4.4923 | 292  | 0.3411          | 0.7414 | 0.3411 |
| No log        | 4.5231 | 294  | 0.3259          | 0.7389 | 0.3259 |
| No log        | 4.5538 | 296  | 0.3079          | 0.7092 | 0.3079 |
| No log        | 4.5846 | 298  | 0.2988          | 0.6886 | 0.2988 |
| No log        | 4.6154 | 300  | 0.2977          | 0.6884 | 0.2977 |
| No log        | 4.6462 | 302  | 0.3039          | 0.6961 | 0.3039 |
| No log        | 4.6769 | 304  | 0.3155          | 0.7221 | 0.3155 |
| No log        | 4.7077 | 306  | 0.3336          | 0.7411 | 0.3336 |
| No log        | 4.7385 | 308  | 0.3505          | 0.7475 | 0.3505 |
| No log        | 4.7692 | 310  | 0.3624          | 0.7538 | 0.3624 |
| No log        | 4.8    | 312  | 0.3655          | 0.7510 | 0.3655 |
| No log        | 4.8308 | 314  | 0.3611          | 0.7510 | 0.3611 |
| No log        | 4.8615 | 316  | 0.3557          | 0.7507 | 0.3557 |
| No log        | 4.8923 | 318  | 0.3523          | 0.7467 | 0.3523 |
| No log        | 4.9231 | 320  | 0.3497          | 0.7465 | 0.3497 |
| No log        | 4.9538 | 322  | 0.3499          | 0.7465 | 0.3499 |
| No log        | 4.9846 | 324  | 0.3495          | 0.7465 | 0.3495 |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1