metadata
license: apache-2.0
base_model: google-bert/bert-base-cased
tags:
  - generated_from_trainer
model-index:
  - name: bert_baseline_language_task3_fold2
    results: []

bert_baseline_language_task3_fold2

This model is a fine-tuned version of google-bert/bert-base-cased on an unspecified dataset. It achieves the following results on the evaluation set (a hypothetical loading sketch follows the list):

  • Loss: 0.4310
  • Qwk (quadratic weighted kappa): 0.6497
  • Mse (mean squared error): 0.4312
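
The snippet below is a minimal, hypothetical sketch of loading this checkpoint for inference with the transformers library. It assumes the checkpoint lives at salbatarni/bert_baseline_language_task3_fold2 (inferred from the model name) and carries a single-output regression head, which would be consistent with the Mse/Qwk metrics above but is not confirmed by this card.

```python
# Hypothetical usage sketch; the repo id and the regression head are assumptions,
# not facts stated by this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "salbatarni/bert_baseline_language_task3_fold2"  # inferred from the model name

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("Example text to score.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# For a single-output regression head, the raw logit is the predicted score.
print(logits.squeeze().item())
```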

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
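
As a rough reproduction aid, these settings map onto transformers.TrainingArguments as sketched below. Only the argument object is built here, since the dataset, model head, and metric code are not documented in this card; output_dir is a placeholder name.

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
# The dataset and metric wiring are not documented in this card and are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_baseline_language_task3_fold2",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```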

Training results

Training Loss Epoch Step Validation Loss Qwk Mse
No log 0.0308 2 0.8667 0.1286 0.8665
No log 0.0615 4 0.6721 0.0 0.6724
No log 0.0923 6 0.5799 0.0 0.5808
No log 0.1231 8 0.5723 0.0 0.5735
No log 0.1538 10 0.5882 0.0 0.5895
No log 0.1846 12 0.7119 -0.0229 0.7129
No log 0.2154 14 0.5587 0.0032 0.5596
No log 0.2462 16 0.5222 0.0 0.5229
No log 0.2769 18 0.5158 0.0 0.5165
No log 0.3077 20 0.4875 0.0 0.4881
No log 0.3385 22 0.4688 0.0096 0.4693
No log 0.3692 24 0.4535 0.0190 0.4541
No log 0.4 26 0.4426 0.0553 0.4432
No log 0.4308 28 0.4235 0.1569 0.4238
No log 0.4615 30 0.4035 0.2237 0.4036
No log 0.4923 32 0.3861 0.3251 0.3860
No log 0.5231 34 0.3859 0.3924 0.3856
No log 0.5538 36 0.4097 0.3681 0.4092
No log 0.5846 38 0.3858 0.4320 0.3853
No log 0.6154 40 0.3829 0.5218 0.3826
No log 0.6462 42 0.4023 0.5843 0.4021
No log 0.6769 44 0.3627 0.5516 0.3623
No log 0.7077 46 0.3448 0.5242 0.3445
No log 0.7385 48 0.3390 0.5750 0.3387
No log 0.7692 50 0.3549 0.6159 0.3548
No log 0.8 52 0.3917 0.6282 0.3917
No log 0.8308 54 0.4120 0.6185 0.4122
No log 0.8615 56 0.3578 0.5993 0.3579
No log 0.8923 58 0.3245 0.5054 0.3244
No log 0.9231 60 0.3458 0.5575 0.3458
No log 0.9538 62 0.4763 0.6243 0.4766
No log 0.9846 64 0.4953 0.6201 0.4955
No log 1.0154 66 0.3697 0.5996 0.3697
No log 1.0462 68 0.3512 0.5810 0.3512
No log 1.0769 70 0.3414 0.5767 0.3413
No log 1.1077 72 0.3922 0.6309 0.3922
No log 1.1385 74 0.4247 0.6202 0.4247
No log 1.1692 76 0.3777 0.6191 0.3776
No log 1.2 78 0.3634 0.5857 0.3633
No log 1.2308 80 0.3498 0.5617 0.3495
No log 1.2615 82 0.3438 0.5763 0.3437
No log 1.2923 84 0.3736 0.6122 0.3736
No log 1.3231 86 0.3695 0.6156 0.3695
No log 1.3538 88 0.3555 0.6014 0.3554
No log 1.3846 90 0.3487 0.5962 0.3486
No log 1.4154 92 0.3489 0.6098 0.3488
No log 1.4462 94 0.3380 0.5808 0.3379
No log 1.4769 96 0.3571 0.6081 0.3572
No log 1.5077 98 0.3722 0.6171 0.3724
No log 1.5385 100 0.4097 0.6365 0.4100
No log 1.5692 102 0.3790 0.6108 0.3794
No log 1.6 104 0.3796 0.6035 0.3801
No log 1.6308 106 0.3557 0.5632 0.3561
No log 1.6615 108 0.3973 0.5962 0.3977
No log 1.6923 110 0.4461 0.6243 0.4465
No log 1.7231 112 0.4083 0.6133 0.4087
No log 1.7538 114 0.3984 0.6157 0.3988
No log 1.7846 116 0.3726 0.6102 0.3730
No log 1.8154 118 0.3767 0.5977 0.3771
No log 1.8462 120 0.4635 0.6273 0.4640
No log 1.8769 122 0.6099 0.6037 0.6106
No log 1.9077 124 0.6275 0.5888 0.6282
No log 1.9385 126 0.5350 0.6223 0.5356
No log 1.9692 128 0.4031 0.6213 0.4034
No log 2.0 130 0.3393 0.5969 0.3394
No log 2.0308 132 0.3287 0.5599 0.3288
No log 2.0615 134 0.3515 0.5897 0.3516
No log 2.0923 136 0.3996 0.6258 0.3997
No log 2.1231 138 0.4641 0.6169 0.4643
No log 2.1538 140 0.4542 0.6158 0.4544
No log 2.1846 142 0.4004 0.6351 0.4005
No log 2.2154 144 0.3401 0.5887 0.3402
No log 2.2462 146 0.3235 0.5328 0.3235
No log 2.2769 148 0.3321 0.5753 0.3321
No log 2.3077 150 0.3804 0.6323 0.3805
No log 2.3385 152 0.4133 0.6239 0.4135
No log 2.3692 154 0.3998 0.6317 0.4000
No log 2.4 156 0.3526 0.6064 0.3527
No log 2.4308 158 0.3562 0.6005 0.3564
No log 2.4615 160 0.3973 0.6415 0.3975
No log 2.4923 162 0.4010 0.6347 0.4012
No log 2.5231 164 0.3810 0.6306 0.3812
No log 2.5538 166 0.3914 0.6271 0.3916
No log 2.5846 168 0.4101 0.6216 0.4104
No log 2.6154 170 0.3935 0.6240 0.3938
No log 2.6462 172 0.3646 0.6070 0.3649
No log 2.6769 174 0.3806 0.6188 0.3809
No log 2.7077 176 0.3976 0.6294 0.3980
No log 2.7385 178 0.3807 0.6188 0.3810
No log 2.7692 180 0.3702 0.6231 0.3705
No log 2.8 182 0.3468 0.5942 0.3469
No log 2.8308 184 0.3611 0.6284 0.3613
No log 2.8615 186 0.3921 0.6324 0.3923
No log 2.8923 188 0.4505 0.6341 0.4508
No log 2.9231 190 0.5436 0.6130 0.5441
No log 2.9538 192 0.5516 0.5987 0.5521
No log 2.9846 194 0.4740 0.6272 0.4744
No log 3.0154 196 0.3665 0.6255 0.3667
No log 3.0462 198 0.3328 0.5737 0.3329
No log 3.0769 200 0.3368 0.5899 0.3369
No log 3.1077 202 0.3630 0.6250 0.3631
No log 3.1385 204 0.4167 0.6402 0.4169
No log 3.1692 206 0.4774 0.6409 0.4776
No log 3.2 208 0.4657 0.6437 0.4659
No log 3.2308 210 0.4932 0.6426 0.4933
No log 3.2615 212 0.4736 0.6453 0.4737
No log 3.2923 214 0.4233 0.6436 0.4234
No log 3.3231 216 0.4093 0.6436 0.4094
No log 3.3538 218 0.4126 0.6488 0.4127
No log 3.3846 220 0.4110 0.6481 0.4110
No log 3.4154 222 0.4596 0.6397 0.4597
No log 3.4462 224 0.4659 0.6426 0.4661
No log 3.4769 226 0.4215 0.6391 0.4216
No log 3.5077 228 0.3766 0.6305 0.3766
No log 3.5385 230 0.3600 0.6084 0.3600
No log 3.5692 232 0.3712 0.6256 0.3712
No log 3.6 234 0.3671 0.6171 0.3671
No log 3.6308 236 0.3682 0.6262 0.3683
No log 3.6615 238 0.3608 0.6184 0.3609
No log 3.6923 240 0.3834 0.6317 0.3835
No log 3.7231 242 0.4443 0.6511 0.4445
No log 3.7538 244 0.4779 0.6435 0.4781
No log 3.7846 246 0.4705 0.6442 0.4708
No log 3.8154 248 0.4286 0.6526 0.4288
No log 3.8462 250 0.3918 0.6387 0.3920
No log 3.8769 252 0.3711 0.6330 0.3712
No log 3.9077 254 0.3700 0.6284 0.3701
No log 3.9385 256 0.3633 0.6233 0.3633
No log 3.9692 258 0.3682 0.6329 0.3683
No log 4.0 260 0.3658 0.6262 0.3658
No log 4.0308 262 0.3922 0.6462 0.3923
No log 4.0615 264 0.4182 0.6565 0.4183
No log 4.0923 266 0.4467 0.6587 0.4468
No log 4.1231 268 0.4401 0.6663 0.4403
No log 4.1538 270 0.4293 0.6641 0.4294
No log 4.1846 272 0.4237 0.6599 0.4238
No log 4.2154 274 0.4467 0.6677 0.4468
No log 4.2462 276 0.4703 0.6726 0.4705
No log 4.2769 278 0.4826 0.6690 0.4828
No log 4.3077 280 0.4852 0.6654 0.4853
No log 4.3385 282 0.4567 0.6699 0.4568
No log 4.3692 284 0.4361 0.6591 0.4361
No log 4.4 286 0.4165 0.6535 0.4165
No log 4.4308 288 0.4167 0.6511 0.4168
No log 4.4615 290 0.4213 0.6524 0.4213
No log 4.4923 292 0.4168 0.6545 0.4168
No log 4.5231 294 0.4303 0.6461 0.4304
No log 4.5538 296 0.4488 0.6557 0.4490
No log 4.5846 298 0.4573 0.6534 0.4574
No log 4.6154 300 0.4809 0.6570 0.4811
No log 4.6462 302 0.4878 0.6538 0.4881
No log 4.6769 304 0.4771 0.6499 0.4773
No log 4.7077 306 0.4567 0.6516 0.4569
No log 4.7385 308 0.4453 0.6494 0.4455
No log 4.7692 310 0.4356 0.6487 0.4358
No log 4.8 312 0.4348 0.6465 0.4349
No log 4.8308 314 0.4349 0.6465 0.4351
No log 4.8615 316 0.4350 0.6458 0.4352
No log 4.8923 318 0.4341 0.6497 0.4343
No log 4.9231 320 0.4312 0.6504 0.4313
No log 4.9538 322 0.4310 0.6504 0.4312
No log 4.9846 324 0.4310 0.6497 0.4312
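
The card does not say how Qwk and Mse were computed. A common choice for essay-scoring style tasks, sketched below on hypothetical data, is scikit-learn's quadratic weighted Cohen's kappa on rounded predictions together with mean squared error; this is an assumption, not a description of the actual evaluation code.

```python
# Assumed metric definitions (not taken from this card): quadratic weighted kappa
# on rounded predictions, plus plain MSE, via scikit-learn.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 1, 0])            # hypothetical gold scores
y_pred = np.array([0.2, 1.1, 1.8, 0.9, 0.4])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Mse: {mse:.4f}  Qwk: {qwk:.4f}")
```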

Framework versions

  • Transformers 4.42.3
  • PyTorch 2.1.2
  • Datasets 2.20.0
  • Tokenizers 0.19.1
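
To reproduce results against the versions listed above, a quick runtime check is sketched below; it only prints the installed versions next to the ones the card lists.

```python
# Compare installed package versions against the ones listed in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.42.3",
    "torch": "2.1.2",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]}, card lists {want}")
```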