---
license: apache-2.0
base_model: google-bert/bert-base-cased
tags:
  - generated_from_trainer
model-index:
  - name: bert_baseline_language_task6_fold4
    results: []
---

# bert_baseline_language_task6_fold4

This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

- Loss: 0.4886
- Qwk (quadratic weighted kappa): 0.7157
- Mse (mean squared error): 0.4886
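
Loss and Mse coincide, which suggests the model was trained as a regressor with a mean-squared-error objective, with Qwk computed between (presumably rounded) predictions and gold scores. A minimal sketch of how these metrics can be computed with scikit-learn; the exact metric code used in training is not included in this card:

```python
# Hedged sketch: the training script's metric code is not shown in this card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def qwk_and_mse(y_true, y_pred):
    """Quadratic weighted kappa (on rounded scores) and MSE for regression outputs."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    qwk = cohen_kappa_score(
        y_true.round().astype(int),
        y_pred.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mean_squared_error(y_true, y_pred)}

print(qwk_and_mse([0, 1, 2, 3], [0.1, 1.2, 1.8, 2.6]))
```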

## Model description

More information needed

## Intended uses & limitations

More information needed
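
No usage guidance is provided by the author. As a minimal sketch, the model can presumably be loaded through the standard transformers sequence-classification API; the Hub id below is inferred from this card's name, and the single-output regression head is an assumption based on Loss equalling Mse above:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hub id inferred from the card name; adjust if the actual repository differs.
model_id = "salbatarni/bert_baseline_language_task6_fold4"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("An example text to score.", return_tensors="pt", truncation=True)
# Assumes a single-output regression head (num_labels=1), consistent with Loss == Mse.
score = model(**inputs).logits.squeeze().item()
print(score)
```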

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a Trainer configuration sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
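
A rough reconstruction of how these settings map onto the transformers Trainer API, for illustration only; the original training script, dataset wiring, and metric function are not part of this card, and the steps-based evaluation interval is inferred from the results table below:

```python
from transformers import Trainer, TrainingArguments

# Reconstruction of the reported hyperparameters; not the author's original script.
args = TrainingArguments(
    output_dir="bert_baseline_language_task6_fold4",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    eval_strategy="steps",  # the results table shows an evaluation every 2 steps
    eval_steps=2,
)

# Wiring assumed, since the dataset is unspecified:
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics_fn)
# trainer.train()
```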

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk     | Mse    |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
| No log        | 0.0294 | 2    | 2.2411          | 0.0     | 2.2411 |
| No log        | 0.0588 | 4    | 1.8734          | -0.0178 | 1.8734 |
| No log        | 0.0882 | 6    | 1.6090          | 0.0059  | 1.6090 |
| No log        | 0.1176 | 8    | 1.2192          | 0.0059  | 1.2192 |
| No log        | 0.1471 | 10   | 0.9341          | 0.0059  | 0.9341 |
| No log        | 0.1765 | 12   | 0.7364          | 0.1416  | 0.7364 |
| No log        | 0.2059 | 14   | 0.6982          | 0.2687  | 0.6982 |
| No log        | 0.2353 | 16   | 0.6582          | 0.3372  | 0.6582 |
| No log        | 0.2647 | 18   | 0.7135          | 0.1462  | 0.7135 |
| No log        | 0.2941 | 20   | 0.6798          | 0.2206  | 0.6798 |
| No log        | 0.3235 | 22   | 0.5955          | 0.3915  | 0.5955 |
| No log        | 0.3529 | 24   | 0.6219          | 0.3622  | 0.6219 |
| No log        | 0.3824 | 26   | 0.5703          | 0.3992  | 0.5703 |
| No log        | 0.4118 | 28   | 0.5099          | 0.4137  | 0.5099 |
| No log        | 0.4412 | 30   | 0.5199          | 0.4116  | 0.5199 |
| No log        | 0.4706 | 32   | 0.5644          | 0.4094  | 0.5644 |
| No log        | 0.5    | 34   | 0.5463          | 0.4109  | 0.5463 |
| No log        | 0.5294 | 36   | 0.4851          | 0.3907  | 0.4851 |
| No log        | 0.5588 | 38   | 0.4850          | 0.4156  | 0.4850 |
| No log        | 0.5882 | 40   | 0.5052          | 0.4150  | 0.5052 |
| No log        | 0.6176 | 42   | 0.4521          | 0.4313  | 0.4521 |
| No log        | 0.6471 | 44   | 0.4285          | 0.3921  | 0.4285 |
| No log        | 0.6765 | 46   | 0.4230          | 0.4170  | 0.4230 |
| No log        | 0.7059 | 48   | 0.4622          | 0.4317  | 0.4622 |
| No log        | 0.7353 | 50   | 0.5184          | 0.4332  | 0.5184 |
| No log        | 0.7647 | 52   | 0.5387          | 0.4339  | 0.5387 |
| No log        | 0.7941 | 54   | 0.5345          | 0.4536  | 0.5345 |
| No log        | 0.8235 | 56   | 0.4593          | 0.4674  | 0.4593 |
| No log        | 0.8529 | 58   | 0.4454          | 0.5019  | 0.4454 |
| No log        | 0.8824 | 60   | 0.5377          | 0.6217  | 0.5377 |
| No log        | 0.9118 | 62   | 0.7227          | 0.6473  | 0.7227 |
| No log        | 0.9412 | 64   | 0.6967          | 0.5892  | 0.6967 |
| No log        | 0.9706 | 66   | 0.4789          | 0.5329  | 0.4789 |
| No log        | 1.0    | 68   | 0.4475          | 0.5381  | 0.4475 |
| No log        | 1.0294 | 70   | 0.4883          | 0.5168  | 0.4883 |
| No log        | 1.0588 | 72   | 0.4881          | 0.5379  | 0.4881 |
| No log        | 1.0882 | 74   | 0.4513          | 0.5417  | 0.4513 |
| No log        | 1.1176 | 76   | 0.4886          | 0.5726  | 0.4886 |
| No log        | 1.1471 | 78   | 0.5485          | 0.6355  | 0.5485 |
| No log        | 1.1765 | 80   | 0.5511          | 0.6223  | 0.5511 |
| No log        | 1.2059 | 82   | 0.4701          | 0.5498  | 0.4701 |
| No log        | 1.2353 | 84   | 0.4336          | 0.5411  | 0.4336 |
| No log        | 1.2647 | 86   | 0.4810          | 0.5556  | 0.4810 |
| No log        | 1.2941 | 88   | 0.4668          | 0.5530  | 0.4668 |
| No log        | 1.3235 | 90   | 0.3993          | 0.5542  | 0.3993 |
| No log        | 1.3529 | 92   | 0.4006          | 0.5528  | 0.4006 |
| No log        | 1.3824 | 94   | 0.4392          | 0.5585  | 0.4392 |
| No log        | 1.4118 | 96   | 0.4205          | 0.5438  | 0.4205 |
| No log        | 1.4412 | 98   | 0.4390          | 0.5543  | 0.4390 |
| No log        | 1.4706 | 100  | 0.5557          | 0.6166  | 0.5557 |
| No log        | 1.5    | 102  | 0.5909          | 0.7254  | 0.5909 |
| No log        | 1.5294 | 104  | 0.5077          | 0.7405  | 0.5077 |
| No log        | 1.5588 | 106  | 0.4091          | 0.6640  | 0.4091 |
| No log        | 1.5882 | 108  | 0.4592          | 0.6912  | 0.4592 |
| No log        | 1.6176 | 110  | 0.6420          | 0.7297  | 0.6420 |
| No log        | 1.6471 | 112  | 0.7219          | 0.7019  | 0.7219 |
| No log        | 1.6765 | 114  | 0.5558          | 0.7405  | 0.5558 |
| No log        | 1.7059 | 116  | 0.3751          | 0.6338  | 0.3751 |
| No log        | 1.7353 | 118  | 0.3983          | 0.5147  | 0.3983 |
| No log        | 1.7647 | 120  | 0.3774          | 0.5266  | 0.3774 |
| No log        | 1.7941 | 122  | 0.4202          | 0.6608  | 0.4202 |
| No log        | 1.8235 | 124  | 0.5371          | 0.6994  | 0.5371 |
| No log        | 1.8529 | 126  | 0.4793          | 0.5619  | 0.4793 |
| No log        | 1.8824 | 128  | 0.4083          | 0.5331  | 0.4083 |
| No log        | 1.9118 | 130  | 0.4079          | 0.5321  | 0.4079 |
| No log        | 1.9412 | 132  | 0.3693          | 0.5479  | 0.3693 |
| No log        | 1.9706 | 134  | 0.3606          | 0.5525  | 0.3606 |
| No log        | 2.0    | 136  | 0.3729          | 0.5439  | 0.3729 |
| No log        | 2.0294 | 138  | 0.4628          | 0.5409  | 0.4628 |
| No log        | 2.0588 | 140  | 0.5351          | 0.6159  | 0.5351 |
| No log        | 2.0882 | 142  | 0.5333          | 0.6968  | 0.5333 |
| No log        | 2.1176 | 144  | 0.4991          | 0.7022  | 0.4991 |
| No log        | 2.1471 | 146  | 0.5203          | 0.7229  | 0.5203 |
| No log        | 2.1765 | 148  | 0.5320          | 0.7349  | 0.5320 |
| No log        | 2.2059 | 150  | 0.5657          | 0.7320  | 0.5657 |
| No log        | 2.2353 | 152  | 0.4832          | 0.7194  | 0.4832 |
| No log        | 2.2647 | 154  | 0.4500          | 0.7072  | 0.4500 |
| No log        | 2.2941 | 156  | 0.4751          | 0.7137  | 0.4751 |
| No log        | 2.3235 | 158  | 0.5329          | 0.7259  | 0.5329 |
| No log        | 2.3529 | 160  | 0.6005          | 0.7169  | 0.6005 |
| No log        | 2.3824 | 162  | 0.5667          | 0.7338  | 0.5667 |
| No log        | 2.4118 | 164  | 0.4374          | 0.7067  | 0.4374 |
| No log        | 2.4412 | 166  | 0.4124          | 0.6831  | 0.4124 |
| No log        | 2.4706 | 168  | 0.4354          | 0.6921  | 0.4354 |
| No log        | 2.5    | 170  | 0.4397          | 0.6839  | 0.4397 |
| No log        | 2.5294 | 172  | 0.4214          | 0.6446  | 0.4214 |
| No log        | 2.5588 | 174  | 0.4354          | 0.6473  | 0.4354 |
| No log        | 2.5882 | 176  | 0.4596          | 0.6485  | 0.4596 |
| No log        | 2.6176 | 178  | 0.4180          | 0.6243  | 0.4180 |
| No log        | 2.6471 | 180  | 0.4142          | 0.6336  | 0.4142 |
| No log        | 2.6765 | 182  | 0.3653          | 0.5950  | 0.3653 |
| No log        | 2.7059 | 184  | 0.3595          | 0.6087  | 0.3595 |
| No log        | 2.7353 | 186  | 0.3716          | 0.6402  | 0.3716 |
| No log        | 2.7647 | 188  | 0.4491          | 0.7032  | 0.4491 |
| No log        | 2.7941 | 190  | 0.4727          | 0.7077  | 0.4727 |
| No log        | 2.8235 | 192  | 0.4595          | 0.6925  | 0.4595 |
| No log        | 2.8529 | 194  | 0.4305          | 0.6658  | 0.4305 |
| No log        | 2.8824 | 196  | 0.3810          | 0.6041  | 0.3810 |
| No log        | 2.9118 | 198  | 0.3683          | 0.5923  | 0.3683 |
| No log        | 2.9412 | 200  | 0.3942          | 0.6730  | 0.3942 |
| No log        | 2.9706 | 202  | 0.5140          | 0.7149  | 0.5140 |
| No log        | 3.0    | 204  | 0.5940          | 0.7421  | 0.5940 |
| No log        | 3.0294 | 206  | 0.5623          | 0.7362  | 0.5623 |
| No log        | 3.0588 | 208  | 0.4522          | 0.7173  | 0.4522 |
| No log        | 3.0882 | 210  | 0.4223          | 0.7002  | 0.4223 |
| No log        | 3.1176 | 212  | 0.3981          | 0.6730  | 0.3981 |
| No log        | 3.1471 | 214  | 0.4195          | 0.6878  | 0.4195 |
| No log        | 3.1765 | 216  | 0.4291          | 0.7051  | 0.4291 |
| No log        | 3.2059 | 218  | 0.4505          | 0.7132  | 0.4505 |
| No log        | 3.2353 | 220  | 0.4684          | 0.7186  | 0.4684 |
| No log        | 3.2647 | 222  | 0.4655          | 0.7164  | 0.4655 |
| No log        | 3.2941 | 224  | 0.5119          | 0.7264  | 0.5119 |
| No log        | 3.3235 | 226  | 0.5486          | 0.7260  | 0.5486 |
| No log        | 3.3529 | 228  | 0.4989          | 0.7182  | 0.4989 |
| No log        | 3.3824 | 230  | 0.4513          | 0.7010  | 0.4513 |
| No log        | 3.4118 | 232  | 0.4255          | 0.6863  | 0.4255 |
| No log        | 3.4412 | 234  | 0.4379          | 0.6912  | 0.4379 |
| No log        | 3.4706 | 236  | 0.4393          | 0.6995  | 0.4393 |
| No log        | 3.5    | 238  | 0.4728          | 0.7068  | 0.4728 |
| No log        | 3.5294 | 240  | 0.5134          | 0.7080  | 0.5134 |
| No log        | 3.5588 | 242  | 0.4998          | 0.7118  | 0.4998 |
| No log        | 3.5882 | 244  | 0.4398          | 0.6887  | 0.4398 |
| No log        | 3.6176 | 246  | 0.3836          | 0.6619  | 0.3836 |
| No log        | 3.6471 | 248  | 0.3718          | 0.6220  | 0.3718 |
| No log        | 3.6765 | 250  | 0.3892          | 0.6566  | 0.3892 |
| No log        | 3.7059 | 252  | 0.4497          | 0.6913  | 0.4497 |
| No log        | 3.7353 | 254  | 0.5352          | 0.7154  | 0.5352 |
| No log        | 3.7647 | 256  | 0.5511          | 0.7103  | 0.5511 |
| No log        | 3.7941 | 258  | 0.5689          | 0.7135  | 0.5689 |
| No log        | 3.8235 | 260  | 0.5268          | 0.7091  | 0.5268 |
| No log        | 3.8529 | 262  | 0.4670          | 0.7000  | 0.4670 |
| No log        | 3.8824 | 264  | 0.4545          | 0.7162  | 0.4545 |
| No log        | 3.9118 | 266  | 0.4777          | 0.7152  | 0.4777 |
| No log        | 3.9412 | 268  | 0.4792          | 0.7175  | 0.4792 |
| No log        | 3.9706 | 270  | 0.4656          | 0.7290  | 0.4656 |
| No log        | 4.0    | 272  | 0.4576          | 0.7216  | 0.4576 |
| No log        | 4.0294 | 274  | 0.4727          | 0.7174  | 0.4727 |
| No log        | 4.0588 | 276  | 0.4847          | 0.7224  | 0.4847 |
| No log        | 4.0882 | 278  | 0.4756          | 0.7155  | 0.4756 |
| No log        | 4.1176 | 280  | 0.5117          | 0.7247  | 0.5117 |
| No log        | 4.1471 | 282  | 0.5247          | 0.7208  | 0.5247 |
| No log        | 4.1765 | 284  | 0.4861          | 0.7141  | 0.4861 |
| No log        | 4.2059 | 286  | 0.4775          | 0.7165  | 0.4775 |
| No log        | 4.2353 | 288  | 0.4770          | 0.7156  | 0.4770 |
| No log        | 4.2647 | 290  | 0.4946          | 0.7130  | 0.4946 |
| No log        | 4.2941 | 292  | 0.5091          | 0.7222  | 0.5091 |
| No log        | 4.3235 | 294  | 0.4794          | 0.7122  | 0.4794 |
| No log        | 4.3529 | 296  | 0.4423          | 0.7225  | 0.4423 |
| No log        | 4.3824 | 298  | 0.4383          | 0.7204  | 0.4383 |
| No log        | 4.4118 | 300  | 0.4611          | 0.7092  | 0.4611 |
| No log        | 4.4412 | 302  | 0.4751          | 0.7190  | 0.4751 |
| No log        | 4.4706 | 304  | 0.4970          | 0.7180  | 0.4970 |
| No log        | 4.5    | 306  | 0.4892          | 0.7157  | 0.4892 |
| No log        | 4.5294 | 308  | 0.4820          | 0.7191  | 0.4820 |
| No log        | 4.5588 | 310  | 0.4822          | 0.7184  | 0.4822 |
| No log        | 4.5882 | 312  | 0.4700          | 0.7172  | 0.4700 |
| No log        | 4.6176 | 314  | 0.4430          | 0.7173  | 0.4430 |
| No log        | 4.6471 | 316  | 0.4170          | 0.7125  | 0.4170 |
| No log        | 4.6765 | 318  | 0.4099          | 0.7117  | 0.4099 |
| No log        | 4.7059 | 320  | 0.4161          | 0.7093  | 0.4161 |
| No log        | 4.7353 | 322  | 0.4317          | 0.7169  | 0.4317 |
| No log        | 4.7647 | 324  | 0.4483          | 0.7169  | 0.4483 |
| No log        | 4.7941 | 326  | 0.4626          | 0.7199  | 0.4626 |
| No log        | 4.8235 | 328  | 0.4702          | 0.7156  | 0.4702 |
| No log        | 4.8529 | 330  | 0.4810          | 0.7156  | 0.4810 |
| No log        | 4.8824 | 332  | 0.4882          | 0.7155  | 0.4882 |
| No log        | 4.9118 | 334  | 0.4916          | 0.7194  | 0.4916 |
| No log        | 4.9412 | 336  | 0.4893          | 0.7194  | 0.4893 |
| No log        | 4.9706 | 338  | 0.4884          | 0.7157  | 0.4884 |
| No log        | 5.0    | 340  | 0.4886          | 0.7157  | 0.4886 |
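
The "No log" entries in the training-loss column are the Trainer's placeholder for steps at which no training loss had been logged; with only 340 optimization steps in total and the Trainer's default logging interval of 500 steps, the training loss was presumably never recorded.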

### Framework versions

- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1