bert_baseline_language_task3_fold4

This model is a fine-tuned version of google-bert/bert-base-cased on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the metrics):

  • Loss: 0.4340
  • Qwk (quadratic weighted kappa): 0.5769
  • Mse (mean squared error): 0.4343
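
The checkpoint can be loaded with the Transformers library. The sketch below is a minimal example under stated assumptions: the repository id salbatarni/bert_baseline_language_task3_fold4 (the repository this card belongs to) and a single-output regression head (num_labels=1). The head type is not documented in this card; it is inferred only from the MSE/QWK metrics.

```python
# Minimal loading sketch. Assumption (not stated in the card): the checkpoint is a
# single-output regression model (num_labels=1) that scores a piece of text.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "salbatarni/bert_baseline_language_task3_fold4"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "Example input text to score."  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # single score under the assumed regression head
```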

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
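
As a rough reproduction aid, the sketch below maps the listed hyperparameters onto Hugging Face TrainingArguments. The output_dir name is illustrative, and the dataset, model head, and metric computation are not part of this card, so they are omitted.

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# The training data, model head, and compute_metrics function are not documented
# in this card and are therefore not shown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_baseline_language_task3_fold4",  # illustrative name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer defaults:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```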

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| No log | 0.0308 | 2 | 0.9109 | 0.1135 | 0.9120 |
| No log | 0.0615 | 4 | 0.7302 | 0.0091 | 0.7310 |
| No log | 0.0923 | 6 | 0.6473 | 0.0 | 0.6480 |
| No log | 0.1231 | 8 | 0.6464 | 0.0 | 0.6469 |
| No log | 0.1538 | 10 | 0.6452 | 0.0 | 0.6458 |
| No log | 0.1846 | 12 | 0.6121 | 0.0 | 0.6126 |
| No log | 0.2154 | 14 | 0.6376 | 0.0 | 0.6382 |
| No log | 0.2462 | 16 | 0.6836 | 0.0 | 0.6841 |
| No log | 0.2769 | 18 | 0.7015 | 0.0 | 0.7020 |
| No log | 0.3077 | 20 | 0.6505 | 0.0 | 0.6510 |
| No log | 0.3385 | 22 | 0.5816 | 0.0 | 0.5822 |
| No log | 0.3692 | 24 | 0.5565 | 0.0 | 0.5571 |
| No log | 0.4 | 26 | 0.5383 | 0.0 | 0.5389 |
| No log | 0.4308 | 28 | 0.5152 | 0.0091 | 0.5157 |
| No log | 0.4615 | 30 | 0.4960 | 0.0270 | 0.4965 |
| No log | 0.4923 | 32 | 0.4781 | 0.1041 | 0.4785 |
| No log | 0.5231 | 34 | 0.4595 | 0.1801 | 0.4599 |
| No log | 0.5538 | 36 | 0.4648 | 0.2938 | 0.4653 |
| No log | 0.5846 | 38 | 0.4753 | 0.4514 | 0.4758 |
| No log | 0.6154 | 40 | 0.4224 | 0.4998 | 0.4225 |
| No log | 0.6462 | 42 | 0.4749 | 0.3528 | 0.4746 |
| No log | 0.6769 | 44 | 0.5212 | 0.3219 | 0.5210 |
| No log | 0.7077 | 46 | 0.4233 | 0.3754 | 0.4233 |
| No log | 0.7385 | 48 | 0.4789 | 0.5565 | 0.4792 |
| No log | 0.7692 | 50 | 0.6057 | 0.5461 | 0.6062 |
| No log | 0.8 | 52 | 0.7260 | 0.5404 | 0.7266 |
| No log | 0.8308 | 54 | 0.6614 | 0.5465 | 0.6619 |
| No log | 0.8615 | 56 | 0.5122 | 0.4793 | 0.5127 |
| No log | 0.8923 | 58 | 0.4503 | 0.2917 | 0.4507 |
| No log | 0.9231 | 60 | 0.4273 | 0.2707 | 0.4277 |
| No log | 0.9538 | 62 | 0.4133 | 0.3275 | 0.4136 |
| No log | 0.9846 | 64 | 0.4075 | 0.4202 | 0.4076 |
| No log | 1.0154 | 66 | 0.4159 | 0.5118 | 0.4160 |
| No log | 1.0462 | 68 | 0.4456 | 0.5714 | 0.4459 |
| No log | 1.0769 | 70 | 0.4420 | 0.5985 | 0.4424 |
| No log | 1.1077 | 72 | 0.4074 | 0.5653 | 0.4076 |
| No log | 1.1385 | 74 | 0.3829 | 0.4871 | 0.3830 |
| No log | 1.1692 | 76 | 0.3715 | 0.4828 | 0.3717 |
| No log | 1.2 | 78 | 0.3620 | 0.4863 | 0.3623 |
| No log | 1.2308 | 80 | 0.3671 | 0.5415 | 0.3674 |
| No log | 1.2615 | 82 | 0.4030 | 0.5946 | 0.4033 |
| No log | 1.2923 | 84 | 0.4396 | 0.5992 | 0.4399 |
| No log | 1.3231 | 86 | 0.4332 | 0.5951 | 0.4334 |
| No log | 1.3538 | 88 | 0.3744 | 0.5906 | 0.3747 |
| No log | 1.3846 | 90 | 0.3379 | 0.5145 | 0.3381 |
| No log | 1.4154 | 92 | 0.3385 | 0.5125 | 0.3387 |
| No log | 1.4462 | 94 | 0.3708 | 0.5662 | 0.3711 |
| No log | 1.4769 | 96 | 0.4722 | 0.6023 | 0.4724 |
| No log | 1.5077 | 98 | 0.4995 | 0.6024 | 0.4998 |
| No log | 1.5385 | 100 | 0.4179 | 0.5810 | 0.4182 |
| No log | 1.5692 | 102 | 0.3509 | 0.5168 | 0.3512 |
| No log | 1.6 | 104 | 0.3364 | 0.5036 | 0.3367 |
| No log | 1.6308 | 106 | 0.3356 | 0.5394 | 0.3359 |
| No log | 1.6615 | 108 | 0.3813 | 0.6251 | 0.3815 |
| No log | 1.6923 | 110 | 0.4242 | 0.6416 | 0.4245 |
| No log | 1.7231 | 112 | 0.4424 | 0.6371 | 0.4427 |
| No log | 1.7538 | 114 | 0.4157 | 0.6542 | 0.4160 |
| No log | 1.7846 | 116 | 0.3935 | 0.6434 | 0.3937 |
| No log | 1.8154 | 118 | 0.3787 | 0.6387 | 0.3790 |
| No log | 1.8462 | 120 | 0.3835 | 0.6349 | 0.3837 |
| No log | 1.8769 | 122 | 0.4416 | 0.6288 | 0.4419 |
| No log | 1.9077 | 124 | 0.5026 | 0.6156 | 0.5028 |
| No log | 1.9385 | 126 | 0.4854 | 0.6136 | 0.4857 |
| No log | 1.9692 | 128 | 0.4068 | 0.6195 | 0.4071 |
| No log | 2.0 | 130 | 0.3600 | 0.5947 | 0.3603 |
| No log | 2.0308 | 132 | 0.3585 | 0.5905 | 0.3587 |
| No log | 2.0615 | 134 | 0.3569 | 0.5810 | 0.3570 |
| No log | 2.0923 | 136 | 0.3751 | 0.5943 | 0.3752 |
| No log | 2.1231 | 138 | 0.4052 | 0.6099 | 0.4054 |
| No log | 2.1538 | 140 | 0.4135 | 0.6173 | 0.4137 |
| No log | 2.1846 | 142 | 0.3853 | 0.6116 | 0.3854 |
| No log | 2.2154 | 144 | 0.3977 | 0.6213 | 0.3978 |
| No log | 2.2462 | 146 | 0.4319 | 0.6339 | 0.4321 |
| No log | 2.2769 | 148 | 0.4184 | 0.6320 | 0.4185 |
| No log | 2.3077 | 150 | 0.4158 | 0.6281 | 0.4159 |
| No log | 2.3385 | 152 | 0.4513 | 0.6319 | 0.4514 |
| No log | 2.3692 | 154 | 0.4589 | 0.6325 | 0.4590 |
| No log | 2.4 | 156 | 0.4841 | 0.6217 | 0.4843 |
| No log | 2.4308 | 158 | 0.4072 | 0.6167 | 0.4073 |
| No log | 2.4615 | 160 | 0.3546 | 0.5723 | 0.3546 |
| No log | 2.4923 | 162 | 0.3616 | 0.5908 | 0.3617 |
| No log | 2.5231 | 164 | 0.3942 | 0.5987 | 0.3944 |
| No log | 2.5538 | 166 | 0.4663 | 0.6275 | 0.4666 |
| No log | 2.5846 | 168 | 0.4424 | 0.6243 | 0.4426 |
| No log | 2.6154 | 170 | 0.4280 | 0.6225 | 0.4282 |
| No log | 2.6462 | 172 | 0.3988 | 0.6006 | 0.3990 |
| No log | 2.6769 | 174 | 0.4066 | 0.5998 | 0.4068 |
| No log | 2.7077 | 176 | 0.3877 | 0.5780 | 0.3879 |
| No log | 2.7385 | 178 | 0.4221 | 0.5808 | 0.4223 |
| No log | 2.7692 | 180 | 0.4523 | 0.5856 | 0.4526 |
| No log | 2.8 | 182 | 0.4291 | 0.5832 | 0.4294 |
| No log | 2.8308 | 184 | 0.3831 | 0.5593 | 0.3833 |
| No log | 2.8615 | 186 | 0.3714 | 0.5692 | 0.3716 |
| No log | 2.8923 | 188 | 0.3999 | 0.5857 | 0.4001 |
| No log | 2.9231 | 190 | 0.4673 | 0.6274 | 0.4676 |
| No log | 2.9538 | 192 | 0.5013 | 0.6144 | 0.5016 |
| No log | 2.9846 | 194 | 0.4775 | 0.6045 | 0.4777 |
| No log | 3.0154 | 196 | 0.4092 | 0.6073 | 0.4094 |
| No log | 3.0462 | 198 | 0.3818 | 0.5803 | 0.3820 |
| No log | 3.0769 | 200 | 0.3905 | 0.5896 | 0.3907 |
| No log | 3.1077 | 202 | 0.4430 | 0.6144 | 0.4432 |
| No log | 3.1385 | 204 | 0.4667 | 0.6174 | 0.4669 |
| No log | 3.1692 | 206 | 0.4787 | 0.6228 | 0.4789 |
| No log | 3.2 | 208 | 0.4414 | 0.6197 | 0.4416 |
| No log | 3.2308 | 210 | 0.3774 | 0.6170 | 0.3775 |
| No log | 3.2615 | 212 | 0.3546 | 0.5664 | 0.3546 |
| No log | 3.2923 | 214 | 0.3575 | 0.5688 | 0.3576 |
| No log | 3.3231 | 216 | 0.3930 | 0.6024 | 0.3932 |
| No log | 3.3538 | 218 | 0.4818 | 0.6159 | 0.4821 |
| No log | 3.3846 | 220 | 0.5077 | 0.6107 | 0.5080 |
| No log | 3.4154 | 222 | 0.4607 | 0.6259 | 0.4609 |
| No log | 3.4462 | 224 | 0.4252 | 0.6139 | 0.4254 |
| No log | 3.4769 | 226 | 0.3971 | 0.6024 | 0.3974 |
| No log | 3.5077 | 228 | 0.3929 | 0.5945 | 0.3931 |
| No log | 3.5385 | 230 | 0.3988 | 0.5954 | 0.3990 |
| No log | 3.5692 | 232 | 0.4157 | 0.5997 | 0.4160 |
| No log | 3.6 | 234 | 0.4042 | 0.5909 | 0.4044 |
| No log | 3.6308 | 236 | 0.3802 | 0.5811 | 0.3804 |
| No log | 3.6615 | 238 | 0.3662 | 0.5680 | 0.3663 |
| No log | 3.6923 | 240 | 0.3772 | 0.5796 | 0.3774 |
| No log | 3.7231 | 242 | 0.4299 | 0.5936 | 0.4302 |
| No log | 3.7538 | 244 | 0.5352 | 0.6100 | 0.5356 |
| No log | 3.7846 | 246 | 0.6217 | 0.6080 | 0.6221 |
| No log | 3.8154 | 248 | 0.6413 | 0.5945 | 0.6417 |
| No log | 3.8462 | 250 | 0.5878 | 0.6027 | 0.5882 |
| No log | 3.8769 | 252 | 0.4979 | 0.6006 | 0.4982 |
| No log | 3.9077 | 254 | 0.4257 | 0.5829 | 0.4260 |
| No log | 3.9385 | 256 | 0.3851 | 0.5768 | 0.3853 |
| No log | 3.9692 | 258 | 0.3790 | 0.5720 | 0.3792 |
| No log | 4.0 | 260 | 0.3900 | 0.5769 | 0.3902 |
| No log | 4.0308 | 262 | 0.4249 | 0.6061 | 0.4252 |
| No log | 4.0615 | 264 | 0.4773 | 0.6123 | 0.4776 |
| No log | 4.0923 | 266 | 0.5001 | 0.6182 | 0.5005 |
| No log | 4.1231 | 268 | 0.4873 | 0.6129 | 0.4877 |
| No log | 4.1538 | 270 | 0.4502 | 0.6009 | 0.4505 |
| No log | 4.1846 | 272 | 0.4219 | 0.5944 | 0.4222 |
| No log | 4.2154 | 274 | 0.4092 | 0.5796 | 0.4095 |
| No log | 4.2462 | 276 | 0.4165 | 0.5812 | 0.4168 |
| No log | 4.2769 | 278 | 0.4317 | 0.6021 | 0.4320 |
| No log | 4.3077 | 280 | 0.4375 | 0.5986 | 0.4378 |
| No log | 4.3385 | 282 | 0.4378 | 0.5945 | 0.4381 |
| No log | 4.3692 | 284 | 0.4515 | 0.5955 | 0.4518 |
| No log | 4.4 | 286 | 0.4560 | 0.5918 | 0.4563 |
| No log | 4.4308 | 288 | 0.4702 | 0.5890 | 0.4705 |
| No log | 4.4615 | 290 | 0.4805 | 0.5937 | 0.4809 |
| No log | 4.4923 | 292 | 0.4797 | 0.5914 | 0.4800 |
| No log | 4.5231 | 294 | 0.4620 | 0.5818 | 0.4623 |
| No log | 4.5538 | 296 | 0.4500 | 0.5913 | 0.4503 |
| No log | 4.5846 | 298 | 0.4468 | 0.5909 | 0.4471 |
| No log | 4.6154 | 300 | 0.4467 | 0.5913 | 0.4470 |
| No log | 4.6462 | 302 | 0.4462 | 0.5913 | 0.4465 |
| No log | 4.6769 | 304 | 0.4411 | 0.5913 | 0.4414 |
| No log | 4.7077 | 306 | 0.4387 | 0.5866 | 0.4390 |
| No log | 4.7385 | 308 | 0.4366 | 0.5796 | 0.4369 |
| No log | 4.7692 | 310 | 0.4318 | 0.5703 | 0.4321 |
| No log | 4.8 | 312 | 0.4290 | 0.5699 | 0.4293 |
| No log | 4.8308 | 314 | 0.4322 | 0.5723 | 0.4324 |
| No log | 4.8615 | 316 | 0.4341 | 0.5769 | 0.4344 |
| No log | 4.8923 | 318 | 0.4340 | 0.5769 | 0.4343 |
| No log | 4.9231 | 320 | 0.4352 | 0.5769 | 0.4355 |
| No log | 4.9538 | 322 | 0.4348 | 0.5727 | 0.4351 |
| No log | 4.9846 | 324 | 0.4340 | 0.5769 | 0.4343 |
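
The Qwk and Mse columns above most likely denote quadratic weighted kappa and mean squared error on the validation split. The sketch below shows one common way to compute such metrics with scikit-learn; the actual evaluation code is not included in this card, so the library choice and the rounding step are assumptions.

```python
# Sketch: computing quadratic weighted kappa (QWK) and MSE for integer score labels.
# The use of scikit-learn and the rounding of continuous predictions are assumptions;
# the card does not document how the reported values were produced.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([1, 2, 0, 3, 2])                 # hypothetical gold scores
y_pred_cont = np.array([1.2, 1.8, 0.4, 2.6, 2.1])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred_cont)
# Cohen's kappa expects discrete labels, so continuous predictions are rounded first.
y_pred = np.rint(y_pred_cont).astype(int)
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
print(f"MSE: {mse:.4f}  QWK: {qwk:.4f}")
```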

Framework versions

  • Transformers 4.42.3
  • Pytorch 2.1.2
  • Datasets 2.20.0
  • Tokenizers 0.19.1