# bert_baseline_language_task3_fold0
This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.3754
- Qwk: 0.6907
- Mse: 0.3751
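
Qwk here is presumably the quadratic weighted kappa (Cohen's kappa with quadratic weights), the usual agreement metric for ordinal scoring tasks. A minimal sketch of computing it with scikit-learn follows; the labels are illustrative only, not drawn from this evaluation set:

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative ordinal labels; not taken from this model's evaluation data.
y_true = [0, 1, 2, 2, 3, 1]
y_pred = [0, 1, 1, 2, 3, 2]

# Quadratic weights penalize large ordinal disagreements more heavily
# than off-by-one errors.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
print(f"QWK: {qwk:.4f}")
```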
## Model description
More information needed
## Intended uses & limitations
More information needed
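
No usage guidance was provided by the author. The sketch below shows one plausible way to load the checkpoint for inference; it assumes a sequence-classification head (the reported Qwk/Mse metrics suggest ordinal scoring), which the card does not confirm:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "salbatarni/bert_baseline_language_task3_fold0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# Assumption: the saved config defines the head (e.g. a regression output);
# the card does not state the label setup.
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "Sample essay text to score."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # raw output; interpretation depends on the training setup
```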
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
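
For reference, a sketch of these settings expressed as Hugging Face `TrainingArguments`; the `output_dir` value and anything not listed above are assumptions, not taken from the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_baseline_language_task3_fold0",  # assumed, not stated
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```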
### Training results
Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
---|---|---|---|---|---|
No log | 0.0308 | 2 | 1.2242 | 0.0 | 1.2224 |
No log | 0.0615 | 4 | 0.9994 | 0.1070 | 0.9981 |
No log | 0.0923 | 6 | 0.8925 | 0.0081 | 0.8916 |
No log | 0.1231 | 8 | 0.7646 | 0.0081 | 0.7641 |
No log | 0.1538 | 10 | 0.7245 | 0.0 | 0.7243 |
No log | 0.1846 | 12 | 0.7079 | 0.0053 | 0.7078 |
No log | 0.2154 | 14 | 0.6645 | 0.0131 | 0.6643 |
No log | 0.2462 | 16 | 0.5944 | 0.0234 | 0.5942 |
No log | 0.2769 | 18 | 0.5628 | 0.1223 | 0.5624 |
No log | 0.3077 | 20 | 0.5125 | 0.1772 | 0.5121 |
No log | 0.3385 | 22 | 0.4632 | 0.3805 | 0.4629 |
No log | 0.3692 | 24 | 0.5232 | 0.5661 | 0.5231 |
No log | 0.4 | 26 | 0.5584 | 0.5759 | 0.5583 |
No log | 0.4308 | 28 | 0.4575 | 0.6357 | 0.4573 |
No log | 0.4615 | 30 | 0.3987 | 0.4827 | 0.3982 |
No log | 0.4923 | 32 | 0.4462 | 0.3869 | 0.4456 |
No log | 0.5231 | 34 | 0.3838 | 0.5019 | 0.3835 |
No log | 0.5538 | 36 | 0.4091 | 0.6159 | 0.4089 |
No log | 0.5846 | 38 | 0.4442 | 0.6436 | 0.4440 |
No log | 0.6154 | 40 | 0.4124 | 0.6214 | 0.4121 |
No log | 0.6462 | 42 | 0.3823 | 0.5072 | 0.3820 |
No log | 0.6769 | 44 | 0.3787 | 0.4706 | 0.3784 |
No log | 0.7077 | 46 | 0.3722 | 0.5682 | 0.3719 |
No log | 0.7385 | 48 | 0.3676 | 0.6364 | 0.3673 |
No log | 0.7692 | 50 | 0.4047 | 0.6767 | 0.4045 |
No log | 0.8 | 52 | 0.4091 | 0.6848 | 0.4089 |
No log | 0.8308 | 54 | 0.3401 | 0.6635 | 0.3399 |
No log | 0.8615 | 56 | 0.3391 | 0.6720 | 0.3389 |
No log | 0.8923 | 58 | 0.3545 | 0.6704 | 0.3543 |
No log | 0.9231 | 60 | 0.3525 | 0.6816 | 0.3522 |
No log | 0.9538 | 62 | 0.3367 | 0.6683 | 0.3364 |
No log | 0.9846 | 64 | 0.3287 | 0.6747 | 0.3284 |
No log | 1.0154 | 66 | 0.3196 | 0.6693 | 0.3193 |
No log | 1.0462 | 68 | 0.3390 | 0.6772 | 0.3386 |
No log | 1.0769 | 70 | 0.3883 | 0.6838 | 0.3879 |
No log | 1.1077 | 72 | 0.4160 | 0.6845 | 0.4157 |
No log | 1.1385 | 74 | 0.3657 | 0.6780 | 0.3653 |
No log | 1.1692 | 76 | 0.3606 | 0.6748 | 0.3602 |
No log | 1.2 | 78 | 0.3383 | 0.6823 | 0.3379 |
No log | 1.2308 | 80 | 0.3333 | 0.6679 | 0.3330 |
No log | 1.2615 | 82 | 0.3474 | 0.6779 | 0.3472 |
No log | 1.2923 | 84 | 0.3342 | 0.6699 | 0.3339 |
No log | 1.3231 | 86 | 0.3209 | 0.6659 | 0.3206 |
No log | 1.3538 | 88 | 0.3328 | 0.6880 | 0.3324 |
No log | 1.3846 | 90 | 0.3587 | 0.6683 | 0.3583 |
No log | 1.4154 | 92 | 0.3319 | 0.6775 | 0.3315 |
No log | 1.4462 | 94 | 0.3085 | 0.6781 | 0.3081 |
No log | 1.4769 | 96 | 0.3201 | 0.6794 | 0.3196 |
No log | 1.5077 | 98 | 0.3097 | 0.6634 | 0.3093 |
No log | 1.5385 | 100 | 0.3188 | 0.6803 | 0.3184 |
No log | 1.5692 | 102 | 0.3422 | 0.6848 | 0.3417 |
No log | 1.6 | 104 | 0.3741 | 0.6911 | 0.3737 |
No log | 1.6308 | 106 | 0.3292 | 0.6852 | 0.3288 |
No log | 1.6615 | 108 | 0.3160 | 0.6812 | 0.3156 |
No log | 1.6923 | 110 | 0.3606 | 0.6927 | 0.3602 |
No log | 1.7231 | 112 | 0.3580 | 0.6919 | 0.3577 |
No log | 1.7538 | 114 | 0.3592 | 0.6905 | 0.3588 |
No log | 1.7846 | 116 | 0.3768 | 0.7016 | 0.3764 |
No log | 1.8154 | 118 | 0.3660 | 0.6905 | 0.3656 |
No log | 1.8462 | 120 | 0.3259 | 0.6639 | 0.3255 |
No log | 1.8769 | 122 | 0.3200 | 0.6024 | 0.3195 |
No log | 1.9077 | 124 | 0.3308 | 0.6202 | 0.3304 |
No log | 1.9385 | 126 | 0.3786 | 0.6774 | 0.3782 |
No log | 1.9692 | 128 | 0.4478 | 0.6714 | 0.4475 |
No log | 2.0 | 130 | 0.4483 | 0.6729 | 0.4480 |
No log | 2.0308 | 132 | 0.3755 | 0.6796 | 0.3752 |
No log | 2.0615 | 134 | 0.3187 | 0.6503 | 0.3184 |
No log | 2.0923 | 136 | 0.3057 | 0.6315 | 0.3054 |
No log | 2.1231 | 138 | 0.3050 | 0.6006 | 0.3047 |
No log | 2.1538 | 140 | 0.2987 | 0.6469 | 0.2984 |
No log | 2.1846 | 142 | 0.3535 | 0.6795 | 0.3532 |
No log | 2.2154 | 144 | 0.4901 | 0.6855 | 0.4899 |
No log | 2.2462 | 146 | 0.5180 | 0.6793 | 0.5178 |
No log | 2.2769 | 148 | 0.4216 | 0.6863 | 0.4214 |
No log | 2.3077 | 150 | 0.3125 | 0.6716 | 0.3122 |
No log | 2.3385 | 152 | 0.3028 | 0.6320 | 0.3025 |
No log | 2.3692 | 154 | 0.3038 | 0.6614 | 0.3035 |
No log | 2.4 | 156 | 0.3478 | 0.6844 | 0.3476 |
No log | 2.4308 | 158 | 0.4110 | 0.6824 | 0.4107 |
No log | 2.4615 | 160 | 0.4052 | 0.6829 | 0.4050 |
No log | 2.4923 | 162 | 0.3788 | 0.6831 | 0.3785 |
No log | 2.5231 | 164 | 0.3302 | 0.6933 | 0.3299 |
No log | 2.5538 | 166 | 0.3033 | 0.6664 | 0.3030 |
No log | 2.5846 | 168 | 0.3146 | 0.6732 | 0.3143 |
No log | 2.6154 | 170 | 0.3847 | 0.6901 | 0.3844 |
No log | 2.6462 | 172 | 0.5298 | 0.7038 | 0.5295 |
No log | 2.6769 | 174 | 0.5863 | 0.6870 | 0.5860 |
No log | 2.7077 | 176 | 0.5109 | 0.7025 | 0.5106 |
No log | 2.7385 | 178 | 0.3761 | 0.6916 | 0.3757 |
No log | 2.7692 | 180 | 0.3138 | 0.6540 | 0.3135 |
No log | 2.8 | 182 | 0.3097 | 0.6491 | 0.3093 |
No log | 2.8308 | 184 | 0.3323 | 0.6756 | 0.3319 |
No log | 2.8615 | 186 | 0.3514 | 0.6914 | 0.3510 |
No log | 2.8923 | 188 | 0.3459 | 0.6780 | 0.3456 |
No log | 2.9231 | 190 | 0.3481 | 0.6898 | 0.3478 |
No log | 2.9538 | 192 | 0.3457 | 0.6902 | 0.3453 |
No log | 2.9846 | 194 | 0.3362 | 0.6855 | 0.3359 |
No log | 3.0154 | 196 | 0.3415 | 0.6766 | 0.3411 |
No log | 3.0462 | 198 | 0.3545 | 0.6832 | 0.3542 |
No log | 3.0769 | 200 | 0.3730 | 0.6756 | 0.3727 |
No log | 3.1077 | 202 | 0.3424 | 0.6790 | 0.3421 |
No log | 3.1385 | 204 | 0.3321 | 0.6623 | 0.3318 |
No log | 3.1692 | 206 | 0.3539 | 0.6765 | 0.3536 |
No log | 3.2 | 208 | 0.3660 | 0.6779 | 0.3657 |
No log | 3.2308 | 210 | 0.3551 | 0.6668 | 0.3547 |
No log | 3.2615 | 212 | 0.3455 | 0.6723 | 0.3452 |
No log | 3.2923 | 214 | 0.3780 | 0.6753 | 0.3777 |
No log | 3.3231 | 216 | 0.3938 | 0.6787 | 0.3935 |
No log | 3.3538 | 218 | 0.4286 | 0.6828 | 0.4283 |
No log | 3.3846 | 220 | 0.4587 | 0.7046 | 0.4585 |
No log | 3.4154 | 222 | 0.4166 | 0.6859 | 0.4164 |
No log | 3.4462 | 224 | 0.3735 | 0.6772 | 0.3732 |
No log | 3.4769 | 226 | 0.3867 | 0.6964 | 0.3864 |
No log | 3.5077 | 228 | 0.3875 | 0.6986 | 0.3872 |
No log | 3.5385 | 230 | 0.4188 | 0.7029 | 0.4184 |
No log | 3.5692 | 232 | 0.4099 | 0.7010 | 0.4095 |
No log | 3.6 | 234 | 0.3635 | 0.6977 | 0.3632 |
No log | 3.6308 | 236 | 0.3266 | 0.6611 | 0.3262 |
No log | 3.6615 | 238 | 0.3292 | 0.6679 | 0.3289 |
No log | 3.6923 | 240 | 0.3674 | 0.6886 | 0.3671 |
No log | 3.7231 | 242 | 0.4326 | 0.7049 | 0.4323 |
No log | 3.7538 | 244 | 0.4437 | 0.7103 | 0.4434 |
No log | 3.7846 | 246 | 0.4098 | 0.6937 | 0.4095 |
No log | 3.8154 | 248 | 0.3575 | 0.6741 | 0.3572 |
No log | 3.8462 | 250 | 0.3383 | 0.6513 | 0.3380 |
No log | 3.8769 | 252 | 0.3392 | 0.6597 | 0.3389 |
No log | 3.9077 | 254 | 0.3602 | 0.6735 | 0.3599 |
No log | 3.9385 | 256 | 0.3845 | 0.6878 | 0.3842 |
No log | 3.9692 | 258 | 0.4114 | 0.6847 | 0.4111 |
No log | 4.0 | 260 | 0.4175 | 0.6867 | 0.4172 |
No log | 4.0308 | 262 | 0.4228 | 0.6914 | 0.4225 |
No log | 4.0615 | 264 | 0.3927 | 0.6878 | 0.3924 |
No log | 4.0923 | 266 | 0.3653 | 0.6754 | 0.3650 |
No log | 4.1231 | 268 | 0.3506 | 0.6733 | 0.3502 |
No log | 4.1538 | 270 | 0.3375 | 0.6655 | 0.3372 |
No log | 4.1846 | 272 | 0.3434 | 0.6757 | 0.3431 |
No log | 4.2154 | 274 | 0.3498 | 0.6776 | 0.3495 |
No log | 4.2462 | 276 | 0.3708 | 0.6879 | 0.3705 |
No log | 4.2769 | 278 | 0.3986 | 0.6930 | 0.3983 |
No log | 4.3077 | 280 | 0.4160 | 0.7010 | 0.4156 |
No log | 4.3385 | 282 | 0.4092 | 0.6916 | 0.4089 |
No log | 4.3692 | 284 | 0.3913 | 0.6915 | 0.3910 |
No log | 4.4 | 286 | 0.3804 | 0.6846 | 0.3801 |
No log | 4.4308 | 288 | 0.3681 | 0.6801 | 0.3677 |
No log | 4.4615 | 290 | 0.3650 | 0.6782 | 0.3647 |
No log | 4.4923 | 292 | 0.3708 | 0.6789 | 0.3705 |
No log | 4.5231 | 294 | 0.3927 | 0.6859 | 0.3923 |
No log | 4.5538 | 296 | 0.4218 | 0.7032 | 0.4215 |
No log | 4.5846 | 298 | 0.4256 | 0.7120 | 0.4253 |
No log | 4.6154 | 300 | 0.4196 | 0.7025 | 0.4192 |
No log | 4.6462 | 302 | 0.4135 | 0.7006 | 0.4131 |
No log | 4.6769 | 304 | 0.4006 | 0.6976 | 0.4002 |
No log | 4.7077 | 306 | 0.3885 | 0.7019 | 0.3881 |
No log | 4.7385 | 308 | 0.3805 | 0.6907 | 0.3802 |
No log | 4.7692 | 310 | 0.3689 | 0.6900 | 0.3686 |
No log | 4.8 | 312 | 0.3638 | 0.6922 | 0.3635 |
No log | 4.8308 | 314 | 0.3640 | 0.6940 | 0.3637 |
No log | 4.8615 | 316 | 0.3680 | 0.6907 | 0.3676 |
No log | 4.8923 | 318 | 0.3709 | 0.6888 | 0.3705 |
No log | 4.9231 | 320 | 0.3737 | 0.6907 | 0.3733 |
No log | 4.9538 | 322 | 0.3749 | 0.6907 | 0.3746 |
No log | 4.9846 | 324 | 0.3754 | 0.6907 | 0.3751 |
### Framework versions
- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1
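
A small sketch for checking that a local environment matches these pinned versions; the expected strings are taken directly from the list above:

```python
import transformers, torch, datasets, tokenizers

# Versions the card reports; mismatches don't necessarily break loading,
# but matching them reproduces the original environment most closely.
expected = {
    "transformers": "4.42.3",
    "torch": "2.1.2",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]}, card reports {want}")
```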