---
license: apache-2.0
base_model: google-bert/bert-base-cased
tags:
- generated_from_trainer
model-index:
- name: bert_baseline_prompt_adherence_task4_fold3
  results: []
---
# bert_baseline_prompt_adherence_task4_fold3
This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3007
- Qwk: 0.7134
- Mse: 0.3007
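The card does not include the evaluation code, but Qwk (quadratic weighted kappa) and Mse (mean squared error) can be computed for ordinal labels as in this minimal sketch; the function names and toy labels below are illustrative, not taken from the training script:

```python
def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Quadratic weighted kappa for integer ordinal labels in [0, num_classes)."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    row = [sum(observed[i]) for i in range(num_classes)]
    col = [sum(observed[i][j] for i in range(num_classes)) for j in range(num_classes)]
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            # Quadratic disagreement weight: 0 on the diagonal, largest in the corners
            w = (i - j) ** 2 / (num_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * row[i] * col[j] / n  # expected counts under independence
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 1]
print(quadratic_weighted_kappa(y_true, y_pred, 3))  # 0.8 on this toy example
print(mse(y_true, y_pred))                          # 0.2
```

Note that because Qwk rewards near-misses on an ordinal scale, it can move in the opposite direction from the loss, which is visible in the training log below.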
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
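With `lr_scheduler_type: linear` and no warmup specified, the learning rate decays linearly from `2e-05` to zero over training. A pure-Python sketch of that schedule (the step count of 335 is inferred from the log below, where epoch 2.0 falls at step 134, i.e. ~67 steps per epoch; it is an assumption, not a value stated in the card):

```python
def linear_lr(step, base_lr=2e-5, total_steps=335, warmup_steps=0):
    """Linear decay schedule: flat ramp during warmup, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))    # 2e-05 at the first step
print(linear_lr(167))  # roughly half the base rate mid-training
print(linear_lr(335))  # 0.0 at the final step
```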
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
|:---:|:---:|:---:|:---:|:---:|:---:|
No log | 0.0299 | 2 | 1.1652 | 0.0 | 1.1652 |
No log | 0.0597 | 4 | 0.8846 | 0.0 | 0.8846 |
No log | 0.0896 | 6 | 0.8201 | 0.0410 | 0.8201 |
No log | 0.1194 | 8 | 0.7309 | 0.3544 | 0.7309 |
No log | 0.1493 | 10 | 0.7578 | 0.3105 | 0.7578 |
No log | 0.1791 | 12 | 0.7143 | 0.3280 | 0.7143 |
No log | 0.2090 | 14 | 0.6343 | 0.3423 | 0.6343 |
No log | 0.2388 | 16 | 0.5999 | 0.3575 | 0.5999 |
No log | 0.2687 | 18 | 0.5471 | 0.3610 | 0.5471 |
No log | 0.2985 | 20 | 0.5081 | 0.3822 | 0.5081 |
No log | 0.3284 | 22 | 0.4839 | 0.3950 | 0.4839 |
No log | 0.3582 | 24 | 0.4767 | 0.4624 | 0.4767 |
No log | 0.3881 | 26 | 0.4623 | 0.4753 | 0.4623 |
No log | 0.4179 | 28 | 0.4515 | 0.5443 | 0.4515 |
No log | 0.4478 | 30 | 0.4418 | 0.5861 | 0.4418 |
No log | 0.4776 | 32 | 0.4497 | 0.6372 | 0.4497 |
No log | 0.5075 | 34 | 0.4684 | 0.6804 | 0.4684 |
No log | 0.5373 | 36 | 0.4103 | 0.5638 | 0.4103 |
No log | 0.5672 | 38 | 0.4692 | 0.4278 | 0.4692 |
No log | 0.5970 | 40 | 0.4385 | 0.4702 | 0.4385 |
No log | 0.6269 | 42 | 0.4057 | 0.6059 | 0.4057 |
No log | 0.6567 | 44 | 0.4075 | 0.6238 | 0.4075 |
No log | 0.6866 | 46 | 0.4234 | 0.6279 | 0.4234 |
No log | 0.7164 | 48 | 0.4115 | 0.6273 | 0.4115 |
No log | 0.7463 | 50 | 0.4058 | 0.6275 | 0.4058 |
No log | 0.7761 | 52 | 0.4300 | 0.5914 | 0.4300 |
No log | 0.8060 | 54 | 0.4332 | 0.6025 | 0.4332 |
No log | 0.8358 | 56 | 0.4235 | 0.6677 | 0.4235 |
No log | 0.8657 | 58 | 0.4115 | 0.7315 | 0.4115 |
No log | 0.8955 | 60 | 0.4135 | 0.7072 | 0.4135 |
No log | 0.9254 | 62 | 0.3817 | 0.6369 | 0.3817 |
No log | 0.9552 | 64 | 0.3606 | 0.5386 | 0.3606 |
No log | 0.9851 | 66 | 0.3687 | 0.4897 | 0.3687 |
No log | 1.0149 | 68 | 0.3556 | 0.5853 | 0.3556 |
No log | 1.0448 | 70 | 0.5049 | 0.7415 | 0.5049 |
No log | 1.0746 | 72 | 0.6769 | 0.7072 | 0.6769 |
No log | 1.1045 | 74 | 0.6039 | 0.7261 | 0.6039 |
No log | 1.1343 | 76 | 0.4060 | 0.7198 | 0.4060 |
No log | 1.1642 | 78 | 0.3443 | 0.6533 | 0.3443 |
No log | 1.1940 | 80 | 0.3495 | 0.5565 | 0.3495 |
No log | 1.2239 | 82 | 0.3811 | 0.5224 | 0.3811 |
No log | 1.2537 | 84 | 0.3542 | 0.5846 | 0.3542 |
No log | 1.2836 | 86 | 0.4255 | 0.7504 | 0.4255 |
No log | 1.3134 | 88 | 0.6892 | 0.7008 | 0.6892 |
No log | 1.3433 | 90 | 0.8083 | 0.6679 | 0.8083 |
No log | 1.3731 | 92 | 0.6980 | 0.7011 | 0.6980 |
No log | 1.4030 | 94 | 0.4612 | 0.7265 | 0.4612 |
No log | 1.4328 | 96 | 0.3490 | 0.5371 | 0.3490 |
No log | 1.4627 | 98 | 0.5146 | 0.3705 | 0.5146 |
No log | 1.4925 | 100 | 0.5892 | 0.3434 | 0.5892 |
No log | 1.5224 | 102 | 0.4912 | 0.3726 | 0.4912 |
No log | 1.5522 | 104 | 0.3555 | 0.5078 | 0.3555 |
No log | 1.5821 | 106 | 0.3535 | 0.6587 | 0.3535 |
No log | 1.6119 | 108 | 0.4582 | 0.7396 | 0.4582 |
No log | 1.6418 | 110 | 0.4804 | 0.7252 | 0.4804 |
No log | 1.6716 | 112 | 0.4264 | 0.7211 | 0.4264 |
No log | 1.7015 | 114 | 0.3477 | 0.6178 | 0.3477 |
No log | 1.7313 | 116 | 0.3480 | 0.5050 | 0.3480 |
No log | 1.7612 | 118 | 0.3673 | 0.4892 | 0.3673 |
No log | 1.7910 | 120 | 0.3600 | 0.4946 | 0.3600 |
No log | 1.8209 | 122 | 0.3388 | 0.5381 | 0.3388 |
No log | 1.8507 | 124 | 0.3300 | 0.6645 | 0.3300 |
No log | 1.8806 | 126 | 0.3572 | 0.7312 | 0.3572 |
No log | 1.9104 | 128 | 0.3822 | 0.7526 | 0.3822 |
No log | 1.9403 | 130 | 0.4252 | 0.7575 | 0.4252 |
No log | 1.9701 | 132 | 0.4264 | 0.7640 | 0.4264 |
No log | 2.0 | 134 | 0.3658 | 0.7410 | 0.3658 |
No log | 2.0299 | 136 | 0.3297 | 0.7094 | 0.3297 |
No log | 2.0597 | 138 | 0.3264 | 0.7102 | 0.3264 |
No log | 2.0896 | 140 | 0.3522 | 0.7487 | 0.3522 |
No log | 2.1194 | 142 | 0.3914 | 0.7552 | 0.3914 |
No log | 2.1493 | 144 | 0.4335 | 0.7680 | 0.4335 |
No log | 2.1791 | 146 | 0.4439 | 0.7586 | 0.4439 |
No log | 2.2090 | 148 | 0.3692 | 0.7491 | 0.3692 |
No log | 2.2388 | 150 | 0.3258 | 0.6501 | 0.3258 |
No log | 2.2687 | 152 | 0.3523 | 0.6000 | 0.3523 |
No log | 2.2985 | 154 | 0.3407 | 0.6159 | 0.3407 |
No log | 2.3284 | 156 | 0.3342 | 0.6202 | 0.3342 |
No log | 2.3582 | 158 | 0.3294 | 0.7100 | 0.3294 |
No log | 2.3881 | 160 | 0.3491 | 0.7501 | 0.3491 |
No log | 2.4179 | 162 | 0.3731 | 0.7590 | 0.3731 |
No log | 2.4478 | 164 | 0.3408 | 0.7523 | 0.3408 |
No log | 2.4776 | 166 | 0.3112 | 0.6943 | 0.3112 |
No log | 2.5075 | 168 | 0.3135 | 0.7184 | 0.3135 |
No log | 2.5373 | 170 | 0.3134 | 0.7015 | 0.3134 |
No log | 2.5672 | 172 | 0.3262 | 0.7419 | 0.3262 |
No log | 2.5970 | 174 | 0.3347 | 0.7282 | 0.3347 |
No log | 2.6269 | 176 | 0.3192 | 0.6942 | 0.3192 |
No log | 2.6567 | 178 | 0.3078 | 0.6348 | 0.3078 |
No log | 2.6866 | 180 | 0.3097 | 0.6600 | 0.3097 |
No log | 2.7164 | 182 | 0.3294 | 0.6841 | 0.3294 |
No log | 2.7463 | 184 | 0.3479 | 0.7260 | 0.3479 |
No log | 2.7761 | 186 | 0.3401 | 0.7118 | 0.3401 |
No log | 2.8060 | 188 | 0.3360 | 0.7117 | 0.3360 |
No log | 2.8358 | 190 | 0.3056 | 0.6730 | 0.3056 |
No log | 2.8657 | 192 | 0.2965 | 0.6508 | 0.2965 |
No log | 2.8955 | 194 | 0.2960 | 0.6731 | 0.2960 |
No log | 2.9254 | 196 | 0.3068 | 0.7254 | 0.3068 |
No log | 2.9552 | 198 | 0.3087 | 0.7405 | 0.3087 |
No log | 2.9851 | 200 | 0.3304 | 0.7504 | 0.3304 |
No log | 3.0149 | 202 | 0.3751 | 0.7815 | 0.3751 |
No log | 3.0448 | 204 | 0.3574 | 0.7730 | 0.3574 |
No log | 3.0746 | 206 | 0.3088 | 0.7355 | 0.3088 |
No log | 3.1045 | 208 | 0.2871 | 0.6991 | 0.2871 |
No log | 3.1343 | 210 | 0.2886 | 0.6702 | 0.2886 |
No log | 3.1642 | 212 | 0.2844 | 0.7040 | 0.2844 |
No log | 3.1940 | 214 | 0.3033 | 0.7327 | 0.3033 |
No log | 3.2239 | 216 | 0.3049 | 0.7362 | 0.3049 |
No log | 3.2537 | 218 | 0.2916 | 0.6614 | 0.2916 |
No log | 3.2836 | 220 | 0.2944 | 0.6488 | 0.2944 |
No log | 3.3134 | 222 | 0.2985 | 0.6593 | 0.2985 |
No log | 3.3433 | 224 | 0.2998 | 0.6534 | 0.2998 |
No log | 3.3731 | 226 | 0.3006 | 0.6674 | 0.3006 |
No log | 3.4030 | 228 | 0.3110 | 0.7123 | 0.3110 |
No log | 3.4328 | 230 | 0.3410 | 0.7500 | 0.3410 |
No log | 3.4627 | 232 | 0.3420 | 0.7495 | 0.3420 |
No log | 3.4925 | 234 | 0.3212 | 0.7474 | 0.3212 |
No log | 3.5224 | 236 | 0.3012 | 0.7157 | 0.3012 |
No log | 3.5522 | 238 | 0.2985 | 0.7168 | 0.2985 |
No log | 3.5821 | 240 | 0.3101 | 0.7502 | 0.3101 |
No log | 3.6119 | 242 | 0.3348 | 0.7567 | 0.3348 |
No log | 3.6418 | 244 | 0.3399 | 0.7589 | 0.3399 |
No log | 3.6716 | 246 | 0.3245 | 0.7547 | 0.3245 |
No log | 3.7015 | 248 | 0.3113 | 0.7515 | 0.3113 |
No log | 3.7313 | 250 | 0.2960 | 0.7236 | 0.2960 |
No log | 3.7612 | 252 | 0.2962 | 0.7293 | 0.2962 |
No log | 3.7910 | 254 | 0.3069 | 0.7442 | 0.3069 |
No log | 3.8209 | 256 | 0.3069 | 0.7433 | 0.3069 |
No log | 3.8507 | 258 | 0.2979 | 0.7457 | 0.2979 |
No log | 3.8806 | 260 | 0.2953 | 0.7366 | 0.2953 |
No log | 3.9104 | 262 | 0.2937 | 0.7168 | 0.2937 |
No log | 3.9403 | 264 | 0.2930 | 0.6800 | 0.2930 |
No log | 3.9701 | 266 | 0.2948 | 0.6522 | 0.2948 |
No log | 4.0 | 268 | 0.2976 | 0.6603 | 0.2976 |
No log | 4.0299 | 270 | 0.3080 | 0.7057 | 0.3080 |
No log | 4.0597 | 272 | 0.3385 | 0.7483 | 0.3385 |
No log | 4.0896 | 274 | 0.3662 | 0.7436 | 0.3662 |
No log | 4.1194 | 276 | 0.3628 | 0.7465 | 0.3628 |
No log | 4.1493 | 278 | 0.3366 | 0.7420 | 0.3366 |
No log | 4.1791 | 280 | 0.3211 | 0.7371 | 0.3211 |
No log | 4.2090 | 282 | 0.3076 | 0.7307 | 0.3076 |
No log | 4.2388 | 284 | 0.3029 | 0.6986 | 0.3029 |
No log | 4.2687 | 286 | 0.3028 | 0.6533 | 0.3028 |
No log | 4.2985 | 288 | 0.3025 | 0.6613 | 0.3025 |
No log | 4.3284 | 290 | 0.3052 | 0.7293 | 0.3052 |
No log | 4.3582 | 292 | 0.3108 | 0.7444 | 0.3108 |
No log | 4.3881 | 294 | 0.3108 | 0.7444 | 0.3108 |
No log | 4.4179 | 296 | 0.3085 | 0.7431 | 0.3085 |
No log | 4.4478 | 298 | 0.3061 | 0.7384 | 0.3061 |
No log | 4.4776 | 300 | 0.3071 | 0.7357 | 0.3071 |
No log | 4.5075 | 302 | 0.3097 | 0.7392 | 0.3097 |
No log | 4.5373 | 304 | 0.3115 | 0.7464 | 0.3115 |
No log | 4.5672 | 306 | 0.3114 | 0.7477 | 0.3114 |
No log | 4.5970 | 308 | 0.3072 | 0.7332 | 0.3072 |
No log | 4.6269 | 310 | 0.3050 | 0.7310 | 0.3050 |
No log | 4.6567 | 312 | 0.3040 | 0.7358 | 0.3040 |
No log | 4.6866 | 314 | 0.3036 | 0.7311 | 0.3036 |
No log | 4.7164 | 316 | 0.3050 | 0.7392 | 0.3050 |
No log | 4.7463 | 318 | 0.3053 | 0.7440 | 0.3053 |
No log | 4.7761 | 320 | 0.3055 | 0.7440 | 0.3055 |
No log | 4.8060 | 322 | 0.3037 | 0.7311 | 0.3037 |
No log | 4.8358 | 324 | 0.3023 | 0.7286 | 0.3023 |
No log | 4.8657 | 326 | 0.3017 | 0.7284 | 0.3017 |
No log | 4.8955 | 328 | 0.3009 | 0.7134 | 0.3009 |
No log | 4.9254 | 330 | 0.3007 | 0.7134 | 0.3007 |
No log | 4.9552 | 332 | 0.3006 | 0.7134 | 0.3006 |
No log | 4.9851 | 334 | 0.3007 | 0.7134 | 0.3007 |
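The checkpoint reported at the top of the card (step ~330) is neither the lowest-loss nor the highest-Qwk row in the log above; which row is "best" depends on the criterion. A small sketch of that comparison, using three rows copied from the table (the full log has ~170 entries):

```python
# (epoch, step, validation_loss, qwk) -- values copied from the table above
rows = [
    (3.1642, 212, 0.2844, 0.7040),  # lowest validation loss in the log
    (3.0149, 202, 0.3751, 0.7815),  # highest Qwk in the log
    (4.9254, 330, 0.3007, 0.7134),  # the checkpoint reported at the top
]

best_by_loss = min(rows, key=lambda r: r[2])
best_by_qwk = max(rows, key=lambda r: r[3])
print(best_by_loss[1])  # 212
print(best_by_qwk[1])   # 202
```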
### Framework versions
- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1