
bert_baseline_prompt_adherence_task4_fold0

This model is a fine-tuned version of google-bert/bert-base-cased on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.3003
  • Qwk: 0.7143
  • Mse: 0.3003
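
QWK is Cohen's quadratic weighted kappa between predicted and reference scores, and MSE is the mean squared error of the same predictions. Below is a minimal sketch of loading the checkpoint and computing both metrics with scikit-learn; it assumes a single-output regression head and integer reference scores, neither of which this card states, and the evaluation texts and labels are placeholders.

```python
import torch
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repo id as listed in this card; the single-output regression head is an assumption.
repo = "salbatarni/bert_baseline_prompt_adherence_task4_fold0"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

def predict_scores(texts):
    """Return one continuous adherence score per input text."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits
    return logits.squeeze(-1).tolist()

# Placeholder evaluation pairs; substitute the real (unspecified) evaluation split.
texts = ["An essay that addresses the prompt...", "An essay that drifts off topic..."]
gold = [3, 1]

preds = predict_scores(texts)
print("MSE:", mean_squared_error(gold, preds))
print("QWK:", cohen_kappa_score(gold, [round(p) for p in preds], weights="quadratic"))
```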

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent Trainer setup is sketched after the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
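
The listed values map directly onto a Hugging Face Trainer configuration. The sketch below reproduces them under the assumption of a single-output regression head and an every-2-steps evaluation schedule (inferred from the results table; the card states neither), with dataset loading and tokenization omitted.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import (AutoModelForSequenceClassification, Trainer,
                          TrainingArguments)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.squeeze(logits, axis=-1)
    return {
        "qwk": cohen_kappa_score(np.rint(labels).astype(int),
                                 np.rint(preds).astype(int),
                                 weights="quadratic"),
        "mse": mean_squared_error(labels, preds),
    }

model = AutoModelForSequenceClassification.from_pretrained(
    "google-bert/bert-base-cased",
    num_labels=1,                  # regression head (assumption)
    problem_type="regression",
)

args = TrainingArguments(
    output_dir="bert_baseline_prompt_adherence_task4_fold0",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    eval_strategy="steps",         # evaluation cadence inferred from the table below
    eval_steps=2,
    # Trainer's default AdamW uses betas=(0.9, 0.999) and epsilon=1e-08 as listed.
)

# train_ds / eval_ds stand in for the unspecified tokenized splits.
# trainer = Trainer(model=model, args=args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```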

Training results

Training Loss Epoch Step Validation Loss Qwk Mse
No log 0.0299 2 0.7952 0.0 0.7952
No log 0.0597 4 0.7435 0.3575 0.7435
No log 0.0896 6 0.7053 0.3949 0.7053
No log 0.1194 8 0.6544 0.4134 0.6544
No log 0.1493 10 0.5985 0.3962 0.5985
No log 0.1791 12 0.5456 0.3886 0.5456
No log 0.2090 14 0.4624 0.4085 0.4624
No log 0.2388 16 0.4188 0.4256 0.4188
No log 0.2687 18 0.4127 0.4472 0.4127
No log 0.2985 20 0.4097 0.5714 0.4097
No log 0.3284 22 0.4263 0.5304 0.4263
No log 0.3582 24 0.4263 0.6013 0.4263
No log 0.3881 26 0.4291 0.5266 0.4291
No log 0.4179 28 0.4956 0.3821 0.4956
No log 0.4478 30 0.6276 0.2358 0.6276
No log 0.4776 32 0.5191 0.3424 0.5191
No log 0.5075 34 0.3886 0.5796 0.3886
No log 0.5373 36 0.4471 0.6587 0.4471
No log 0.5672 38 0.5216 0.6841 0.5216
No log 0.5970 40 0.4964 0.6996 0.4964
No log 0.6269 42 0.3933 0.6059 0.3933
No log 0.6567 44 0.4005 0.4346 0.4005
No log 0.6866 46 0.4494 0.3723 0.4494
No log 0.7164 48 0.3869 0.4336 0.3869
No log 0.7463 50 0.3519 0.5492 0.3519
No log 0.7761 52 0.4054 0.6456 0.4054
No log 0.8060 54 0.4709 0.7091 0.4709
No log 0.8358 56 0.4444 0.6518 0.4444
No log 0.8657 58 0.3970 0.4904 0.3970
No log 0.8955 60 0.3593 0.5019 0.3593
No log 0.9254 62 0.3568 0.4829 0.3568
No log 0.9552 64 0.3636 0.4542 0.3636
No log 0.9851 66 0.3443 0.5208 0.3443
No log 1.0149 68 0.3360 0.5808 0.3360
No log 1.0448 70 0.3498 0.6348 0.3498
No log 1.0746 72 0.3605 0.6239 0.3605
No log 1.1045 74 0.3729 0.6362 0.3729
No log 1.1343 76 0.3536 0.6271 0.3536
No log 1.1642 78 0.3291 0.6195 0.3291
No log 1.1940 80 0.3167 0.6019 0.3167
No log 1.2239 82 0.3308 0.5436 0.3308
No log 1.2537 84 0.3383 0.5374 0.3383
No log 1.2836 86 0.3108 0.6019 0.3108
No log 1.3134 88 0.3449 0.6889 0.3449
No log 1.3433 90 0.4016 0.7462 0.4016
No log 1.3731 92 0.3671 0.7226 0.3671
No log 1.4030 94 0.3511 0.7169 0.3511
No log 1.4328 96 0.3221 0.6724 0.3221
No log 1.4627 98 0.2964 0.6058 0.2964
No log 1.4925 100 0.2981 0.5979 0.2981
No log 1.5224 102 0.3050 0.6132 0.3050
No log 1.5522 104 0.3045 0.6214 0.3045
No log 1.5821 106 0.3029 0.6459 0.3029
No log 1.6119 108 0.3022 0.6669 0.3022
No log 1.6418 110 0.2957 0.6397 0.2957
No log 1.6716 112 0.2978 0.6661 0.2978
No log 1.7015 114 0.3018 0.6485 0.3018
No log 1.7313 116 0.3019 0.6521 0.3019
No log 1.7612 118 0.3064 0.6531 0.3064
No log 1.7910 120 0.3164 0.6743 0.3164
No log 1.8209 122 0.3017 0.6465 0.3017
No log 1.8507 124 0.2939 0.6006 0.2939
No log 1.8806 126 0.2929 0.5983 0.2929
No log 1.9104 128 0.2938 0.5894 0.2938
No log 1.9403 130 0.3015 0.6450 0.3015
No log 1.9701 132 0.3092 0.6608 0.3092
No log 2.0 134 0.3040 0.6349 0.3040
No log 2.0299 136 0.3100 0.6394 0.3100
No log 2.0597 138 0.3164 0.6438 0.3164
No log 2.0896 140 0.3273 0.6548 0.3273
No log 2.1194 142 0.3369 0.6855 0.3369
No log 2.1493 144 0.3592 0.7197 0.3592
No log 2.1791 146 0.3491 0.7110 0.3491
No log 2.2090 148 0.3040 0.6737 0.3040
No log 2.2388 150 0.2965 0.6074 0.2965
No log 2.2687 152 0.2892 0.6008 0.2892
No log 2.2985 154 0.2845 0.6733 0.2845
No log 2.3284 156 0.3246 0.7050 0.3246
No log 2.3582 158 0.3523 0.7290 0.3523
No log 2.3881 160 0.3397 0.7298 0.3397
No log 2.4179 162 0.2967 0.6931 0.2967
No log 2.4478 164 0.2785 0.6576 0.2785
No log 2.4776 166 0.2796 0.6046 0.2796
No log 2.5075 168 0.2756 0.6642 0.2756
No log 2.5373 170 0.2917 0.6973 0.2917
No log 2.5672 172 0.3035 0.6955 0.3035
No log 2.5970 174 0.2965 0.6872 0.2965
No log 2.6269 176 0.2904 0.6676 0.2904
No log 2.6567 178 0.2918 0.6502 0.2918
No log 2.6866 180 0.3002 0.6610 0.3002
No log 2.7164 182 0.3036 0.6775 0.3036
No log 2.7463 184 0.3151 0.6946 0.3151
No log 2.7761 186 0.3025 0.6801 0.3025
No log 2.8060 188 0.3016 0.6816 0.3016
No log 2.8358 190 0.2948 0.6867 0.2948
No log 2.8657 192 0.2884 0.6840 0.2884
No log 2.8955 194 0.2901 0.6984 0.2901
No log 2.9254 196 0.3161 0.7266 0.3161
No log 2.9552 198 0.3314 0.7278 0.3314
No log 2.9851 200 0.3599 0.7395 0.3599
No log 3.0149 202 0.3500 0.7368 0.3500
No log 3.0448 204 0.3247 0.7195 0.3247
No log 3.0746 206 0.3046 0.7171 0.3046
No log 3.1045 208 0.2992 0.7107 0.2992
No log 3.1343 210 0.2817 0.7049 0.2817
No log 3.1642 212 0.2759 0.6988 0.2759
No log 3.1940 214 0.2821 0.7083 0.2821
No log 3.2239 216 0.2785 0.6974 0.2785
No log 3.2537 218 0.2878 0.7076 0.2878
No log 3.2836 220 0.2989 0.7212 0.2989
No log 3.3134 222 0.3215 0.7382 0.3215
No log 3.3433 224 0.3473 0.7553 0.3473
No log 3.3731 226 0.3441 0.7540 0.3441
No log 3.4030 228 0.3073 0.7378 0.3073
No log 3.4328 230 0.2796 0.7068 0.2796
No log 3.4627 232 0.2775 0.6960 0.2775
No log 3.4925 234 0.2900 0.7158 0.2900
No log 3.5224 236 0.3353 0.7499 0.3353
No log 3.5522 238 0.4133 0.7759 0.4133
No log 3.5821 240 0.4417 0.7765 0.4417
No log 3.6119 242 0.4043 0.7724 0.4043
No log 3.6418 244 0.3376 0.7508 0.3376
No log 3.6716 246 0.3091 0.7164 0.3091
No log 3.7015 248 0.2933 0.6959 0.2933
No log 3.7313 250 0.2880 0.6747 0.2880
No log 3.7612 252 0.2898 0.6758 0.2898
No log 3.7910 254 0.2904 0.6816 0.2904
No log 3.8209 256 0.2909 0.7010 0.2909
No log 3.8507 258 0.2929 0.7023 0.2929
No log 3.8806 260 0.3070 0.7026 0.3070
No log 3.9104 262 0.3139 0.7200 0.3139
No log 3.9403 264 0.3030 0.7028 0.3030
No log 3.9701 266 0.2921 0.7048 0.2921
No log 4.0 268 0.2786 0.6411 0.2786
No log 4.0299 270 0.2752 0.6357 0.2752
No log 4.0597 272 0.2739 0.6296 0.2739
No log 4.0896 274 0.2724 0.6530 0.2724
No log 4.1194 276 0.2777 0.6781 0.2777
No log 4.1493 278 0.2886 0.7044 0.2886
No log 4.1791 280 0.3133 0.7272 0.3133
No log 4.2090 282 0.3412 0.7456 0.3412
No log 4.2388 284 0.3478 0.7527 0.3478
No log 4.2687 286 0.3332 0.7504 0.3332
No log 4.2985 288 0.3087 0.7299 0.3087
No log 4.3284 290 0.2869 0.7150 0.2869
No log 4.3582 292 0.2768 0.6985 0.2768
No log 4.3881 294 0.2740 0.6927 0.2740
No log 4.4179 296 0.2755 0.6952 0.2755
No log 4.4478 298 0.2784 0.6986 0.2784
No log 4.4776 300 0.2836 0.7007 0.2836
No log 4.5075 302 0.2915 0.7084 0.2915
No log 4.5373 304 0.3019 0.7267 0.3019
No log 4.5672 306 0.3137 0.7308 0.3137
No log 4.5970 308 0.3236 0.7425 0.3236
No log 4.6269 310 0.3293 0.7447 0.3293
No log 4.6567 312 0.3275 0.7447 0.3275
No log 4.6866 314 0.3257 0.7426 0.3257
No log 4.7164 316 0.3203 0.7425 0.3203
No log 4.7463 318 0.3120 0.7280 0.3120
No log 4.7761 320 0.3042 0.7199 0.3042
No log 4.8060 322 0.2997 0.7143 0.2997
No log 4.8358 324 0.2988 0.7143 0.2988
No log 4.8657 326 0.2990 0.7143 0.2990
No log 4.8955 328 0.3002 0.7143 0.3002
No log 4.9254 330 0.3006 0.7143 0.3006
No log 4.9552 332 0.3004 0.7143 0.3004
No log 4.9851 334 0.3003 0.7143 0.3003

Framework versions

  • Transformers 4.42.3
  • Pytorch 2.1.2
  • Datasets 2.20.0
  • Tokenizers 0.19.1
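
As a small, optional check before loading the checkpoint, the sketch below (assuming a Python 3.8+ environment) compares the installed packages against the versions listed above:

```python
from importlib.metadata import version

# Versions taken from the list above; differences are not necessarily fatal.
expected = {
    "transformers": "4.42.3",
    "torch": "2.1.2",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}

for package, wanted in expected.items():
    installed = version(package)
    marker = "" if installed == wanted else "  <- differs from this card"
    print(f"{package}: installed {installed}, card lists {wanted}{marker}")
```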
Model size

  • 108M parameters (safetensors, F32)

Model tree for salbatarni/bert_baseline_prompt_adherence_task4_fold0

  • Fine-tuned from google-bert/bert-base-cased