---
license: apache-2.0
base_model: google-bert/bert-base-cased
tags:
  - generated_from_trainer
model-index:
  - name: bert_baseline_prompt_adherence_task6_fold0
    results: []
---

bert_baseline_prompt_adherence_task6_fold0

This model is a fine-tuned version of google-bert/bert-base-cased; the fine-tuning dataset is not documented in this card. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 0.3214
  • Qwk (quadratic weighted kappa): 0.7303
  • Mse (mean squared error): 0.3214
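
The snippet below is a minimal inference sketch. It assumes the checkpoint is published as salbatarni/bert_baseline_prompt_adherence_task6_fold0 on the Hugging Face Hub and that it uses a single-output regression head (suggested by the MSE/QWK metrics); both are assumptions, not confirmed by this card.

```python
# Hypothetical usage sketch: loading the fine-tuned checkpoint for inference.
# The repo id and the single-label regression head are assumptions; adjust to
# match the actual checkpoint configuration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "salbatarni/bert_baseline_prompt_adherence_task6_fold0"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Example essay response to be scored for prompt adherence."
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# For a regression head (num_labels == 1) the logit is the predicted score.
score = outputs.logits.squeeze().item()
print(f"Predicted prompt-adherence score: {score:.3f}")
```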

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
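
As a rough sketch, the hyperparameters above map onto a transformers TrainingArguments configuration along the following lines. The output directory is an assumption, and the Adam betas/epsilon shown are simply the library defaults, which match the values listed.

```python
# Hedged reconstruction of the training configuration from the listed
# hyperparameters; the output_dir and any unlisted options are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_baseline_prompt_adherence_task6_fold0",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    # Adam betas/epsilon are the library defaults, matching the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```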

Training results

In the table below, "No log" in the Training Loss column means the training loss had not yet been logged at that step; Validation Loss, Qwk, and Mse are computed on the evaluation set at each logged step. A sketch of how Qwk and Mse can be computed follows the table.

Training Loss Epoch Step Validation Loss Qwk Mse
No log 0.0294 2 1.9024 0.0 1.9024
No log 0.0588 4 1.6433 -0.0376 1.6433
No log 0.0882 6 1.4580 -0.0106 1.4580
No log 0.1176 8 1.1646 0.0061 1.1646
No log 0.1471 10 0.9673 0.0061 0.9673
No log 0.1765 12 0.8529 0.0061 0.8529
No log 0.2059 14 0.8166 0.1658 0.8166
No log 0.2353 16 0.8014 0.0430 0.8014
No log 0.2647 18 0.7507 0.2430 0.7507
No log 0.2941 20 0.6900 0.3980 0.6900
No log 0.3235 22 0.6517 0.4566 0.6517
No log 0.3529 24 0.6213 0.4542 0.6213
No log 0.3824 26 0.5886 0.3707 0.5886
No log 0.4118 28 0.5738 0.3586 0.5738
No log 0.4412 30 0.5467 0.4152 0.5467
No log 0.4706 32 0.5428 0.3809 0.5428
No log 0.5 34 0.4846 0.4818 0.4846
No log 0.5294 36 0.4315 0.4939 0.4315
No log 0.5588 38 0.4162 0.5244 0.4162
No log 0.5882 40 0.3789 0.5985 0.3789
No log 0.6176 42 0.3613 0.5956 0.3613
No log 0.6471 44 0.3505 0.5933 0.3505
No log 0.6765 46 0.3581 0.6419 0.3581
No log 0.7059 48 0.3340 0.6165 0.3340
No log 0.7353 50 0.3323 0.6190 0.3323
No log 0.7647 52 0.3927 0.7193 0.3927
No log 0.7941 54 0.4713 0.7380 0.4713
No log 0.8235 56 0.3768 0.6750 0.3768
No log 0.8529 58 0.3531 0.5832 0.3531
No log 0.8824 60 0.4043 0.5390 0.4043
No log 0.9118 62 0.3447 0.5850 0.3447
No log 0.9412 64 0.3618 0.6575 0.3618
No log 0.9706 66 0.3572 0.6539 0.3572
No log 1.0 68 0.3949 0.6765 0.3949
No log 1.0294 70 0.3686 0.6660 0.3686
No log 1.0588 72 0.3326 0.6180 0.3326
No log 1.0882 74 0.3520 0.5823 0.3520
No log 1.1176 76 0.3303 0.6063 0.3303
No log 1.1471 78 0.3475 0.6601 0.3475
No log 1.1765 80 0.3729 0.6752 0.3729
No log 1.2059 82 0.3223 0.6533 0.3223
No log 1.2353 84 0.3665 0.5588 0.3665
No log 1.2647 86 0.3824 0.5485 0.3824
No log 1.2941 88 0.3093 0.6230 0.3093
No log 1.3235 90 0.3160 0.6514 0.3160
No log 1.3529 92 0.3212 0.6588 0.3212
No log 1.3824 94 0.3075 0.6459 0.3075
No log 1.4118 96 0.3146 0.6141 0.3146
No log 1.4412 98 0.3140 0.6051 0.3140
No log 1.4706 100 0.2968 0.6409 0.2968
No log 1.5 102 0.3146 0.6665 0.3146
No log 1.5294 104 0.3225 0.6744 0.3225
No log 1.5588 106 0.2963 0.6660 0.2963
No log 1.5882 108 0.3015 0.6202 0.3015
No log 1.6176 110 0.3128 0.6101 0.3128
No log 1.6471 112 0.2930 0.6835 0.2930
No log 1.6765 114 0.3211 0.7509 0.3211
No log 1.7059 116 0.3024 0.7304 0.3024
No log 1.7353 118 0.2830 0.6659 0.2830
No log 1.7647 120 0.2853 0.6455 0.2853
No log 1.7941 122 0.2959 0.7087 0.2959
No log 1.8235 124 0.3210 0.7212 0.3210
No log 1.8529 126 0.3687 0.7455 0.3687
No log 1.8824 128 0.3281 0.7077 0.3281
No log 1.9118 130 0.2932 0.6235 0.2932
No log 1.9412 132 0.3188 0.5858 0.3188
No log 1.9706 134 0.3395 0.5668 0.3395
No log 2.0 136 0.3031 0.5998 0.3031
No log 2.0294 138 0.2965 0.6165 0.2965
No log 2.0588 140 0.2870 0.6407 0.2870
No log 2.0882 142 0.2971 0.6951 0.2971
No log 2.1176 144 0.3088 0.7183 0.3088
No log 2.1471 146 0.2953 0.6786 0.2953
No log 2.1765 148 0.3026 0.6304 0.3026
No log 2.2059 150 0.2990 0.6499 0.2990
No log 2.2353 152 0.3100 0.6986 0.3100
No log 2.2647 154 0.3029 0.6558 0.3029
No log 2.2941 156 0.3094 0.6451 0.3094
No log 2.3235 158 0.3189 0.6789 0.3189
No log 2.3529 160 0.3296 0.7205 0.3296
No log 2.3824 162 0.3857 0.7668 0.3857
No log 2.4118 164 0.3847 0.7738 0.3847
No log 2.4412 166 0.3288 0.7196 0.3288
No log 2.4706 168 0.3127 0.6857 0.3127
No log 2.5 170 0.3142 0.6321 0.3142
No log 2.5294 172 0.2971 0.6805 0.2971
No log 2.5588 174 0.2919 0.6809 0.2919
No log 2.5882 176 0.2883 0.6832 0.2883
No log 2.6176 178 0.2908 0.7045 0.2908
No log 2.6471 180 0.3183 0.7436 0.3183
No log 2.6765 182 0.3840 0.7915 0.3840
No log 2.7059 184 0.4421 0.8013 0.4421
No log 2.7353 186 0.4693 0.8022 0.4693
No log 2.7647 188 0.3933 0.7893 0.3933
No log 2.7941 190 0.3167 0.7355 0.3167
No log 2.8235 192 0.3053 0.6908 0.3053
No log 2.8529 194 0.3028 0.6759 0.3028
No log 2.8824 196 0.2999 0.6618 0.2999
No log 2.9118 198 0.2966 0.6730 0.2966
No log 2.9412 200 0.3041 0.6986 0.3041
No log 2.9706 202 0.3492 0.7601 0.3492
No log 3.0 204 0.3807 0.7895 0.3807
No log 3.0294 206 0.3448 0.7616 0.3448
No log 3.0588 208 0.2938 0.7110 0.2938
No log 3.0882 210 0.2832 0.6748 0.2832
No log 3.1176 212 0.2984 0.6126 0.2984
No log 3.1471 214 0.3016 0.6126 0.3016
No log 3.1765 216 0.2831 0.6494 0.2831
No log 3.2059 218 0.2895 0.7158 0.2895
No log 3.2353 220 0.3130 0.7480 0.3130
No log 3.2647 222 0.3255 0.7594 0.3255
No log 3.2941 224 0.3160 0.7489 0.3160
No log 3.3235 226 0.3049 0.7209 0.3049
No log 3.3529 228 0.2995 0.7190 0.2995
No log 3.3824 230 0.3001 0.7290 0.3001
No log 3.4118 232 0.3108 0.7388 0.3108
No log 3.4412 234 0.3102 0.7363 0.3102
No log 3.4706 236 0.3025 0.7162 0.3025
No log 3.5 238 0.2998 0.7035 0.2998
No log 3.5294 240 0.3007 0.7119 0.3007
No log 3.5588 242 0.3111 0.7343 0.3111
No log 3.5882 244 0.3146 0.7327 0.3146
No log 3.6176 246 0.2978 0.6769 0.2978
No log 3.6471 248 0.3000 0.6276 0.3000
No log 3.6765 250 0.3051 0.6143 0.3051
No log 3.7059 252 0.3019 0.6351 0.3019
No log 3.7353 254 0.3106 0.7052 0.3106
No log 3.7647 256 0.3545 0.7642 0.3545
No log 3.7941 258 0.3953 0.7867 0.3953
No log 3.8235 260 0.3837 0.7747 0.3837
No log 3.8529 262 0.3480 0.7461 0.3480
No log 3.8824 264 0.3239 0.7146 0.3239
No log 3.9118 266 0.3220 0.6736 0.3220
No log 3.9412 268 0.3211 0.6705 0.3211
No log 3.9706 270 0.3251 0.7223 0.3251
No log 4.0 272 0.3272 0.7300 0.3272
No log 4.0294 274 0.3318 0.7415 0.3318
No log 4.0588 276 0.3398 0.7610 0.3398
No log 4.0882 278 0.3381 0.7626 0.3381
No log 4.1176 280 0.3233 0.7381 0.3233
No log 4.1471 282 0.3073 0.7105 0.3073
No log 4.1765 284 0.3043 0.6954 0.3043
No log 4.2059 286 0.3047 0.6717 0.3047
No log 4.2353 288 0.3047 0.6814 0.3047
No log 4.2647 290 0.3073 0.7041 0.3073
No log 4.2941 292 0.3091 0.7089 0.3091
No log 4.3235 294 0.3070 0.7084 0.3070
No log 4.3529 296 0.3104 0.7163 0.3104
No log 4.3824 298 0.3123 0.7197 0.3123
No log 4.4118 300 0.3140 0.7269 0.3140
No log 4.4412 302 0.3164 0.7296 0.3164
No log 4.4706 304 0.3152 0.7273 0.3152
No log 4.5 306 0.3091 0.7130 0.3091
No log 4.5294 308 0.3050 0.7072 0.3050
No log 4.5588 310 0.3048 0.7058 0.3048
No log 4.5882 312 0.3085 0.7116 0.3085
No log 4.6176 314 0.3132 0.7236 0.3132
No log 4.6471 316 0.3216 0.7357 0.3216
No log 4.6765 318 0.3286 0.7448 0.3286
No log 4.7059 320 0.3381 0.7533 0.3381
No log 4.7353 322 0.3398 0.7547 0.3398
No log 4.7647 324 0.3416 0.7532 0.3416
No log 4.7941 326 0.3410 0.7532 0.3410
No log 4.8235 328 0.3387 0.7493 0.3387
No log 4.8529 330 0.3357 0.7492 0.3357
No log 4.8824 332 0.3319 0.7451 0.3319
No log 4.9118 334 0.3279 0.7365 0.3279
No log 4.9412 336 0.3242 0.7330 0.3242
No log 4.9706 338 0.3222 0.7336 0.3222
No log 5.0 340 0.3214 0.7303 0.3214
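
For reference, the Qwk and Mse columns can be computed from model predictions and gold scores with scikit-learn as in the sketch below. Rounding the regression outputs to integer scores before the kappa computation is an assumption about how the metric was applied, not something documented in this card.

```python
# Hedged sketch of the evaluation metrics reported above, using scikit-learn.
# Rounding predictions to integer scores before computing quadratic weighted
# kappa is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(predictions: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, predictions)
    qwk = cohen_kappa_score(
        labels.round().astype(int),
        predictions.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse}

# Example with dummy scores:
preds = np.array([2.8, 1.1, 3.9, 2.2])
gold = np.array([3.0, 1.0, 4.0, 2.0])
print(compute_metrics(preds, gold))  # {'qwk': 1.0, 'mse': 0.025}
```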

Framework versions

  • Transformers 4.42.3
  • PyTorch 2.1.2
  • Datasets 2.20.0
  • Tokenizers 0.19.1