---
license: apache-2.0
base_model: google/electra-base-discriminator
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: electra-base-discriminator-finetuned-ner-cadec
    results: []
---

electra-base-discriminator-finetuned-ner-cadec

This model is a fine-tuned version of google/electra-base-discriminator for named-entity recognition. The auto-generated card does not record the dataset, but the model name and label set point to the CADEC corpus. It achieves the following results on the evaluation set:

  • Loss: 0.4887
  • Precision: 0.6872
  • Recall: 0.7339
  • F1: 0.7097
  • Accuracy: 0.9230
  • Adr Precision: 0.6712
  • Adr Recall: 0.7483
  • Adr F1: 0.7076
  • Disease Precision: 0.1667
  • Disease Recall: 0.16
  • Disease F1: 0.1633
  • Drug Precision: 0.9509
  • Drug Recall: 0.9568
  • Drug F1: 0.9538
  • Finding Precision: 0.4074
  • Finding Recall: 0.3188
  • Finding F1: 0.3577
  • Symptom Precision: 0.5455
  • Symptom Recall: 0.6667
  • Symptom F1: 0.6
  • B-adr Precision: 0.7798
  • B-adr Recall: 0.8336
  • B-adr F1: 0.8058
  • B-disease Precision: 0.2174
  • B-disease Recall: 0.2
  • B-disease F1: 0.2083
  • B-drug Precision: 0.9753
  • B-drug Recall: 0.9753
  • B-drug F1: 0.9753
  • B-finding Precision: 0.5238
  • B-finding Recall: 0.3333
  • B-finding F1: 0.4074
  • B-symptom Precision: 0.6071
  • B-symptom Recall: 0.6296
  • B-symptom F1: 0.6182
  • I-adr Precision: 0.6587
  • I-adr Recall: 0.7267
  • I-adr F1: 0.6910
  • I-disease Precision: 0.1765
  • I-disease Recall: 0.15
  • I-disease F1: 0.1622
  • I-drug Precision: 0.9509
  • I-drug Recall: 0.9568
  • I-drug F1: 0.9538
  • I-finding Precision: 0.4565
  • I-finding Recall: 0.4038
  • I-finding F1: 0.4286
  • I-symptom Precision: 0.35
  • I-symptom Recall: 0.5385
  • I-symptom F1: 0.4242
  • Macro Avg F1: 0.5675
  • Weighted Avg F1: 0.7497
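
The Macro Avg F1 above is, per the usual definition, the unweighted mean of the ten per-label (B-/I-) F1 scores. A quick check in plain Python, with the values copied from the list above:

```python
# Per-label F1 scores from the evaluation results above (B-/I- BIO tags).
label_f1 = {
    "B-adr": 0.8058, "B-disease": 0.2083, "B-drug": 0.9753,
    "B-finding": 0.4074, "B-symptom": 0.6182,
    "I-adr": 0.6910, "I-disease": 0.1622, "I-drug": 0.9538,
    "I-finding": 0.4286, "I-symptom": 0.4242,
}

# Macro average: unweighted mean over labels.
macro_f1 = sum(label_f1.values()) / len(label_f1)
print(round(macro_f1, 4))  # 0.5675, matching the reported Macro Avg F1
```

The weighted average (0.7497) instead weights each label's F1 by its support, which is why the frequent adr labels pull it well above the macro figure.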

Model description

google/electra-base-discriminator with a token-classification head, fine-tuned for named-entity recognition using BIO tags over five entity types: ADR (adverse drug reaction), disease, drug, finding and symptom.

Intended uses & limitations

Intended for extracting drug, ADR, disease, finding and symptom mentions from free text. Per-entity performance is very uneven: drug mentions are recognised reliably (F1 ≈ 0.95) and ADR mentions moderately well (F1 ≈ 0.71), while disease (F1 ≈ 0.16) and finding (F1 ≈ 0.36) mentions are not, so downstream use should not rely on the weaker labels. Further details (domain, out-of-distribution behaviour) are not recorded.

Training and evaluation data

Not recorded in the auto-generated card. The model name suggests CADEC (CSIRO Adverse Drug Event Corpus), a collection of patient-authored medication reviews annotated with the five entity types above. The training log (125 steps per epoch at batch size 8) implies roughly 1,000 training examples.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 40
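
With a linear scheduler and, assuming the HF Trainer default of zero warmup steps, the learning rate decays from 2e-05 to 0 over the 5,000 optimizer steps (125 steps/epoch × 40 epochs, per the training log). A minimal sketch of that schedule:

```python
# Linear LR decay, assuming zero warmup steps (the Trainer default).
LEARNING_RATE = 2e-05
STEPS_PER_EPOCH = 125          # from the training log: step 5000 at epoch 40
NUM_EPOCHS = 40
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS  # 5000

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay."""
    return LEARNING_RATE * max(0.0, 1.0 - step / TOTAL_STEPS)

print(lr_at(0))     # 2e-05 at the start
print(lr_at(2500))  # 1e-05 halfway through
print(lr_at(5000))  # 0.0 at the final step
```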

Training results

Training Loss Epoch Step Validation Loss Precision Recall F1 Accuracy Adr Precision Adr Recall Adr F1 Disease Precision Disease Recall Disease F1 Drug Precision Drug Recall Drug F1 Finding Precision Finding Recall Finding F1 Symptom Precision Symptom Recall Symptom F1 B-adr Precision B-adr Recall B-adr F1 B-disease Precision B-disease Recall B-disease F1 B-drug Precision B-drug Recall B-drug F1 B-finding Precision B-finding Recall B-finding F1 B-symptom Precision B-symptom Recall B-symptom F1 I-adr Precision I-adr Recall I-adr F1 I-disease Precision I-disease Recall I-disease F1 I-drug Precision I-drug Recall I-drug F1 I-finding Precision I-finding Recall I-finding F1 I-symptom Precision I-symptom Recall I-symptom F1 Macro Avg F1 Weighted Avg F1
No log 1.0 125 0.3085 0.5456 0.6433 0.5904 0.9018 0.4847 0.6883 0.5689 0.0 0.0 0.0 0.8245 0.9568 0.8857 0.0 0.0 0.0 0.0 0.0 0.0 0.6618 0.8071 0.7273 0.0 0.0 0.0 0.8791 0.9877 0.9302 0.0 0.0 0.0 0.0 0.0 0.0 0.4666 0.6498 0.5431 0.0 0.0 0.0 0.8564 0.9568 0.9038 0.0 0.0 0.0 0.0 0.0 0.0 0.3104 0.6156
No log 2.0 250 0.2624 0.5760 0.6784 0.6230 0.9132 0.5557 0.7317 0.6317 0.0294 0.04 0.0339 0.8851 0.9506 0.9167 0.1190 0.0725 0.0901 0.0 0.0 0.0 0.7120 0.8531 0.7762 0.1429 0.04 0.0625 0.9294 0.9753 0.9518 0.55 0.1667 0.2558 0.0 0.0 0.0 0.5670 0.7105 0.6307 0.1333 0.2 0.16 0.9006 0.9506 0.9249 0.25 0.1346 0.1750 0.0 0.0 0.0 0.3937 0.6840
No log 3.0 375 0.2659 0.6090 0.6931 0.6483 0.9140 0.6231 0.7383 0.6758 0.0189 0.04 0.0256 0.9070 0.9630 0.9341 0.1765 0.1739 0.1752 0.0 0.0 0.0 0.7648 0.8230 0.7928 0.0588 0.04 0.0476 0.9357 0.9877 0.9610 0.4340 0.3485 0.3866 0.0 0.0 0.0 0.6021 0.7105 0.6518 0.1522 0.35 0.2121 0.9123 0.9630 0.9369 0.4138 0.2308 0.2963 0.0 0.0 0.0 0.4285 0.7086
0.2971 4.0 500 0.2659 0.6313 0.6999 0.6638 0.9200 0.6259 0.7333 0.6754 0.0286 0.04 0.0333 0.9277 0.9506 0.9390 0.2687 0.2609 0.2647 0.625 0.1852 0.2857 0.7625 0.8354 0.7973 0.0769 0.04 0.0526 0.9691 0.9691 0.9691 0.4706 0.3636 0.4103 0.875 0.2593 0.4000 0.6173 0.6923 0.6527 0.1613 0.25 0.1961 0.9333 0.9506 0.9419 0.3721 0.3077 0.3368 0.0 0.0 0.0 0.4757 0.7208
0.2971 5.0 625 0.2827 0.6124 0.7157 0.6601 0.9156 0.5915 0.765 0.6672 0.0690 0.08 0.0741 0.9281 0.9568 0.9422 0.1957 0.1304 0.1565 0.5 0.2593 0.3415 0.7261 0.8726 0.7926 0.2308 0.12 0.1579 0.9578 0.9815 0.9695 0.5143 0.2727 0.3564 0.8333 0.3704 0.5128 0.5910 0.7429 0.6583 0.16 0.2 0.1778 0.9337 0.9568 0.9451 0.3077 0.1538 0.2051 0.0 0.0 0.0 0.4776 0.7180
0.2971 6.0 750 0.2900 0.6347 0.6965 0.6641 0.9196 0.6116 0.7217 0.6621 0.1190 0.2 0.1493 0.9394 0.9568 0.9480 0.3235 0.1594 0.2136 0.55 0.4074 0.4681 0.7581 0.8319 0.7932 0.2333 0.28 0.2545 0.9695 0.9815 0.9755 0.4783 0.1667 0.2472 0.8 0.4444 0.5714 0.6007 0.7065 0.6493 0.125 0.2 0.1538 0.9451 0.9568 0.9509 0.4074 0.2115 0.2785 0.4 0.3077 0.3478 0.5222 0.7195
0.2971 7.0 875 0.2994 0.6300 0.7078 0.6667 0.9193 0.6148 0.7317 0.6682 0.1389 0.2 0.1639 0.9398 0.9630 0.9512 0.2745 0.2029 0.2333 0.44 0.4074 0.4231 0.7465 0.8602 0.7993 0.2 0.24 0.2182 0.9639 0.9877 0.9756 0.4211 0.2424 0.3077 0.7 0.5185 0.5957 0.6237 0.7045 0.6616 0.1379 0.2 0.1633 0.9398 0.9630 0.9512 0.3256 0.2692 0.2947 0.3077 0.3077 0.3077 0.5275 0.7283
0.0963 8.0 1000 0.3197 0.6426 0.7067 0.6731 0.9190 0.6237 0.7267 0.6713 0.0952 0.08 0.0870 0.9390 0.9506 0.9448 0.3175 0.2899 0.3030 0.5 0.4444 0.4706 0.7568 0.8319 0.7926 0.1053 0.08 0.0909 0.9755 0.9815 0.9785 0.4167 0.3030 0.3509 0.5652 0.4815 0.52 0.6151 0.7085 0.6585 0.0714 0.05 0.0588 0.9448 0.9506 0.9477 0.3929 0.4231 0.4074 0.3846 0.3846 0.3846 0.5190 0.7263
0.0963 9.0 1125 0.3281 0.6512 0.6976 0.6736 0.9216 0.6354 0.7117 0.6714 0.1667 0.24 0.1967 0.9277 0.9506 0.9390 0.3019 0.2319 0.2623 0.6842 0.4815 0.5652 0.7780 0.8248 0.8007 0.2258 0.28 0.25 0.9636 0.9815 0.9725 0.4857 0.2576 0.3366 0.8125 0.4815 0.6047 0.6386 0.6903 0.6634 0.12 0.15 0.1333 0.9333 0.9506 0.9419 0.3542 0.3269 0.34 0.5556 0.3846 0.4545 0.5498 0.7322
0.0963 10.0 1250 0.3405 0.6563 0.7180 0.6858 0.9222 0.6241 0.7417 0.6778 0.0952 0.08 0.0870 0.9563 0.9444 0.9503 0.4043 0.2754 0.3276 0.6 0.5556 0.5769 0.7504 0.8513 0.7977 0.2222 0.16 0.1860 0.9811 0.9630 0.9720 0.5263 0.3030 0.3846 0.6087 0.5185 0.5600 0.6394 0.7287 0.6812 0.0667 0.05 0.0571 0.9563 0.9444 0.9503 0.45 0.3462 0.3913 0.4615 0.4615 0.4615 0.5442 0.7385
0.0963 11.0 1375 0.3641 0.6731 0.7112 0.6916 0.9184 0.6439 0.7233 0.6813 0.2083 0.2 0.2041 0.9509 0.9568 0.9538 0.4524 0.2754 0.3423 0.5 0.5556 0.5263 0.7732 0.8265 0.7990 0.2609 0.24 0.2500 0.9753 0.9753 0.9753 0.5714 0.3030 0.3960 0.52 0.4815 0.5 0.6360 0.7146 0.6730 0.1667 0.15 0.1579 0.9568 0.9568 0.9568 0.4857 0.3269 0.3908 0.3333 0.4615 0.3871 0.5486 0.7385
0.0455 12.0 1500 0.3604 0.6567 0.6931 0.6744 0.9163 0.6387 0.6983 0.6672 0.2222 0.24 0.2308 0.9118 0.9568 0.9337 0.3273 0.2609 0.2903 0.5833 0.5185 0.5490 0.7770 0.8142 0.7952 0.24 0.24 0.24 0.9467 0.9877 0.9668 0.4762 0.3030 0.3704 0.7143 0.5556 0.6250 0.6338 0.6761 0.6543 0.15 0.15 0.15 0.9226 0.9568 0.9394 0.3333 0.3077 0.32 0.3846 0.3846 0.3846 0.5446 0.7271
0.0455 13.0 1625 0.3913 0.6685 0.7033 0.6854 0.9226 0.6358 0.7217 0.6760 0.1429 0.08 0.1026 0.9451 0.9568 0.9509 0.3913 0.2609 0.3130 0.5417 0.4815 0.5098 0.7820 0.8319 0.8062 0.1429 0.08 0.1026 0.9755 0.9815 0.9785 0.4737 0.2727 0.3462 0.6190 0.4815 0.5417 0.6282 0.6943 0.6596 0.1 0.05 0.0667 0.9509 0.9568 0.9538 0.4146 0.3269 0.3656 0.375 0.4615 0.4138 0.5235 0.7315
0.0455 14.0 1750 0.3691 0.6739 0.7022 0.6877 0.9216 0.6509 0.7117 0.6799 0.2143 0.24 0.2264 0.9509 0.9568 0.9538 0.3636 0.2899 0.3226 0.6667 0.4444 0.5333 0.7752 0.8177 0.7959 0.28 0.28 0.28 0.9877 0.9877 0.9877 0.4889 0.3333 0.3964 0.8235 0.5185 0.6364 0.6467 0.7004 0.6725 0.1429 0.15 0.1463 0.9509 0.9568 0.9538 0.3529 0.3462 0.3495 0.625 0.3846 0.4762 0.5695 0.7403
0.0455 15.0 1875 0.3913 0.6597 0.7180 0.6876 0.9197 0.6457 0.735 0.6875 0.1667 0.16 0.1633 0.9226 0.9568 0.9394 0.3273 0.2609 0.2903 0.5161 0.5926 0.5517 0.7720 0.8389 0.8041 0.2273 0.2 0.2128 0.9639 0.9877 0.9756 0.4565 0.3182 0.375 0.6 0.5556 0.5769 0.6395 0.7146 0.6750 0.1111 0.1 0.1053 0.9281 0.9568 0.9422 0.3478 0.3077 0.3265 0.2857 0.4615 0.3529 0.5346 0.7363
0.0227 16.0 2000 0.3938 0.6649 0.7123 0.6878 0.9195 0.6454 0.725 0.6829 0.2333 0.28 0.2545 0.9563 0.9444 0.9503 0.3276 0.2754 0.2992 0.625 0.5556 0.5882 0.7763 0.8230 0.7990 0.2857 0.32 0.3019 0.9874 0.9691 0.9782 0.4375 0.3182 0.3684 0.6667 0.5185 0.5833 0.6304 0.7045 0.6654 0.1905 0.2 0.1951 0.9563 0.9444 0.9503 0.3462 0.3462 0.3462 0.3846 0.3846 0.3846 0.5572 0.7358
0.0227 17.0 2125 0.4021 0.6727 0.7169 0.6941 0.9216 0.6652 0.735 0.6983 0.2083 0.2 0.2041 0.9387 0.9444 0.9415 0.3333 0.2609 0.2927 0.4324 0.5926 0.5 0.7781 0.8319 0.8041 0.2609 0.24 0.2500 0.9752 0.9691 0.9721 0.4773 0.3182 0.3818 0.5517 0.5926 0.5714 0.6574 0.7186 0.6867 0.1875 0.15 0.1667 0.9503 0.9444 0.9474 0.375 0.3462 0.3600 0.2917 0.5385 0.3784 0.5519 0.7430
0.0227 18.0 2250 0.4107 0.6917 0.7089 0.7002 0.9227 0.6692 0.7217 0.6945 0.1905 0.16 0.1739 0.9451 0.9568 0.9509 0.4286 0.3043 0.3559 0.5417 0.4815 0.5098 0.7838 0.8212 0.8021 0.25 0.2 0.2222 0.9695 0.9815 0.9755 0.5 0.3333 0.4 0.5909 0.4815 0.5306 0.6567 0.7085 0.6816 0.1333 0.1 0.1143 0.9451 0.9568 0.9509 0.4419 0.3654 0.4 0.3333 0.3846 0.3571 0.5434 0.7415
0.0227 19.0 2375 0.4206 0.6551 0.7033 0.6783 0.9199 0.6205 0.7167 0.6651 0.1429 0.08 0.1026 0.9341 0.9630 0.9483 0.4151 0.3188 0.3607 0.5238 0.4074 0.4583 0.7585 0.8336 0.7943 0.1538 0.08 0.1053 0.9581 0.9877 0.9726 0.4583 0.3333 0.3860 0.6190 0.4815 0.5417 0.6191 0.6943 0.6546 0.1 0.05 0.0667 0.9398 0.9630 0.9512 0.4651 0.3846 0.4211 0.3846 0.3846 0.3846 0.5278 0.7281
0.0128 20.0 2500 0.4455 0.6458 0.7248 0.6830 0.9171 0.6080 0.7367 0.6662 0.1176 0.08 0.0952 0.9515 0.9691 0.9602 0.4211 0.3478 0.3810 0.6 0.5556 0.5769 0.7393 0.8531 0.7921 0.2 0.12 0.15 0.9697 0.9877 0.9786 0.5208 0.3788 0.4386 0.6364 0.5185 0.5714 0.6079 0.7186 0.6586 0.0909 0.05 0.0645 0.9515 0.9691 0.9602 0.4681 0.4231 0.4444 0.4 0.4615 0.4286 0.5487 0.7346
0.0128 21.0 2625 0.4208 0.6798 0.7214 0.7 0.9210 0.6587 0.7333 0.6940 0.2273 0.2 0.2128 0.9398 0.9630 0.9512 0.3929 0.3188 0.3520 0.56 0.5185 0.5385 0.7725 0.8354 0.8027 0.2727 0.24 0.2553 0.9636 0.9815 0.9725 0.4667 0.3182 0.3784 0.6087 0.5185 0.5600 0.6501 0.7146 0.6808 0.1538 0.1 0.1212 0.9398 0.9630 0.9512 0.4167 0.3846 0.4 0.3846 0.3846 0.3846 0.5507 0.7416
0.0128 22.0 2750 0.4410 0.6531 0.7248 0.6871 0.9182 0.6298 0.74 0.6805 0.1923 0.2 0.1961 0.9398 0.9630 0.9512 0.3636 0.2899 0.3226 0.5357 0.5556 0.5455 0.7536 0.8442 0.7963 0.25 0.24 0.2449 0.9695 0.9815 0.9755 0.5 0.3182 0.3889 0.6 0.5556 0.5769 0.6277 0.7166 0.6692 0.1579 0.15 0.1538 0.9398 0.9630 0.9512 0.3913 0.3462 0.3673 0.3333 0.3846 0.3571 0.5481 0.7357
0.0128 23.0 2875 0.4560 0.6790 0.7282 0.7027 0.9213 0.6544 0.7417 0.6953 0.2105 0.16 0.1818 0.9398 0.9630 0.9512 0.4 0.3188 0.3548 0.5926 0.5926 0.5926 0.7622 0.8283 0.7939 0.3333 0.24 0.2791 0.9636 0.9815 0.9725 0.5 0.3485 0.4107 0.6667 0.5926 0.6275 0.6461 0.7206 0.6813 0.1 0.05 0.0667 0.9398 0.9630 0.9512 0.4348 0.3846 0.4082 0.3571 0.3846 0.3704 0.5561 0.7410
0.0082 24.0 3000 0.4642 0.6646 0.7203 0.6913 0.9202 0.6421 0.7267 0.6818 0.2414 0.28 0.2593 0.9341 0.9630 0.9483 0.42 0.3043 0.3529 0.5 0.5926 0.5424 0.7651 0.8301 0.7963 0.2593 0.28 0.2692 0.9636 0.9815 0.9725 0.5238 0.3333 0.4074 0.6296 0.6296 0.6296 0.6309 0.7024 0.6648 0.1818 0.2 0.1905 0.9341 0.9630 0.9483 0.4390 0.3462 0.3871 0.2778 0.3846 0.3226 0.5588 0.7366
0.0082 25.0 3125 0.4606 0.6712 0.7214 0.6954 0.9187 0.6485 0.735 0.6891 0.1852 0.2 0.1923 0.9568 0.9568 0.9568 0.3704 0.2899 0.3252 0.6154 0.5926 0.6038 0.7687 0.8177 0.7925 0.24 0.24 0.24 0.9753 0.9753 0.9753 0.5116 0.3333 0.4037 0.625 0.5556 0.5882 0.6263 0.7126 0.6667 0.1579 0.15 0.1538 0.9568 0.9568 0.9568 0.3864 0.3269 0.3542 0.3846 0.3846 0.3846 0.5516 0.7346
0.0082 26.0 3250 0.4470 0.6767 0.7373 0.7057 0.9206 0.6556 0.755 0.7018 0.1875 0.12 0.1463 0.9341 0.9630 0.9483 0.3934 0.3478 0.3692 0.5556 0.5556 0.5556 0.7565 0.8301 0.7916 0.2 0.12 0.15 0.9639 0.9877 0.9756 0.4808 0.3788 0.4237 0.6 0.5556 0.5769 0.6430 0.7328 0.6850 0.1 0.05 0.0667 0.9341 0.9630 0.9483 0.4118 0.4038 0.4078 0.3846 0.3846 0.3846 0.5410 0.7390
0.0082 27.0 3375 0.4674 0.6727 0.7123 0.6920 0.9215 0.6473 0.725 0.6840 0.2222 0.24 0.2308 0.9394 0.9568 0.9480 0.3913 0.2609 0.3130 0.6 0.5556 0.5769 0.7655 0.8319 0.7973 0.28 0.28 0.28 0.9695 0.9815 0.9755 0.5143 0.2727 0.3564 0.6522 0.5556 0.6 0.6496 0.6943 0.6712 0.2 0.2 0.2000 0.9451 0.9568 0.9509 0.4 0.3077 0.3478 0.5 0.4615 0.4800 0.5659 0.7372
0.0062 28.0 3500 0.4693 0.6736 0.7361 0.7035 0.9203 0.6455 0.7617 0.6988 0.1818 0.16 0.1702 0.9509 0.9568 0.9538 0.4167 0.2899 0.3419 0.5833 0.5185 0.5490 0.7564 0.8407 0.7963 0.1905 0.16 0.1739 0.9755 0.9815 0.9785 0.4634 0.2879 0.3551 0.625 0.5556 0.5882 0.6354 0.7409 0.6841 0.125 0.1 0.1111 0.9509 0.9568 0.9538 0.4524 0.3654 0.4043 0.3846 0.3846 0.3846 0.5430 0.7395
0.0062 29.0 3625 0.4681 0.6905 0.7327 0.7110 0.9225 0.6682 0.7483 0.7060 0.1739 0.16 0.1667 0.9455 0.9630 0.9541 0.4630 0.3623 0.4065 0.5652 0.4815 0.52 0.7648 0.8230 0.7928 0.1905 0.16 0.1739 0.9695 0.9815 0.9755 0.5455 0.3636 0.4364 0.6087 0.5185 0.5600 0.6538 0.7186 0.6847 0.1176 0.1 0.1081 0.9455 0.9630 0.9541 0.4681 0.4231 0.4444 0.4286 0.4615 0.4444 0.5574 0.7428
0.0062 30.0 3750 0.4771 0.6765 0.7248 0.6998 0.9201 0.6562 0.735 0.6934 0.2273 0.2 0.2128 0.9451 0.9568 0.9509 0.3729 0.3188 0.3438 0.5862 0.6296 0.6071 0.7686 0.8230 0.7949 0.25 0.2 0.2222 0.9695 0.9815 0.9755 0.4583 0.3333 0.3860 0.6667 0.6667 0.6667 0.6383 0.7146 0.6743 0.1333 0.1 0.1143 0.9451 0.9568 0.9509 0.3922 0.3846 0.3883 0.375 0.4615 0.4138 0.5587 0.7384
0.0062 31.0 3875 0.4789 0.6874 0.7271 0.7067 0.9213 0.6702 0.745 0.7056 0.16 0.16 0.16 0.9341 0.9630 0.9483 0.4694 0.3333 0.3898 0.4615 0.4444 0.4528 0.7673 0.8230 0.7942 0.2273 0.2 0.2128 0.9639 0.9877 0.9756 0.5476 0.3485 0.4259 0.5833 0.5185 0.5490 0.6544 0.7206 0.6859 0.1667 0.15 0.1579 0.9341 0.9630 0.9483 0.4651 0.3846 0.4211 0.4 0.4615 0.4286 0.5599 0.7428
0.0043 32.0 4000 0.4845 0.6969 0.7316 0.7138 0.9225 0.6732 0.7417 0.7058 0.1923 0.2 0.1961 0.9571 0.9630 0.9600 0.4898 0.3478 0.4068 0.5714 0.5926 0.5818 0.7819 0.8248 0.8028 0.25 0.24 0.2449 0.9755 0.9815 0.9785 0.5349 0.3485 0.4220 0.6154 0.5926 0.6038 0.6568 0.7166 0.6854 0.1667 0.15 0.1579 0.9571 0.9630 0.9600 0.5116 0.4231 0.4632 0.4 0.4615 0.4286 0.5747 0.7498
0.0043 33.0 4125 0.4683 0.6809 0.7248 0.7021 0.9235 0.6577 0.7367 0.6950 0.15 0.12 0.1333 0.9394 0.9568 0.9480 0.4211 0.3478 0.3810 0.6154 0.5926 0.6038 0.7829 0.8425 0.8116 0.1579 0.12 0.1364 0.9695 0.9815 0.9755 0.5217 0.3636 0.4286 0.64 0.5926 0.6154 0.6487 0.7065 0.6764 0.1333 0.1 0.1143 0.9451 0.9568 0.9509 0.4490 0.4231 0.4356 0.5 0.5385 0.5185 0.5663 0.7470
0.0043 34.0 4250 0.4860 0.6896 0.7373 0.7126 0.9213 0.6657 0.7533 0.7068 0.15 0.12 0.1333 0.9337 0.9568 0.9451 0.4815 0.3768 0.4228 0.6 0.5556 0.5769 0.7634 0.8336 0.7970 0.1579 0.12 0.1364 0.9634 0.9753 0.9693 0.5208 0.3788 0.4386 0.64 0.5926 0.6154 0.6468 0.7267 0.6845 0.1875 0.15 0.1667 0.9337 0.9568 0.9451 0.5111 0.4423 0.4742 0.4167 0.3846 0.4 0.5627 0.7445
0.0043 35.0 4375 0.4857 0.6970 0.7373 0.7166 0.9220 0.6736 0.7533 0.7113 0.2 0.16 0.1778 0.9509 0.9568 0.9538 0.4808 0.3623 0.4132 0.5357 0.5556 0.5455 0.7761 0.8283 0.8014 0.2 0.16 0.1778 0.9753 0.9753 0.9753 0.5217 0.3636 0.4286 0.5926 0.5926 0.5926 0.6528 0.7308 0.6896 0.1875 0.15 0.1667 0.9509 0.9568 0.9538 0.4889 0.4231 0.4536 0.4 0.4615 0.4286 0.5668 0.7485
0.0037 36.0 4500 0.4816 0.6912 0.7350 0.7124 0.9228 0.6657 0.75 0.7053 0.1905 0.16 0.1739 0.9509 0.9568 0.9538 0.4706 0.3478 0.4000 0.5714 0.5926 0.5818 0.7745 0.8389 0.8054 0.1905 0.16 0.1739 0.9693 0.9753 0.9723 0.5111 0.3485 0.4144 0.6154 0.5926 0.6038 0.6557 0.7287 0.6903 0.1875 0.15 0.1667 0.9509 0.9568 0.9538 0.5 0.4231 0.4583 0.4667 0.5385 0.5 0.5739 0.7502
0.0037 37.0 4625 0.4878 0.6919 0.7350 0.7128 0.9229 0.6731 0.7517 0.7102 0.1667 0.16 0.1633 0.9568 0.9568 0.9568 0.4528 0.3478 0.3934 0.5172 0.5556 0.5357 0.7776 0.8354 0.8055 0.2174 0.2 0.2083 0.9753 0.9753 0.9753 0.5349 0.3485 0.4220 0.5926 0.5926 0.5926 0.6612 0.7308 0.6942 0.1765 0.15 0.1622 0.9568 0.9568 0.9568 0.4783 0.4231 0.4490 0.3889 0.5385 0.4516 0.5717 0.7519
0.0037 38.0 4750 0.4890 0.6876 0.7305 0.7084 0.9235 0.6652 0.745 0.7028 0.1429 0.12 0.1304 0.9568 0.9568 0.9568 0.4444 0.3478 0.3902 0.5517 0.5926 0.5714 0.7759 0.8336 0.8038 0.1579 0.12 0.1364 0.9753 0.9753 0.9753 0.5227 0.3485 0.4182 0.5926 0.5926 0.5926 0.6545 0.7247 0.6878 0.1765 0.15 0.1622 0.9568 0.9568 0.9568 0.4468 0.4038 0.4242 0.4118 0.5385 0.4667 0.5624 0.7473
0.0037 39.0 4875 0.4871 0.6869 0.7305 0.7080 0.9226 0.6722 0.745 0.7067 0.1667 0.16 0.1633 0.9509 0.9568 0.9538 0.4074 0.3188 0.3577 0.5152 0.6296 0.5667 0.7789 0.8354 0.8061 0.2174 0.2 0.2083 0.9693 0.9753 0.9723 0.5238 0.3333 0.4074 0.6071 0.6296 0.6182 0.6593 0.7206 0.6886 0.1765 0.15 0.1622 0.9509 0.9568 0.9538 0.4348 0.3846 0.4082 0.35 0.5385 0.4242 0.5649 0.7481
0.003 40.0 5000 0.4887 0.6872 0.7339 0.7097 0.9230 0.6712 0.7483 0.7076 0.1667 0.16 0.1633 0.9509 0.9568 0.9538 0.4074 0.3188 0.3577 0.5455 0.6667 0.6 0.7798 0.8336 0.8058 0.2174 0.2 0.2083 0.9753 0.9753 0.9753 0.5238 0.3333 0.4074 0.6071 0.6296 0.6182 0.6587 0.7267 0.6910 0.1765 0.15 0.1622 0.9509 0.9568 0.9538 0.4565 0.4038 0.4286 0.35 0.5385 0.4242 0.5675 0.7497
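
The entity-level metrics (e.g. Adr Precision/Recall) are stricter than the token-level B-/I- metrics: a prediction only counts when the whole BIO span matches in both boundaries and type, which is how seqeval-style scoring works. A simplified, self-contained sketch of that span extraction (toy tags, not real model output; malformed I- continuations are ignored here, which real seqeval handles more leniently):

```python
# Extract (start, end, type) entity spans from a BIO tag sequence,
# the way entity-level scoring groups tokens. `end` is exclusive.
def bio_spans(tags):
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):          # sentinel flushes last span
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and tag[2:] != etype
        ):
            if start is not None:
                spans.append((start, i, etype))
            start, etype = ((i, tag[2:]) if tag.startswith("B-")
                            else (None, None))
        # an I- tag continuing the current entity type: nothing to do
    return spans

gold = ["B-drug", "O", "B-adr", "I-adr", "O"]
pred = ["B-drug", "O", "B-adr", "O", "O"]          # adr span truncated
g, p = set(bio_spans(gold)), set(bio_spans(pred))
print(g & p)  # only the drug span matches; the truncated adr span does not
```

This explains why, in the table, entity-level Adr F1 sits below token-level B-adr F1: partially correct spans earn token credit but no entity credit.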

Framework versions

  • Transformers 4.35.2
  • PyTorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0