---
license: mit
base_model: microsoft/deberta-v3-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: added_token_resized_model_test_0825
    results: []
---

# added_token_resized_model_test_0825

This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.0004
- Accuracy: 1.0000
- F1: 0.9981
- Precision: 0.9961
- Recall: 1.0000

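The card does not document the task or the classification head, so the snippet below is only a minimal inference sketch. It assumes the checkpoint is published under the repo id `MikeZQZ/model_0912` (taken from the page header) with a sequence-classification head; if the checkpoint actually uses a token-level head or a different repo id, adjust the Auto class and name accordingly.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumptions: repo id "MikeZQZ/model_0912" and a sequence-classification head.
# Swap in AutoModelForTokenClassification if the checkpoint uses a token-level head.
model_id = "MikeZQZ/model_0912"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Example input sentence."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(predicted_class, model.config.id2label.get(predicted_class, predicted_class))
```
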
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `Trainer` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200

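For reproducibility, here is a minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments`. The task head, added tokens, and datasets are not documented in this card, so those parts are placeholders and marked as assumptions.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumption: a sequence-classification head on deberta-v3-base; the card does not
# document the task or dataset, so the model and tokenizer below are illustrative.
tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
model = AutoModelForSequenceClassification.from_pretrained("microsoft/deberta-v3-base")

# The model name ("added_token_resized_...") suggests extra tokens were added and the
# embedding matrix resized before training, e.g.:
# tokenizer.add_tokens(["<new_token>"])          # hypothetical added token
# model.resize_token_embeddings(len(tokenizer))

training_args = TrainingArguments(
    output_dir="added_token_resized_model_test_0825",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=200,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption, consistent with the per-epoch results table
)

# A Trainer would then be built with the (undocumented) train/eval datasets:
# trainer = Trainer(model=model, args=training_args, tokenizer=tokenizer,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```
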
### Training results

Training Loss Epoch Step Validation Loss Accuracy F1 Precision Recall
No log 1.0 123 0.0578 0.9938 0.0 0.0 0.0
No log 2.0 246 0.0371 0.9938 0.0 0.0 0.0
No log 3.0 369 0.0358 0.9938 0.0 0.0 0.0
No log 4.0 492 0.0339 0.9938 0.0 0.0 0.0
0.0968 5.0 615 0.0329 0.9938 0.0 0.0 0.0
0.0968 6.0 738 0.0293 0.9938 0.0 0.0 0.0
0.0968 7.0 861 0.0257 0.9936 0.0223 0.2727 0.0116
0.0968 8.0 984 0.0243 0.9938 0.0 0.0 0.0
0.0299 9.0 1107 0.0239 0.9938 0.0 0.0 0.0
0.0299 10.0 1230 0.0249 0.9934 0.3089 0.4453 0.2364
0.0299 11.0 1353 0.0226 0.9938 0.0 0.0 0.0
0.0299 12.0 1476 0.0236 0.9938 0.1623 0.5 0.0969
0.0249 13.0 1599 0.0213 0.9935 0.0 0.0 0.0
0.0249 14.0 1722 0.0203 0.9939 0.1759 0.5510 0.1047
0.0249 15.0 1845 0.0190 0.9939 0.2025 0.5517 0.1240
0.0249 16.0 1968 0.0192 0.9938 0.0 0.0 0.0
0.0218 17.0 2091 0.0177 0.9942 0.1495 0.9130 0.0814
0.0218 18.0 2214 0.0178 0.9950 0.4607 0.7097 0.3411
0.0218 19.0 2337 0.0168 0.9948 0.4216 0.6964 0.3023
0.0218 20.0 2460 0.0171 0.9947 0.3323 0.8060 0.2093
0.0184 21.0 2583 0.0151 0.9953 0.4670 0.8019 0.3295
0.0184 22.0 2706 0.0144 0.9955 0.5131 0.7903 0.3798
0.0184 23.0 2829 0.0140 0.9955 0.5507 0.7308 0.4419
0.0184 24.0 2952 0.0144 0.9954 0.4863 0.8241 0.3450
0.016 25.0 3075 0.0131 0.9958 0.5940 0.7399 0.4961
0.016 26.0 3198 0.0133 0.9960 0.5951 0.8026 0.4729
0.016 27.0 3321 0.0134 0.9959 0.5947 0.7799 0.4806
0.016 28.0 3444 0.0124 0.9961 0.6165 0.7844 0.5078
0.0141 29.0 3567 0.0118 0.9961 0.6279 0.7849 0.5233
0.0141 30.0 3690 0.0114 0.9962 0.6165 0.8247 0.4922
0.0141 31.0 3813 0.0114 0.9960 0.6404 0.7374 0.5659
0.0141 32.0 3936 0.0116 0.9958 0.5865 0.7722 0.4729
0.013 33.0 4059 0.0105 0.9966 0.6847 0.8172 0.5891
0.013 34.0 4182 0.0099 0.9968 0.6968 0.8370 0.5969
0.013 35.0 4305 0.0109 0.9960 0.6224 0.7598 0.5271
0.013 36.0 4428 0.0094 0.9967 0.7054 0.7923 0.6357
0.0107 37.0 4551 0.0091 0.9967 0.6953 0.8324 0.5969
0.0107 38.0 4674 0.0090 0.9968 0.7061 0.8131 0.6240
0.0107 39.0 4797 0.0090 0.9968 0.7158 0.8128 0.6395
0.0107 40.0 4920 0.0086 0.9969 0.7205 0.825 0.6395
0.0104 41.0 5043 0.0081 0.9972 0.7417 0.8615 0.6512
0.0104 42.0 5166 0.0080 0.9972 0.7619 0.8178 0.7132
0.0104 43.0 5289 0.0083 0.9971 0.7463 0.8294 0.6783
0.0104 44.0 5412 0.0073 0.9975 0.7794 0.8708 0.7054
0.0087 45.0 5535 0.0072 0.9976 0.7851 0.9040 0.6938
0.0087 46.0 5658 0.0070 0.9977 0.8025 0.8761 0.7403
0.0087 47.0 5781 0.0093 0.9965 0.6883 0.7794 0.6163
0.0087 48.0 5904 0.0087 0.9967 0.6852 0.8506 0.5736
0.0099 49.0 6027 0.0081 0.9969 0.7351 0.7817 0.6938
0.0099 50.0 6150 0.0080 0.9969 0.7078 0.8611 0.6008
0.0099 51.0 6273 0.0075 0.9971 0.7424 0.85 0.6589
0.0099 52.0 6396 0.0076 0.9970 0.7389 0.8169 0.6744
0.0087 53.0 6519 0.0087 0.9969 0.7156 0.8385 0.6240
0.0087 54.0 6642 0.0092 0.9963 0.6308 0.8543 0.5
0.0087 55.0 6765 0.0082 0.9966 0.6887 0.8 0.6047
0.0087 56.0 6888 0.0078 0.9967 0.6882 0.8514 0.5775
0.0089 57.0 7011 0.0077 0.9970 0.7325 0.8434 0.6473
0.0089 58.0 7134 0.0077 0.9969 0.7221 0.8291 0.6395
0.0089 59.0 7257 0.0075 0.9971 0.7375 0.8374 0.6589
0.0089 60.0 7380 0.0074 0.9973 0.7559 0.8537 0.6783
0.0074 61.0 7503 0.0083 0.9967 0.6822 0.8588 0.5659
0.0074 62.0 7626 0.0079 0.9971 0.7341 0.8477 0.6473
0.0074 63.0 7749 0.0075 0.9970 0.7089 0.8988 0.5853
0.0074 64.0 7872 0.0078 0.9970 0.7156 0.8764 0.6047
0.0074 65.0 7995 0.0075 0.9972 0.7473 0.8629 0.6589
0.0077 66.0 8118 0.0070 0.9972 0.7516 0.8488 0.6744
0.0077 67.0 8241 0.0070 0.9973 0.7603 0.8585 0.6822
0.0077 68.0 8364 0.0066 0.9974 0.7662 0.8676 0.6860
0.0077 69.0 8487 0.0070 0.9972 0.7421 0.8913 0.6357
0.0065 70.0 8610 0.0073 0.9971 0.7170 0.9157 0.5891
0.0065 71.0 8733 0.0068 0.9972 0.7361 0.9138 0.6163
0.0065 72.0 8856 0.0069 0.9974 0.7682 0.8606 0.6938
0.0065 73.0 8979 0.0070 0.9974 0.7735 0.8619 0.7016
0.0062 74.0 9102 0.0070 0.9974 0.7735 0.8619 0.7016
0.0062 75.0 9225 0.0064 0.9974 0.7745 0.8585 0.7054
0.0062 76.0 9348 0.0067 0.9973 0.7588 0.8737 0.6705
0.0062 77.0 9471 0.0068 0.9974 0.7483 0.9257 0.6279
0.0056 78.0 9594 0.0074 0.9975 0.7689 0.9010 0.6705
0.0056 79.0 9717 0.0063 0.9975 0.7839 0.8645 0.7171
0.0056 80.0 9840 0.0061 0.9975 0.7664 0.9235 0.6550
0.0056 81.0 9963 0.0060 0.9976 0.7863 0.8762 0.7132
0.0058 82.0 10086 0.0060 0.9976 0.7889 0.8768 0.7171
0.0058 83.0 10209 0.0067 0.9975 0.7664 0.9235 0.6550
0.0058 84.0 10332 0.0063 0.9976 0.7898 0.8732 0.7209
0.0058 85.0 10455 0.0077 0.9971 0.7136 0.9545 0.5698
0.005 86.0 10578 0.0090 0.9971 0.7076 0.9664 0.5581
0.005 87.0 10701 0.0113 0.9972 0.7246 0.9615 0.5814
0.005 88.0 10824 0.0089 0.9973 0.7316 0.9448 0.5969
0.005 89.0 10947 0.0118 0.9972 0.7202 0.9673 0.5736
0.0048 90.0 11070 0.0127 0.9973 0.7338 0.9623 0.5930
0.0048 91.0 11193 0.0099 0.9974 0.7404 0.9747 0.5969
0.0048 92.0 11316 0.0107 0.9974 0.7373 0.9745 0.5930
0.0048 93.0 11439 0.0097 0.9975 0.7588 0.9586 0.6279
0.0046 94.0 11562 0.0098 0.9974 0.7523 0.9213 0.6357
0.0046 95.0 11685 0.0039 0.9986 0.8792 0.9505 0.8178
0.0046 96.0 11808 0.0037 0.9986 0.8824 0.9633 0.8140
0.0046 97.0 11931 0.0032 0.9989 0.9105 0.8951 0.9264
0.0037 98.0 12054 0.0030 0.9988 0.9052 0.9035 0.9070
0.0037 99.0 12177 0.0030 0.9989 0.9127 0.9350 0.8915
0.0037 100.0 12300 0.0030 0.9990 0.9189 0.9154 0.9225
0.0037 101.0 12423 0.0027 0.9991 0.924 0.9545 0.8953
0.0035 102.0 12546 0.0086 0.9984 0.8578 0.9660 0.7713
0.0035 103.0 12669 0.0084 0.9984 0.8650 0.9491 0.7946
0.0035 104.0 12792 0.0084 0.9985 0.8686 0.9579 0.7946
0.0035 105.0 12915 0.0088 0.9983 0.8530 0.9156 0.7984
0.0032 106.0 13038 0.0086 0.9983 0.8468 0.9387 0.7713
0.0032 107.0 13161 0.0025 0.9991 0.9276 0.9368 0.9186
0.0032 108.0 13284 0.0024 0.9991 0.9300 0.9336 0.9264
0.0032 109.0 13407 0.0024 0.9991 0.9261 0.9547 0.8992
0.0031 110.0 13530 0.0024 0.9991 0.9270 0.9438 0.9109
0.0031 111.0 13653 0.0023 0.9993 0.9409 0.956 0.9264
0.0031 112.0 13776 0.0024 0.9991 0.9294 0.9405 0.9186
0.0031 113.0 13899 0.0022 0.9993 0.9414 0.9488 0.9341
0.0027 114.0 14022 0.0021 0.9993 0.9432 0.9526 0.9341
0.0027 115.0 14145 0.0023 0.9991 0.9264 0.9510 0.9031
0.0027 116.0 14268 0.0022 0.9993 0.9425 0.9318 0.9535
0.0027 117.0 14391 0.0022 0.9991 0.9295 0.9139 0.9457
0.0026 118.0 14514 0.0020 0.9992 0.9403 0.9349 0.9457
0.0026 119.0 14637 0.0018 0.9994 0.9496 0.9496 0.9496
0.0026 120.0 14760 0.0018 0.9993 0.9478 0.9459 0.9496
0.0026 121.0 14883 0.0020 0.9993 0.9439 0.9421 0.9457
0.0024 122.0 15006 0.0021 0.9992 0.9391 0.9522 0.9264
0.0024 123.0 15129 0.0017 0.9995 0.9614 0.9577 0.9651
0.0024 124.0 15252 0.0017 0.9995 0.9595 0.9540 0.9651
0.0024 125.0 15375 0.0016 0.9995 0.9609 0.9685 0.9535
0.0024 126.0 15498 0.0018 0.9994 0.9520 0.9430 0.9612
0.0022 127.0 15621 0.0016 0.9995 0.9577 0.9504 0.9651
0.0022 128.0 15744 0.0017 0.9994 0.9557 0.9502 0.9612
0.0022 129.0 15867 0.0015 0.9996 0.9651 0.9651 0.9651
0.0022 130.0 15990 0.0015 0.9996 0.9655 0.9545 0.9767
0.002 131.0 16113 0.0014 0.9996 0.9670 0.9689 0.9651
0.002 132.0 16236 0.0015 0.9995 0.9580 0.9436 0.9729
0.002 133.0 16359 0.0014 0.9996 0.9690 0.9690 0.9690
0.002 134.0 16482 0.0013 0.9996 0.9689 0.9727 0.9651
0.0018 135.0 16605 0.0013 0.9996 0.9654 0.9580 0.9729
0.0018 136.0 16728 0.0013 0.9996 0.9650 0.9688 0.9612
0.0018 137.0 16851 0.0013 0.9996 0.9670 0.9689 0.9651
0.0018 138.0 16974 0.0012 0.9996 0.9691 0.9654 0.9729
0.0016 139.0 17097 0.0012 0.9996 0.9674 0.9582 0.9767
0.0016 140.0 17220 0.0011 0.9997 0.9750 0.9693 0.9806
0.0016 141.0 17343 0.0011 0.9998 0.9826 0.9807 0.9845
0.0016 142.0 17466 0.0011 0.9996 0.9712 0.9620 0.9806
0.0015 143.0 17589 0.0011 0.9997 0.9746 0.9842 0.9651
0.0015 144.0 17712 0.0012 0.9997 0.9752 0.9588 0.9922
0.0015 145.0 17835 0.0010 0.9997 0.9787 0.9768 0.9806
0.0015 146.0 17958 0.0009 0.9998 0.9845 0.9845 0.9845
0.0014 147.0 18081 0.0010 0.9997 0.9747 0.9804 0.9690
0.0014 148.0 18204 0.0009 0.9999 0.9883 0.9922 0.9845
0.0014 149.0 18327 0.0010 0.9997 0.9786 0.9843 0.9729
0.0014 150.0 18450 0.0009 0.9998 0.9827 0.9770 0.9884
0.0012 151.0 18573 0.0009 0.9999 0.9884 0.9884 0.9884
0.0012 152.0 18696 0.0008 0.9998 0.9864 0.9922 0.9806
0.0012 153.0 18819 0.0008 0.9999 0.9903 0.9884 0.9922
0.0012 154.0 18942 0.0008 0.9999 0.9884 0.9846 0.9922
0.0012 155.0 19065 0.0008 0.9999 0.9884 0.9846 0.9922
0.0012 156.0 19188 0.0007 0.9999 0.9885 0.9809 0.9961
0.0012 157.0 19311 0.0007 0.9999 0.9923 0.9885 0.9961
0.0012 158.0 19434 0.0007 0.9999 0.9903 0.9884 0.9922
0.001 159.0 19557 0.0007 0.9999 0.9884 0.9846 0.9922
0.001 160.0 19680 0.0007 0.9999 0.9903 0.9922 0.9884
0.001 161.0 19803 0.0007 0.9999 0.9922 0.9922 0.9922
0.001 162.0 19926 0.0006 0.9999 0.9922 0.9961 0.9884
0.0009 163.0 20049 0.0008 0.9998 0.9806 0.9806 0.9806
0.0009 164.0 20172 0.0006 0.9999 0.9903 0.9884 0.9922
0.0009 165.0 20295 0.0006 0.9999 0.9942 0.9961 0.9922
0.0009 166.0 20418 0.0006 1.0000 0.9961 0.9961 0.9961
0.0008 167.0 20541 0.0006 0.9999 0.9942 0.9923 0.9961
0.0008 168.0 20664 0.0006 0.9999 0.9922 0.9922 0.9922
0.0008 169.0 20787 0.0006 0.9999 0.9942 1.0 0.9884
0.0008 170.0 20910 0.0006 1.0000 0.9981 0.9961 1.0
0.0009 171.0 21033 0.0006 0.9999 0.9942 0.9923 0.9961
0.0009 172.0 21156 0.0005 1.0000 0.9961 0.9961 0.9961
0.0009 173.0 21279 0.0005 1.0000 0.9961 0.9961 0.9961
0.0009 174.0 21402 0.0005 1.0000 0.9961 0.9961 0.9961
0.0007 175.0 21525 0.0005 1.0000 0.9961 0.9961 0.9961
0.0007 176.0 21648 0.0005 1.0000 0.9981 0.9961 1.0
0.0007 177.0 21771 0.0005 1.0000 0.9961 0.9961 0.9961
0.0007 178.0 21894 0.0005 1.0000 0.9981 0.9961 1.0
0.0007 179.0 22017 0.0005 1.0000 0.9961 0.9961 0.9961
0.0007 180.0 22140 0.0005 1.0000 0.9981 0.9961 1.0
0.0007 181.0 22263 0.0005 1.0000 0.9981 0.9961 1.0
0.0007 182.0 22386 0.0004 1.0000 0.9981 0.9961 1.0
0.0007 183.0 22509 0.0004 1.0000 0.9981 0.9961 1.0
0.0007 184.0 22632 0.0004 1.0000 0.9981 0.9961 1.0
0.0007 185.0 22755 0.0005 1.0000 0.9961 0.9923 1.0
0.0007 186.0 22878 0.0004 1.0 1.0 1.0 1.0
0.0006 187.0 23001 0.0004 1.0 1.0 1.0 1.0
0.0006 188.0 23124 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 189.0 23247 0.0004 1.0 1.0 1.0 1.0
0.0006 190.0 23370 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 191.0 23493 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 192.0 23616 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 193.0 23739 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 194.0 23862 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 195.0 23985 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 196.0 24108 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 197.0 24231 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 198.0 24354 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 199.0 24477 0.0004 1.0000 0.9981 0.9961 1.0
0.0006 200.0 24600 0.0004 1.0000 0.9981 0.9961 1.0

### Framework versions

- Transformers 4.39.2
- Pytorch 2.2.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
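
When reproducing these results, the versions above can be checked at runtime; all four packages expose a `__version__` attribute.

```python
# Confirm the installed library versions match those reported above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.39.2
print("PyTorch:", torch.__version__)              # expected 2.2.2+cu121
print("Datasets:", datasets.__version__)          # expected 2.18.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.15.2
```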