
added_token_resized_model_test

This model is a fine-tuned version of microsoft/deberta-v3-base on an unspecified dataset (recorded as "None" by the auto-generated card). It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.0009
  • Accuracy: 0.9997
  • F1: 0.9747
  • Precision: 0.9747
  • Recall: 0.9747
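
The card does not state the task, but per-token accuracy reported alongside F1/precision/recall is typical of token classification. The sketch below loads the model under that assumption; the head type is a guess, not confirmed by the card.

```python
# Minimal usage sketch. AutoModelForTokenClassification is an assumption:
# the card does not state which task the model was fine-tuned for.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "MikeZQZ/added_token_resized_model_test"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
)
print(tagger("Example input sentence to tag."))
```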

Model description

More information needed
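
The model name suggests that new tokens were added to the base tokenizer and the embedding matrix was resized to match before fine-tuning. A minimal sketch of that pattern follows; the token strings and label count are hypothetical placeholders, since the card does not list them.

```python
# Sketch of the add-tokens-then-resize pattern implied by the model name.
# The new token strings and num_labels are hypothetical, not from this card.
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
model = AutoModelForTokenClassification.from_pretrained(
    "microsoft/deberta-v3-base", num_labels=2
)

new_tokens = ["<CUSTOM_A>", "<CUSTOM_B>"]  # hypothetical placeholders
num_added = tokenizer.add_tokens(new_tokens)

# Grow the embedding matrix so the newly added token ids have embedding rows.
if num_added > 0:
    model.resize_token_embeddings(len(tokenizer))
```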

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
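
A TrainingArguments sketch reproducing the values above; only the listed hyperparameters come from this card, while the output directory and per-epoch evaluation are assumptions (the results table reports one evaluation per epoch). The Adam settings match the transformers default optimizer, so they need no explicit arguments.

```python
# Sketch: TrainingArguments matching the hyperparameters listed above.
# Dataset and model wiring are omitted; output_dir is a placeholder.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="added_token_resized_model_test",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="epoch",  # assumption: the table shows per-epoch eval
)
```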

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall
No log 1.0 123 0.0542 0.9950 0.0 0.0 0.0
No log 2.0 246 0.0312 0.9950 0.0 0.0 0.0
No log 3.0 369 0.0299 0.9950 0.0 0.0 0.0
No log 4.0 492 0.0254 0.9950 0.0 0.0 0.0
0.0917 5.0 615 0.0243 0.9950 0.0 0.0 0.0
0.0917 6.0 738 0.0216 0.9950 0.0 0.0 0.0
0.0917 7.0 861 0.0205 0.9950 0.1772 0.5385 0.1061
0.0917 8.0 984 0.0202 0.9950 0.0 0.0 0.0
0.0233 9.0 1107 0.0203 0.9950 0.0 0.0 0.0
0.0233 10.0 1230 0.0183 0.9950 0.0 0.0 0.0
0.0233 11.0 1353 0.0167 0.9949 0.1532 0.4865 0.0909
0.0233 12.0 1476 0.0155 0.9961 0.4138 0.8571 0.2727
0.0189 13.0 1599 0.0168 0.9956 0.3663 0.6667 0.2525
0.0189 14.0 1722 0.0193 0.9942 0.2253 0.3474 0.1667
0.0189 15.0 1845 0.0164 0.9958 0.3265 0.8511 0.2020
0.0189 16.0 1968 0.0141 0.9961 0.4412 0.8108 0.3030
0.0173 17.0 2091 0.0160 0.9952 0.3188 0.5641 0.2222
0.0173 18.0 2214 0.0151 0.9960 0.4222 0.7917 0.2879
0.0173 19.0 2337 0.0131 0.9964 0.5086 0.7957 0.3737
0.0173 20.0 2460 0.0126 0.9963 0.5118 0.7677 0.3838
0.014 21.0 2583 0.0122 0.9963 0.5068 0.7872 0.3737
0.014 22.0 2706 0.0118 0.9965 0.5018 0.8642 0.3535
0.014 23.0 2829 0.0114 0.9967 0.6066 0.7481 0.5101
0.014 24.0 2952 0.0110 0.9967 0.5868 0.7815 0.4697
0.0128 25.0 3075 0.0111 0.9967 0.5841 0.7863 0.4646
0.0128 26.0 3198 0.0110 0.9966 0.5263 0.8621 0.3788
0.0128 27.0 3321 0.0110 0.9967 0.6084 0.7537 0.5101
0.0128 28.0 3444 0.0102 0.9969 0.6277 0.8031 0.5152
0.0118 29.0 3567 0.0098 0.9969 0.6019 0.8378 0.4697
0.0118 30.0 3690 0.0097 0.9971 0.6349 0.8547 0.5051
0.0118 31.0 3813 0.0098 0.9970 0.6361 0.8062 0.5253
0.0118 32.0 3936 0.0096 0.9969 0.6591 0.7532 0.5859
0.0105 33.0 4059 0.0089 0.9971 0.6415 0.85 0.5152
0.0105 34.0 4182 0.0093 0.9971 0.6686 0.7877 0.5808
0.0105 35.0 4305 0.0089 0.9969 0.6630 0.7317 0.6061
0.0105 36.0 4428 0.0083 0.9972 0.6746 0.8143 0.5758
0.0093 37.0 4551 0.0084 0.9971 0.6566 0.8134 0.5505
0.0093 38.0 4674 0.0081 0.9971 0.6326 0.8609 0.5
0.0093 39.0 4797 0.0087 0.9970 0.6507 0.7956 0.5505
0.0093 40.0 4920 0.0080 0.9972 0.6707 0.8235 0.5657
0.0097 41.0 5043 0.0077 0.9972 0.6744 0.7945 0.5859
0.0097 42.0 5166 0.0074 0.9973 0.6899 0.8095 0.6010
0.0097 43.0 5289 0.0071 0.9974 0.7038 0.8392 0.6061
0.0097 44.0 5412 0.0071 0.9975 0.7110 0.8311 0.6212
0.0082 45.0 5535 0.0070 0.9975 0.7024 0.8551 0.5960
0.0082 46.0 5658 0.0071 0.9975 0.7110 0.8311 0.6212
0.0082 47.0 5781 0.0070 0.9975 0.7293 0.8049 0.6667
0.0082 48.0 5904 0.0068 0.9975 0.7317 0.7895 0.6818
0.0078 49.0 6027 0.0066 0.9976 0.7360 0.8291 0.6616
0.0078 50.0 6150 0.0086 0.9969 0.6192 0.8 0.5051
0.0078 51.0 6273 0.0066 0.9976 0.7466 0.8107 0.6919
0.0078 52.0 6396 0.0066 0.9975 0.7332 0.7861 0.6869
0.0073 53.0 6519 0.0087 0.9971 0.6761 0.7643 0.6061
0.0073 54.0 6642 0.0061 0.9976 0.7339 0.8239 0.6616
0.0073 55.0 6765 0.0063 0.9975 0.7351 0.7907 0.6869
0.0073 56.0 6888 0.0059 0.9977 0.7479 0.8282 0.6818
0.007 57.0 7011 0.0059 0.9977 0.7437 0.8408 0.6667
0.007 58.0 7134 0.0058 0.9977 0.7479 0.8516 0.6667
0.007 59.0 7257 0.0091 0.9973 0.6969 0.7935 0.6212
0.007 60.0 7380 0.0058 0.9979 0.7705 0.8393 0.7121
0.0069 61.0 7503 0.0055 0.9979 0.7742 0.8276 0.7273
0.0069 62.0 7626 0.0054 0.9979 0.7726 0.8443 0.7121
0.0069 63.0 7749 0.0062 0.9976 0.7368 0.8160 0.6717
0.0069 64.0 7872 0.0053 0.9979 0.7828 0.8343 0.7374
0.0069 65.0 7995 0.0051 0.9979 0.7738 0.8402 0.7172
0.0061 66.0 8118 0.0048 0.9981 0.7956 0.8639 0.7374
0.0061 67.0 8241 0.0051 0.9980 0.7892 0.8488 0.7374
0.0061 68.0 8364 0.0052 0.9980 0.8 0.8235 0.7778
0.0061 69.0 8487 0.0048 0.9981 0.8011 0.8698 0.7424
0.0055 70.0 8610 0.0048 0.9982 0.8128 0.8636 0.7677
0.0055 71.0 8733 0.0048 0.9979 0.7874 0.8197 0.7576
0.0055 72.0 8856 0.0044 0.9983 0.8207 0.8882 0.7626
0.0055 73.0 8979 0.0047 0.9981 0.8063 0.8370 0.7778
0.0051 74.0 9102 0.0045 0.9982 0.8196 0.8368 0.8030
0.0051 75.0 9225 0.0046 0.9982 0.8130 0.8772 0.7576
0.0051 76.0 9348 0.0045 0.9982 0.8158 0.8516 0.7828
0.0051 77.0 9471 0.0042 0.9983 0.8213 0.8701 0.7778
0.0051 78.0 9594 0.0049 0.9981 0.7989 0.8389 0.7626
0.0051 79.0 9717 0.0043 0.9983 0.8320 0.8519 0.8131
0.0051 80.0 9840 0.0051 0.9981 0.8032 0.8613 0.7525
0.0051 81.0 9963 0.0040 0.9984 0.8325 0.8641 0.8030
0.0049 82.0 10086 0.0040 0.9986 0.8527 0.8730 0.8333
0.0049 83.0 10209 0.0039 0.9986 0.8571 0.8824 0.8333
0.0049 84.0 10332 0.0037 0.9986 0.8549 0.8777 0.8333
0.0049 85.0 10455 0.0037 0.9986 0.8519 0.8944 0.8131
0.0043 86.0 10578 0.0037 0.9988 0.8794 0.875 0.8838
0.0043 87.0 10701 0.0037 0.9987 0.8722 0.8657 0.8788
0.0043 88.0 10824 0.0036 0.9986 0.8629 0.8673 0.8586
0.0043 89.0 10947 0.0036 0.9987 0.8724 0.8814 0.8636
0.004 90.0 11070 0.0037 0.9985 0.8483 0.8639 0.8333
0.004 91.0 11193 0.0033 0.9988 0.8747 0.8860 0.8636
0.004 92.0 11316 0.0034 0.9986 0.8586 0.8913 0.8283
0.004 93.0 11439 0.0037 0.9987 0.8744 0.87 0.8788
0.0041 94.0 11562 0.0034 0.9987 0.8675 0.8930 0.8434
0.0041 95.0 11685 0.0032 0.9989 0.8928 0.8818 0.9040
0.0041 96.0 11808 0.0032 0.9988 0.8832 0.8878 0.8788
0.0041 97.0 11931 0.0034 0.9987 0.8667 0.8802 0.8535
0.0036 98.0 12054 0.0033 0.9988 0.8804 0.8872 0.8737
0.0036 99.0 12177 0.0031 0.9989 0.8889 0.8889 0.8889
0.0036 100.0 12300 0.0030 0.9989 0.8945 0.89 0.8990
0.0036 101.0 12423 0.0029 0.9989 0.8917 0.8894 0.8939
0.0036 102.0 12546 0.0030 0.9989 0.8878 0.8768 0.8990
0.0036 103.0 12669 0.0029 0.9990 0.8995 0.895 0.9040
0.0036 104.0 12792 0.0029 0.9989 0.8894 0.885 0.8939
0.0036 105.0 12915 0.0029 0.9988 0.8747 0.8860 0.8636
0.0033 106.0 13038 0.0029 0.9990 0.9 0.8911 0.9091
0.0033 107.0 13161 0.0026 0.9990 0.8951 0.9067 0.8838
0.0033 108.0 13284 0.0038 0.9987 0.8651 0.8718 0.8586
0.0033 109.0 13407 0.0032 0.9988 0.8766 0.8744 0.8788
0.0033 110.0 13530 0.0028 0.9990 0.9018 0.8995 0.9040
0.0033 111.0 13653 0.0032 0.9987 0.8744 0.87 0.8788
0.0033 112.0 13776 0.0028 0.9990 0.9003 0.9119 0.8889
0.0033 113.0 13899 0.0029 0.9990 0.8985 0.9031 0.8939
0.0035 114.0 14022 0.0026 0.9991 0.9086 0.9133 0.9040
0.0035 115.0 14145 0.0026 0.9991 0.9132 0.8976 0.9293
0.0035 116.0 14268 0.0024 0.9991 0.9154 0.9020 0.9293
0.0035 117.0 14391 0.0024 0.9991 0.9118 0.9095 0.9141
0.0027 118.0 14514 0.0024 0.9992 0.9177 0.9064 0.9293
0.0027 119.0 14637 0.0023 0.9991 0.9123 0.9055 0.9192
0.0027 120.0 14760 0.0024 0.9990 0.8957 0.9026 0.8889
0.0027 121.0 14883 0.0023 0.9991 0.915 0.9059 0.9242
0.0027 122.0 15006 0.0023 0.9991 0.9132 0.8976 0.9293
0.0027 123.0 15129 0.0023 0.9992 0.9165 0.9188 0.9141
0.0027 124.0 15252 0.0028 0.9989 0.8911 0.8738 0.9091
0.0027 125.0 15375 0.0022 0.9992 0.9177 0.9064 0.9293
0.0027 126.0 15498 0.0023 0.9991 0.9114 0.9137 0.9091
0.0027 127.0 15621 0.0022 0.9992 0.92 0.9109 0.9293
0.0027 128.0 15744 0.0021 0.9992 0.9227 0.9113 0.9343
0.0027 129.0 15867 0.0022 0.9991 0.9068 0.9045 0.9091
0.0027 130.0 15990 0.0021 0.9992 0.9204 0.9069 0.9343
0.0025 131.0 16113 0.0021 0.9992 0.9181 0.9024 0.9343
0.0025 132.0 16236 0.0023 0.9991 0.9086 0.8889 0.9293
0.0025 133.0 16359 0.0020 0.9992 0.9208 0.9029 0.9394
0.0025 134.0 16482 0.0019 0.9992 0.9246 0.92 0.9293
0.0025 135.0 16605 0.0019 0.9993 0.9270 0.9246 0.9293
0.0025 136.0 16728 0.0019 0.9992 0.9231 0.9073 0.9394
0.0025 137.0 16851 0.0018 0.9993 0.9277 0.9163 0.9394
0.0025 138.0 16974 0.0019 0.9992 0.9177 0.9064 0.9293
0.0022 139.0 17097 0.0018 0.9992 0.9211 0.9282 0.9141
0.0022 140.0 17220 0.0018 0.9993 0.9323 0.9254 0.9394
0.0022 141.0 17343 0.0018 0.9993 0.935 0.9257 0.9444
0.0022 142.0 17466 0.0019 0.9992 0.9203 0.9372 0.9040
0.0021 143.0 17589 0.0017 0.9993 0.9289 0.9337 0.9242
0.0021 144.0 17712 0.0021 0.9992 0.9223 0.9154 0.9293
0.0021 145.0 17835 0.0021 0.9992 0.9165 0.9188 0.9141
0.0021 146.0 17958 0.0019 0.9993 0.9277 0.9163 0.9394
0.0023 147.0 18081 0.0019 0.9993 0.93 0.9208 0.9394
0.0023 148.0 18204 0.0017 0.9994 0.9373 0.9303 0.9444
0.0023 149.0 18327 0.0017 0.9993 0.9273 0.9204 0.9343
0.0023 150.0 18450 0.0018 0.9993 0.9306 0.9476 0.9141
0.002 151.0 18573 0.0032 0.9993 0.9286 0.9381 0.9192
0.002 152.0 18696 0.0016 0.9993 0.9340 0.9388 0.9293
0.002 153.0 18819 0.0017 0.9993 0.9323 0.9254 0.9394
0.002 154.0 18942 0.0017 0.9994 0.9364 0.9436 0.9293
0.0021 155.0 19065 0.0019 0.9993 0.9273 0.9204 0.9343
0.0021 156.0 19188 0.0016 0.9995 0.9460 0.9634 0.9293
0.0021 157.0 19311 0.0015 0.9995 0.9490 0.9588 0.9394
0.0021 158.0 19434 0.0015 0.9994 0.9442 0.9490 0.9394
0.0017 159.0 19557 0.0019 0.9993 0.9267 0.9620 0.8939
0.0017 160.0 19680 0.0014 0.9994 0.9418 0.9442 0.9394
0.0017 161.0 19803 0.0016 0.9994 0.9388 0.9485 0.9293
0.0017 162.0 19926 0.0014 0.9995 0.9460 0.9634 0.9293
0.0017 163.0 20049 0.0014 0.9995 0.9490 0.9588 0.9394
0.0017 164.0 20172 0.0013 0.9996 0.9567 0.9641 0.9495
0.0017 165.0 20295 0.0013 0.9994 0.9444 0.9444 0.9444
0.0017 166.0 20418 0.0014 0.9995 0.9519 0.9543 0.9495
0.0016 167.0 20541 0.0014 0.9994 0.9444 0.9444 0.9444
0.0016 168.0 20664 0.0013 0.9995 0.9460 0.9634 0.9293
0.0016 169.0 20787 0.0012 0.9996 0.9618 0.9692 0.9545
0.0016 170.0 20910 0.0013 0.9995 0.9487 0.9635 0.9343
0.0015 171.0 21033 0.0012 0.9996 0.9592 0.9691 0.9495
0.0015 172.0 21156 0.0013 0.9995 0.9455 0.9733 0.9192
0.0015 173.0 21279 0.0015 0.9995 0.9538 0.9688 0.9394
0.0015 174.0 21402 0.0012 0.9996 0.9620 0.9645 0.9596
0.0015 175.0 21525 0.0011 0.9996 0.9567 0.9641 0.9495
0.0015 176.0 21648 0.0011 0.9996 0.9648 0.96 0.9697
0.0015 177.0 21771 0.0011 0.9996 0.9622 0.9598 0.9646
0.0015 178.0 21894 0.0011 0.9996 0.9648 0.96 0.9697
0.0014 179.0 22017 0.0011 0.9995 0.9541 0.9639 0.9444
0.0014 180.0 22140 0.0011 0.9997 0.9671 0.9695 0.9646
0.0014 181.0 22263 0.0010 0.9996 0.9643 0.9742 0.9545
0.0014 182.0 22386 0.0010 0.9997 0.9694 0.9794 0.9596
0.0013 183.0 22509 0.0010 0.9997 0.9747 0.9747 0.9747
0.0013 184.0 22632 0.0011 0.9996 0.9622 0.9598 0.9646
0.0013 185.0 22755 0.0010 0.9997 0.9747 0.9747 0.9747
0.0013 186.0 22878 0.0010 0.9998 0.9772 0.9797 0.9747
0.0012 187.0 23001 0.0010 0.9998 0.9772 0.9797 0.9747
0.0012 188.0 23124 0.0009 0.9997 0.9746 0.9796 0.9697
0.0012 189.0 23247 0.0009 0.9997 0.9746 0.9796 0.9697
0.0012 190.0 23370 0.0009 0.9997 0.9746 0.9796 0.9697
0.0012 191.0 23493 0.0009 0.9997 0.9746 0.9796 0.9697
0.0012 192.0 23616 0.0009 0.9997 0.9695 0.9745 0.9646
0.0012 193.0 23739 0.0009 0.9997 0.9747 0.9747 0.9747
0.0012 194.0 23862 0.0009 0.9997 0.9722 0.9746 0.9697
0.0012 195.0 23985 0.0009 0.9998 0.9772 0.9797 0.9747
0.0012 196.0 24108 0.0009 0.9997 0.9747 0.9747 0.9747
0.0012 197.0 24231 0.0009 0.9997 0.9747 0.9747 0.9747
0.0012 198.0 24354 0.0009 0.9997 0.9747 0.9747 0.9747
0.0012 199.0 24477 0.0009 0.9997 0.9747 0.9747 0.9747
0.0011 200.0 24600 0.0009 0.9997 0.9747 0.9747 0.9747
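
The accuracy column sits near 0.995 from the first epoch while F1/precision/recall start at 0.0, which is consistent with a rare positive class: predicting all-negative already scores high token accuracy. The metric code is not included in the card; below is a sketch of a compute_metrics function that would produce these four columns, assuming binary token labels with -100 marking ignored positions.

```python
# Sketch of a compute_metrics function yielding the four metric columns above.
# Binary token labels and the -100 ignore index are assumptions.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    mask = labels != -100  # drop padding / special-token positions
    y_true, y_pred = labels[mask], preds[mask]
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="binary", zero_division=0
    )
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```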

Framework versions

  • Transformers 4.39.2
  • PyTorch 2.2.2+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
Model size: 185M params (F32 tensors, Safetensors format)
