
dit-base_tobacco-tiny_tobacco3482_og_simkd

This model is a fine-tuned version of WinKawaks/vit-tiny-patch16-224 on an unspecified dataset (the model name suggests Tobacco3482). It achieves the following results on the evaluation set (a hedged sketch of how the calibration metrics can be computed follows the list):

  • Loss: 318.4368
  • Accuracy: 0.805
  • Brier Loss: 0.3825
  • NLL (negative log-likelihood): 1.1523
  • F1 Micro: 0.805
  • F1 Macro: 0.7673
  • ECE (expected calibration error): 0.2987
  • AURC (area under the risk-coverage curve): 0.0702
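
The Brier loss, NLL, and ECE values above are calibration-oriented metrics. Below is a minimal sketch of how they can be computed from softmax probabilities; the multi-class Brier definition and the 15-bin equal-width ECE are assumptions, since the card does not state which variants were used.

```python
import numpy as np

def brier_loss(probs, labels):
    # Mean squared error between the predicted distribution and the one-hot target.
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def nll(probs, labels, eps=1e-12):
    # Negative log-likelihood of the true class.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def ece(probs, labels, n_bins=15):
    # Expected calibration error: per-bin |accuracy - confidence| gap,
    # weighted by the fraction of samples falling in each confidence bin.
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    err = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            err += mask.mean() * abs((pred[mask] == labels[mask]).mean() - conf[mask].mean())
    return err
```

AURC would additionally require sorting predictions by confidence and integrating the risk-coverage curve; it is omitted here.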

Model description

More information needed

Intended uses & limitations

More information needed
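
The card leaves intended uses undocumented. As a hedged sketch (not officially documented usage), the checkpoint can presumably be loaded as a standard Transformers image-classification model for document images such as Tobacco3482 scans; the file name below is a placeholder.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint as an image-classification pipeline.
# "document_scan.png" is a placeholder for a Tobacco3482-style document image.
classifier = pipeline(
    "image-classification",
    model="jordyvl/dit-base_tobacco-tiny_tobacco3482_og_simkd",
)
print(classifier("document_scan.png", top_k=3))
```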

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged mapping to transformers.TrainingArguments is sketched after the list):

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
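
A minimal sketch of how the hyperparameters above might map onto transformers.TrainingArguments. The output_dir is a placeholder, and the distillation-specific parts of the og_simkd training objective are not documented in the card, so they are omitted.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; Adam betas=(0.9, 0.999)
# and epsilon=1e-08 are the Trainer defaults, so they are not set explicitly.
training_args = TrainingArguments(
    output_dir="dit-base_tobacco-tiny_tobacco3482_og_simkd",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```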

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC
No log 1.0 7 328.9614 0.155 0.8984 7.4608 0.155 0.0353 0.2035 0.8760
No log 2.0 14 328.8199 0.235 0.8940 6.4907 0.235 0.1148 0.2643 0.7444
No log 3.0 21 328.4224 0.38 0.8711 2.8184 0.38 0.3279 0.3440 0.4817
No log 4.0 28 327.5357 0.51 0.8072 2.0744 0.51 0.4221 0.4111 0.3319
No log 5.0 35 326.2037 0.53 0.6860 2.0669 0.53 0.4313 0.3619 0.2744
No log 6.0 42 324.8763 0.565 0.6008 1.9437 0.565 0.4477 0.3009 0.2469
No log 7.0 49 323.9205 0.6 0.5390 1.7694 0.6 0.4647 0.2365 0.1978
No log 8.0 56 323.2227 0.65 0.4632 1.7803 0.65 0.5195 0.2313 0.1422
No log 9.0 63 322.5265 0.74 0.4177 1.7538 0.74 0.6302 0.2442 0.1113
No log 10.0 70 322.1928 0.705 0.4013 1.5880 0.705 0.5864 0.2147 0.1118
No log 11.0 77 322.2687 0.795 0.4006 1.2854 0.795 0.7476 0.2719 0.0942
No log 12.0 84 321.6652 0.725 0.3754 1.3462 0.7250 0.6521 0.2238 0.0920
No log 13.0 91 322.3688 0.785 0.3951 1.3209 0.785 0.7260 0.2712 0.0805
No log 14.0 98 321.7083 0.72 0.3915 1.4854 0.72 0.6220 0.1963 0.0986
No log 15.0 105 321.6171 0.8 0.3614 1.3397 0.8000 0.7427 0.2531 0.0741
No log 16.0 112 321.0427 0.77 0.3502 1.1461 0.7700 0.7082 0.1976 0.0769
No log 17.0 119 321.1529 0.735 0.3827 1.5751 0.735 0.6769 0.1926 0.0973
No log 18.0 126 321.0808 0.78 0.3611 1.2529 0.78 0.7199 0.2242 0.0762
No log 19.0 133 321.6684 0.795 0.3835 1.1789 0.795 0.7506 0.2823 0.0712
No log 20.0 140 321.2322 0.78 0.3682 1.1715 0.78 0.7356 0.2532 0.0752
No log 21.0 147 320.4927 0.795 0.3458 1.3764 0.795 0.7504 0.2178 0.0710
No log 22.0 154 320.8896 0.8 0.3568 1.0908 0.8000 0.7536 0.2709 0.0677
No log 23.0 161 320.9060 0.785 0.3774 1.1571 0.785 0.7414 0.2712 0.0719
No log 24.0 168 320.9026 0.795 0.3718 1.0871 0.795 0.7465 0.2718 0.0690
No log 25.0 175 320.7932 0.805 0.3601 1.0998 0.805 0.7699 0.2620 0.0614
No log 26.0 182 321.2285 0.735 0.4164 1.8530 0.735 0.7051 0.2814 0.0889
No log 27.0 189 320.8364 0.775 0.4028 1.4063 0.775 0.7412 0.2687 0.0836
No log 28.0 196 320.0800 0.785 0.3548 1.2123 0.785 0.7394 0.2055 0.0740
No log 29.0 203 319.9995 0.79 0.3526 1.2296 0.79 0.7381 0.2363 0.0691
No log 30.0 210 320.0685 0.795 0.3588 1.2765 0.795 0.7447 0.2310 0.0725
No log 31.0 217 320.0981 0.805 0.3699 1.0128 0.805 0.7690 0.2868 0.0701
No log 32.0 224 320.5063 0.8 0.3900 1.1437 0.8000 0.7650 0.3141 0.0679
No log 33.0 231 319.8609 0.795 0.3549 1.2051 0.795 0.7526 0.2485 0.0697
No log 34.0 238 319.6974 0.81 0.3600 1.0124 0.81 0.7724 0.2671 0.0672
No log 35.0 245 319.5988 0.795 0.3513 1.1480 0.795 0.7540 0.2425 0.0679
No log 36.0 252 319.6317 0.8 0.3544 1.2190 0.8000 0.7607 0.2449 0.0674
No log 37.0 259 319.6821 0.81 0.3531 1.0714 0.81 0.7672 0.2590 0.0662
No log 38.0 266 319.7618 0.805 0.3754 1.0421 0.805 0.7625 0.2973 0.0701
No log 39.0 273 319.9920 0.775 0.3843 1.0821 0.775 0.7374 0.2801 0.0723
No log 40.0 280 319.3407 0.765 0.3633 1.2213 0.765 0.7041 0.2274 0.0767
No log 41.0 287 319.2732 0.765 0.3696 1.2638 0.765 0.7184 0.2315 0.0835
No log 42.0 294 319.5948 0.805 0.3685 1.0782 0.805 0.7625 0.2678 0.0661
No log 43.0 301 319.7181 0.8 0.3776 1.0004 0.8000 0.7507 0.2598 0.0672
No log 44.0 308 319.1170 0.77 0.3619 1.2129 0.7700 0.7159 0.2557 0.0787
No log 45.0 315 319.5949 0.8 0.3809 1.1448 0.8000 0.7670 0.2868 0.0688
No log 46.0 322 319.0327 0.79 0.3675 1.2386 0.79 0.7315 0.2546 0.0790
No log 47.0 329 319.3806 0.805 0.3665 1.1368 0.805 0.7620 0.2737 0.0700
No log 48.0 336 319.4999 0.795 0.3836 1.0256 0.795 0.7550 0.2800 0.0748
No log 49.0 343 319.2553 0.8 0.3660 1.2011 0.8000 0.7573 0.2698 0.0679
No log 50.0 350 319.3495 0.805 0.3836 1.1055 0.805 0.7634 0.3004 0.0671
No log 51.0 357 319.1643 0.8 0.3660 1.1980 0.8000 0.7497 0.2641 0.0709
No log 52.0 364 319.1483 0.795 0.3651 1.0776 0.795 0.7561 0.2856 0.0683
No log 53.0 371 319.0104 0.79 0.3724 1.1653 0.79 0.7422 0.2512 0.0724
No log 54.0 378 319.1622 0.795 0.3814 1.2807 0.795 0.7456 0.2644 0.0759
No log 55.0 385 319.1554 0.8 0.3694 1.2710 0.8000 0.7570 0.2877 0.0667
No log 56.0 392 319.2158 0.79 0.3795 1.1678 0.79 0.7509 0.2942 0.0692
No log 57.0 399 319.1813 0.795 0.3839 1.1243 0.795 0.7529 0.2835 0.0733
No log 58.0 406 318.7599 0.81 0.3632 1.1484 0.81 0.7738 0.3030 0.0691
No log 59.0 413 319.0827 0.805 0.3792 1.2070 0.805 0.7685 0.2901 0.0674
No log 60.0 420 318.6928 0.805 0.3661 1.1517 0.805 0.7534 0.2492 0.0719
No log 61.0 427 318.8309 0.805 0.3714 1.2785 0.805 0.7517 0.2674 0.0699
No log 62.0 434 318.9468 0.8 0.3794 1.1549 0.8000 0.7566 0.2862 0.0707
No log 63.0 441 318.8059 0.785 0.3774 1.2460 0.785 0.7487 0.2721 0.0752
No log 64.0 448 318.7155 0.81 0.3659 1.1963 0.81 0.7660 0.2676 0.0680
No log 65.0 455 318.8439 0.795 0.3799 1.0230 0.795 0.7464 0.2797 0.0700
No log 66.0 462 318.7784 0.79 0.3783 1.3168 0.79 0.7503 0.2618 0.0804
No log 67.0 469 318.9019 0.795 0.3802 1.2003 0.795 0.7503 0.2934 0.0702
No log 68.0 476 318.6647 0.8 0.3728 1.1395 0.8000 0.7590 0.2718 0.0699
No log 69.0 483 318.3780 0.8 0.3688 1.2812 0.8000 0.7602 0.2690 0.0728
No log 70.0 490 318.8004 0.8 0.3779 1.0682 0.8000 0.7607 0.2887 0.0682
No log 71.0 497 318.7021 0.8 0.3748 1.1101 0.8000 0.7545 0.2977 0.0691
322.4844 72.0 504 318.3595 0.79 0.3779 1.2333 0.79 0.7386 0.2617 0.0843
322.4844 73.0 511 318.5725 0.805 0.3740 1.2108 0.805 0.7674 0.2762 0.0677
322.4844 74.0 518 318.7131 0.81 0.3822 1.2048 0.81 0.7660 0.2971 0.0696
322.4844 75.0 525 318.6258 0.775 0.3806 1.1511 0.775 0.7228 0.2824 0.0743
322.4844 76.0 532 318.5414 0.8 0.3746 1.2136 0.8000 0.7563 0.2872 0.0708
322.4844 77.0 539 318.5404 0.795 0.3765 1.1414 0.795 0.7551 0.2905 0.0707
322.4844 78.0 546 318.5820 0.8 0.3806 1.1653 0.8000 0.7573 0.2888 0.0707
322.4844 79.0 553 318.5909 0.8 0.3838 1.2343 0.8000 0.7563 0.2778 0.0754
322.4844 80.0 560 318.6398 0.795 0.3874 1.1097 0.795 0.7520 0.3045 0.0727
322.4844 81.0 567 318.6250 0.795 0.3860 1.1612 0.795 0.7542 0.3079 0.0727
322.4844 82.0 574 318.5269 0.795 0.3825 1.2812 0.795 0.7451 0.2723 0.0737
322.4844 83.0 581 318.5790 0.795 0.3846 1.1575 0.795 0.7455 0.2984 0.0723
322.4844 84.0 588 318.4343 0.795 0.3826 1.2088 0.795 0.7532 0.2852 0.0746
322.4844 85.0 595 318.3853 0.795 0.3792 1.2784 0.795 0.7456 0.3003 0.0729
322.4844 86.0 602 318.5143 0.805 0.3854 1.1745 0.805 0.7636 0.3071 0.0705
322.4844 87.0 609 318.3533 0.805 0.3763 1.1579 0.805 0.7679 0.2805 0.0694
322.4844 88.0 616 318.4745 0.795 0.3860 1.0964 0.795 0.7539 0.2952 0.0712
322.4844 89.0 623 318.4909 0.805 0.3829 1.1544 0.805 0.7673 0.3035 0.0700
322.4844 90.0 630 318.4910 0.8 0.3828 1.1537 0.8000 0.7497 0.2730 0.0717
322.4844 91.0 637 318.5176 0.8 0.3855 1.1613 0.8000 0.7552 0.2815 0.0718
322.4844 92.0 644 318.4100 0.795 0.3810 1.2215 0.795 0.7532 0.2696 0.0731
322.4844 93.0 651 318.3500 0.805 0.3765 1.2181 0.805 0.7702 0.2790 0.0705
322.4844 94.0 658 318.3257 0.805 0.3785 1.2218 0.805 0.7678 0.3114 0.0704
322.4844 95.0 665 318.3990 0.8 0.3823 1.1485 0.8000 0.7585 0.2901 0.0710
322.4844 96.0 672 318.5006 0.81 0.3862 1.1518 0.81 0.7724 0.2925 0.0698
322.4844 97.0 679 318.3142 0.8 0.3780 1.1608 0.8000 0.7557 0.2916 0.0716
322.4844 98.0 686 318.3767 0.795 0.3819 1.2208 0.795 0.7526 0.2764 0.0731
322.4844 99.0 693 318.4233 0.8 0.3810 1.1532 0.8000 0.7557 0.2786 0.0706
322.4844 100.0 700 318.4368 0.805 0.3825 1.1523 0.805 0.7673 0.2987 0.0702

Framework versions

  • Transformers 4.36.0.dev0
  • Pytorch 2.2.0.dev20231112+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1
Model size: 7.23M params (Safetensors, F32)

Model tree for jordyvl/dit-base_tobacco-tiny_tobacco3482_og_simkd

This model is fine-tuned from WinKawaks/vit-tiny-patch16-224.