
# dit-base_tobacco-tiny_tobacco3482_kd

This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset. It achieves the following results on the evaluation set (a sketch of how the calibration metrics are typically computed follows the list):

- Loss: 0.4787
- Accuracy: 0.815
- Brier Loss: 0.2625
- NLL: 1.3204
- F1 Micro: 0.815
- F1 Macro: 0.8058
- ECE: 0.1408
- AURC: 0.0457
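
The card does not define the calibration metrics, so below is a minimal sketch of their usual definitions, assuming class probabilities `probs` of shape `(n_samples, n_classes)` and integer `labels`; the actual evaluation script may bin or average differently.

```python
import numpy as np

def brier_loss(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def nll(probs: np.ndarray, labels: np.ndarray, eps: float = 1e-12) -> float:
    """Average negative log-likelihood of the true class."""
    return float(-np.mean(np.log(probs[np.arange(len(labels)), labels] + eps)))

def ece(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """Expected calibration error with equal-width confidence bins."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            # Weight each bin's |accuracy - confidence| gap by its sample share.
            total += mask.mean() * abs(acc - conf[mask].mean())
    return float(total)
```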

## Model description

More information needed
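
The `_kd` suffix and `dit-base_tobacco` prefix in the model name suggest this tiny ViT was trained via knowledge distillation from a DiT-base teacher, though the card does not confirm this. Purely as an illustration, a standard soft-target distillation loss (Hinton-style) looks like the sketch below; the `temperature` and `alpha` values are hypothetical.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 2.0, alpha: float = 0.5):
    """Blend cross-entropy on hard labels with KL divergence to the
    temperature-softened teacher distribution (values are hypothetical)."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # rescale gradients to match the hard-label term
    return alpha * hard + (1 - alpha) * soft
```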

## Intended uses & limitations

More information needed
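
Pending more detail, the checkpoint can be loaded like any Hub image-classification model. A minimal inference sketch (`document.png` is a placeholder input):

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image

model_id = "jordyvl/dit-base_tobacco-tiny_tobacco3482_kd"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("document.png").convert("RGB")  # placeholder document image
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```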

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
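
A sketch of how these settings map onto `transformers.TrainingArguments`; the original training script is not published and likely wrapped this in a custom distillation `Trainer`.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-base_tobacco-tiny_tobacco3482_kd",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```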

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 1.0   | 7    | 2.2200          | 0.225    | 0.9173     | 7.9176 | 0.225    | 0.1043   | 0.3247 | 0.7528 |
| No log        | 2.0   | 14   | 1.7791          | 0.37     | 0.7982     | 4.5330 | 0.37     | 0.2759   | 0.2753 | 0.6029 |
| No log        | 3.0   | 21   | 1.3855          | 0.485    | 0.6628     | 2.8048 | 0.485    | 0.4006   | 0.2716 | 0.3216 |
| No log        | 4.0   | 28   | 1.0594          | 0.575    | 0.5416     | 1.7471 | 0.575    | 0.5296   | 0.2314 | 0.2052 |
| No log        | 5.0   | 35   | 0.8946          | 0.645    | 0.4739     | 1.6979 | 0.645    | 0.6319   | 0.2168 | 0.1476 |
| No log        | 6.0   | 42   | 0.8969          | 0.66     | 0.4688     | 1.7181 | 0.66     | 0.6621   | 0.1985 | 0.1439 |
| No log        | 7.0   | 49   | 0.7466          | 0.72     | 0.3937     | 1.5484 | 0.72     | 0.7222   | 0.2135 | 0.1079 |
| No log        | 8.0   | 56   | 0.7075          | 0.725    | 0.3753     | 1.4585 | 0.7250   | 0.7100   | 0.2029 | 0.0894 |
| No log        | 9.0   | 63   | 0.9838          | 0.695    | 0.4504     | 2.0243 | 0.695    | 0.6681   | 0.2306 | 0.1204 |
| No log        | 10.0  | 70   | 0.7683          | 0.73     | 0.3867     | 1.5378 | 0.7300   | 0.7388   | 0.1699 | 0.0902 |
| No log        | 11.0  | 77   | 0.7114          | 0.755    | 0.3524     | 1.5684 | 0.755    | 0.7296   | 0.1687 | 0.0742 |
| No log        | 12.0  | 84   | 0.8151          | 0.76     | 0.3728     | 1.4721 | 0.76     | 0.7336   | 0.1863 | 0.0936 |
| No log        | 13.0  | 91   | 0.8346          | 0.77     | 0.3524     | 1.9540 | 0.7700   | 0.7528   | 0.1691 | 0.0790 |
| No log        | 14.0  | 98   | 0.7822          | 0.735    | 0.3898     | 1.6873 | 0.735    | 0.7215   | 0.2025 | 0.0860 |
| No log        | 15.0  | 105  | 0.7400          | 0.765    | 0.3580     | 1.5692 | 0.765    | 0.7167   | 0.1765 | 0.0809 |
| No log        | 16.0  | 112  | 0.8296          | 0.71     | 0.4027     | 1.5508 | 0.7100   | 0.6837   | 0.2185 | 0.0963 |
| No log        | 17.0  | 119  | 0.6868          | 0.79     | 0.3443     | 1.4135 | 0.79     | 0.7528   | 0.1971 | 0.0748 |
| No log        | 18.0  | 126  | 0.6290          | 0.795    | 0.3142     | 1.8030 | 0.795    | 0.7657   | 0.1476 | 0.0621 |
| No log        | 19.0  | 133  | 0.6454          | 0.79     | 0.3213     | 1.6664 | 0.79     | 0.7785   | 0.1649 | 0.0645 |
| No log        | 20.0  | 140  | 0.6089          | 0.785    | 0.3228     | 1.5436 | 0.785    | 0.7729   | 0.1708 | 0.0639 |
| No log        | 21.0  | 147  | 0.6715          | 0.785    | 0.3289     | 1.3422 | 0.785    | 0.7598   | 0.1787 | 0.0768 |
| No log        | 22.0  | 154  | 0.7075          | 0.79     | 0.3342     | 1.6069 | 0.79     | 0.7684   | 0.1587 | 0.0656 |
| No log        | 23.0  | 161  | 0.6226          | 0.805    | 0.3067     | 1.2400 | 0.805    | 0.7881   | 0.1611 | 0.0716 |
| No log        | 24.0  | 168  | 0.7501          | 0.77     | 0.3506     | 1.8952 | 0.7700   | 0.7530   | 0.1637 | 0.0746 |
| No log        | 25.0  | 175  | 0.6039          | 0.775    | 0.3168     | 1.4196 | 0.775    | 0.7647   | 0.1701 | 0.0664 |
| No log        | 26.0  | 182  | 0.6252          | 0.775    | 0.3260     | 1.4914 | 0.775    | 0.7507   | 0.1733 | 0.0657 |
| No log        | 27.0  | 189  | 0.6590          | 0.79     | 0.3303     | 1.5970 | 0.79     | 0.7773   | 0.1695 | 0.0747 |
| No log        | 28.0  | 196  | 0.5920          | 0.815    | 0.2988     | 1.6841 | 0.815    | 0.8127   | 0.1711 | 0.0635 |
| No log        | 29.0  | 203  | 0.5982          | 0.785    | 0.3163     | 1.6290 | 0.785    | 0.7678   | 0.1641 | 0.0597 |
| No log        | 30.0  | 210  | 0.5693          | 0.805    | 0.3028     | 1.4954 | 0.805    | 0.7917   | 0.1566 | 0.0578 |
| No log        | 31.0  | 217  | 0.5860          | 0.805    | 0.2964     | 1.3856 | 0.805    | 0.7966   | 0.1413 | 0.0599 |
| No log        | 32.0  | 224  | 0.5380          | 0.805    | 0.2775     | 1.6946 | 0.805    | 0.7981   | 0.1526 | 0.0494 |
| No log        | 33.0  | 231  | 0.5041          | 0.8      | 0.2745     | 1.6025 | 0.8000   | 0.7887   | 0.1639 | 0.0498 |
| No log        | 34.0  | 238  | 0.5134          | 0.83     | 0.2700     | 1.3768 | 0.83     | 0.8161   | 0.1464 | 0.0526 |
| No log        | 35.0  | 245  | 0.5371          | 0.81     | 0.2820     | 1.3584 | 0.81     | 0.7982   | 0.1466 | 0.0552 |
| No log        | 36.0  | 252  | 0.4987          | 0.815    | 0.2711     | 1.3735 | 0.815    | 0.8056   | 0.1540 | 0.0490 |
| No log        | 37.0  | 259  | 0.5145          | 0.81     | 0.2814     | 1.3537 | 0.81     | 0.8000   | 0.1415 | 0.0521 |
| No log        | 38.0  | 266  | 0.4992          | 0.815    | 0.2721     | 1.3420 | 0.815    | 0.7974   | 0.1453 | 0.0497 |
| No log        | 39.0  | 273  | 0.4992          | 0.795    | 0.2748     | 1.3579 | 0.795    | 0.7757   | 0.1485 | 0.0502 |
| No log        | 40.0  | 280  | 0.4881          | 0.82     | 0.2634     | 1.3745 | 0.82     | 0.8058   | 0.1504 | 0.0475 |
| No log        | 41.0  | 287  | 0.4977          | 0.81     | 0.2750     | 1.3208 | 0.81     | 0.7965   | 0.1520 | 0.0504 |
| No log        | 42.0  | 294  | 0.4865          | 0.815    | 0.2644     | 1.3840 | 0.815    | 0.8056   | 0.1517 | 0.0452 |
| No log        | 43.0  | 301  | 0.5034          | 0.81     | 0.2722     | 1.3683 | 0.81     | 0.7967   | 0.1404 | 0.0514 |
| No log        | 44.0  | 308  | 0.4925          | 0.815    | 0.2692     | 1.3979 | 0.815    | 0.8056   | 0.1386 | 0.0462 |
| No log        | 45.0  | 315  | 0.4643          | 0.825    | 0.2608     | 1.3015 | 0.825    | 0.8148   | 0.1516 | 0.0442 |
| No log        | 46.0  | 322  | 0.4851          | 0.82     | 0.2666     | 1.2561 | 0.82     | 0.8018   | 0.1494 | 0.0461 |
| No log        | 47.0  | 329  | 0.4751          | 0.82     | 0.2615     | 1.3167 | 0.82     | 0.8120   | 0.1544 | 0.0457 |
| No log        | 48.0  | 336  | 0.4666          | 0.82     | 0.2596     | 1.2470 | 0.82     | 0.8120   | 0.1326 | 0.0443 |
| No log        | 49.0  | 343  | 0.4856          | 0.815    | 0.2659     | 1.3283 | 0.815    | 0.8081   | 0.1501 | 0.0474 |
| No log        | 50.0  | 350  | 0.4690          | 0.83     | 0.2618     | 1.3227 | 0.83     | 0.8208   | 0.1448 | 0.0435 |
| No log        | 51.0  | 357  | 0.4835          | 0.81     | 0.2670     | 1.2956 | 0.81     | 0.7961   | 0.1425 | 0.0471 |
| No log        | 52.0  | 364  | 0.4857          | 0.82     | 0.2598     | 1.3115 | 0.82     | 0.8134   | 0.1547 | 0.0478 |
| No log        | 53.0  | 371  | 0.4747          | 0.82     | 0.2654     | 1.3717 | 0.82     | 0.8026   | 0.1596 | 0.0453 |
| No log        | 54.0  | 378  | 0.4925          | 0.815    | 0.2649     | 1.2289 | 0.815    | 0.8058   | 0.1452 | 0.0486 |
| No log        | 55.0  | 385  | 0.4670          | 0.825    | 0.2611     | 1.3080 | 0.825    | 0.8102   | 0.1307 | 0.0434 |
| No log        | 56.0  | 392  | 0.4878          | 0.81     | 0.2636     | 1.3040 | 0.81     | 0.7961   | 0.1546 | 0.0478 |
| No log        | 57.0  | 399  | 0.4679          | 0.82     | 0.2600     | 1.2618 | 0.82     | 0.8038   | 0.1516 | 0.0430 |
| No log        | 58.0  | 406  | 0.4802          | 0.815    | 0.2629     | 1.3054 | 0.815    | 0.8079   | 0.1448 | 0.0476 |
| No log        | 59.0  | 413  | 0.4746          | 0.82     | 0.2615     | 1.3177 | 0.82     | 0.8064   | 0.1308 | 0.0457 |
| No log        | 60.0  | 420  | 0.4784          | 0.82     | 0.2608     | 1.2495 | 0.82     | 0.8134   | 0.1336 | 0.0463 |
| No log        | 61.0  | 427  | 0.4751          | 0.82     | 0.2630     | 1.2886 | 0.82     | 0.8086   | 0.1416 | 0.0459 |
| No log        | 62.0  | 434  | 0.4751          | 0.815    | 0.2606     | 1.2453 | 0.815    | 0.8058   | 0.1529 | 0.0455 |
| No log        | 63.0  | 441  | 0.4737          | 0.825    | 0.2629     | 1.2975 | 0.825    | 0.8113   | 0.1286 | 0.0451 |
| No log        | 64.0  | 448  | 0.4840          | 0.815    | 0.2631     | 1.3210 | 0.815    | 0.8036   | 0.1392 | 0.0472 |
| No log        | 65.0  | 455  | 0.4747          | 0.82     | 0.2615     | 1.3054 | 0.82     | 0.8086   | 0.1491 | 0.0456 |
| No log        | 66.0  | 462  | 0.4767          | 0.815    | 0.2618     | 1.3056 | 0.815    | 0.8058   | 0.1517 | 0.0459 |
| No log        | 67.0  | 469  | 0.4748          | 0.82     | 0.2615     | 1.3046 | 0.82     | 0.8086   | 0.1525 | 0.0453 |
| No log        | 68.0  | 476  | 0.4782          | 0.815    | 0.2626     | 1.3088 | 0.815    | 0.8058   | 0.1519 | 0.0461 |
| No log        | 69.0  | 483  | 0.4769          | 0.815    | 0.2616     | 1.3133 | 0.815    | 0.8058   | 0.1555 | 0.0456 |
| No log        | 70.0  | 490  | 0.4767          | 0.815    | 0.2622     | 1.3067 | 0.815    | 0.8058   | 0.1435 | 0.0457 |
| No log        | 71.0  | 497  | 0.4776          | 0.815    | 0.2623     | 1.3111 | 0.815    | 0.8058   | 0.1533 | 0.0458 |
| 0.1688        | 72.0  | 504  | 0.4770          | 0.815    | 0.2621     | 1.3078 | 0.815    | 0.8058   | 0.1605 | 0.0457 |
| 0.1688        | 73.0  | 511  | 0.4783          | 0.815    | 0.2625     | 1.3109 | 0.815    | 0.8058   | 0.1503 | 0.0458 |
| 0.1688        | 74.0  | 518  | 0.4776          | 0.815    | 0.2621     | 1.3117 | 0.815    | 0.8058   | 0.1648 | 0.0458 |
| 0.1688        | 75.0  | 525  | 0.4784          | 0.815    | 0.2627     | 1.3110 | 0.815    | 0.8058   | 0.1463 | 0.0458 |
| 0.1688        | 76.0  | 532  | 0.4779          | 0.815    | 0.2625     | 1.3125 | 0.815    | 0.8058   | 0.1577 | 0.0457 |
| 0.1688        | 77.0  | 539  | 0.4794          | 0.815    | 0.2628     | 1.3110 | 0.815    | 0.8058   | 0.1420 | 0.0459 |
| 0.1688        | 78.0  | 546  | 0.4776          | 0.815    | 0.2623     | 1.3120 | 0.815    | 0.8058   | 0.1517 | 0.0455 |
| 0.1688        | 79.0  | 553  | 0.4789          | 0.815    | 0.2627     | 1.3101 | 0.815    | 0.8058   | 0.1460 | 0.0459 |
| 0.1688        | 80.0  | 560  | 0.4784          | 0.815    | 0.2626     | 1.3127 | 0.815    | 0.8058   | 0.1518 | 0.0457 |
| 0.1688        | 81.0  | 567  | 0.4782          | 0.815    | 0.2625     | 1.3103 | 0.815    | 0.8058   | 0.1408 | 0.0457 |
| 0.1688        | 82.0  | 574  | 0.4791          | 0.815    | 0.2627     | 1.3166 | 0.815    | 0.8058   | 0.1586 | 0.0458 |
| 0.1688        | 83.0  | 581  | 0.4785          | 0.815    | 0.2625     | 1.3116 | 0.815    | 0.8058   | 0.1436 | 0.0459 |
| 0.1688        | 84.0  | 588  | 0.4783          | 0.815    | 0.2624     | 1.3113 | 0.815    | 0.8058   | 0.1476 | 0.0458 |
| 0.1688        | 85.0  | 595  | 0.4785          | 0.815    | 0.2625     | 1.3169 | 0.815    | 0.8058   | 0.1500 | 0.0457 |
| 0.1688        | 86.0  | 602  | 0.4782          | 0.815    | 0.2625     | 1.3127 | 0.815    | 0.8058   | 0.1496 | 0.0457 |
| 0.1688        | 87.0  | 609  | 0.4778          | 0.815    | 0.2623     | 1.3119 | 0.815    | 0.8058   | 0.1496 | 0.0456 |
| 0.1688        | 88.0  | 616  | 0.4783          | 0.815    | 0.2625     | 1.3118 | 0.815    | 0.8058   | 0.1529 | 0.0458 |
| 0.1688        | 89.0  | 623  | 0.4784          | 0.815    | 0.2625     | 1.3149 | 0.815    | 0.8058   | 0.1485 | 0.0457 |
| 0.1688        | 90.0  | 630  | 0.4781          | 0.815    | 0.2624     | 1.3137 | 0.815    | 0.8058   | 0.1472 | 0.0457 |
| 0.1688        | 91.0  | 637  | 0.4784          | 0.815    | 0.2626     | 1.3111 | 0.815    | 0.8058   | 0.1492 | 0.0458 |
| 0.1688        | 92.0  | 644  | 0.4785          | 0.815    | 0.2625     | 1.3177 | 0.815    | 0.8058   | 0.1485 | 0.0457 |
| 0.1688        | 93.0  | 651  | 0.4790          | 0.815    | 0.2626     | 1.3208 | 0.815    | 0.8058   | 0.1462 | 0.0457 |
| 0.1688        | 94.0  | 658  | 0.4788          | 0.815    | 0.2625     | 1.3178 | 0.815    | 0.8058   | 0.1396 | 0.0458 |
| 0.1688        | 95.0  | 665  | 0.4785          | 0.815    | 0.2625     | 1.3203 | 0.815    | 0.8058   | 0.1484 | 0.0457 |
| 0.1688        | 96.0  | 672  | 0.4786          | 0.815    | 0.2625     | 1.3168 | 0.815    | 0.8058   | 0.1470 | 0.0457 |
| 0.1688        | 97.0  | 679  | 0.4786          | 0.815    | 0.2625     | 1.3167 | 0.815    | 0.8058   | 0.1470 | 0.0457 |
| 0.1688        | 98.0  | 686  | 0.4787          | 0.815    | 0.2625     | 1.3192 | 0.815    | 0.8058   | 0.1408 | 0.0457 |
| 0.1688        | 99.0  | 693  | 0.4787          | 0.815    | 0.2625     | 1.3205 | 0.815    | 0.8058   | 0.1408 | 0.0457 |
| 0.1688        | 100.0 | 700  | 0.4787          | 0.815    | 0.2625     | 1.3204 | 0.815    | 0.8058   | 0.1408 | 0.0457 |

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.2.0.dev20231112+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1