# dit-base_tobacco-small_tobacco3482_hint
This model is a fine-tuned version of WinKawaks/vit-small-patch16-224 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how the calibration metrics can be computed follows the list):
- Loss: 1.9099
- Accuracy: 0.85
- Brier Loss: 0.2772
- NLL: 1.4757
- F1 Micro: 0.85
- F1 Macro: 0.8366
- ECE: 0.1392
- AURC: 0.0460
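
The Brier loss, NLL, ECE, and AURC reported above are calibration and uncertainty metrics computed on the evaluation set. The exact evaluation code is not included in this card; the snippet below is only a minimal sketch, assuming standard definitions, of how the Brier loss and ECE could be computed from softmax probabilities and integer labels.

```python
# Illustrative only: standard Brier loss and ECE from softmax outputs.
# The actual evaluation code for this card is not provided.
import numpy as np

def brier_loss(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean squared error between softmax probabilities and one-hot labels."""
    one_hot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """ECE with equal-width confidence bins."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            # Weight each bin's |accuracy - confidence| gap by its sample fraction.
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return float(ece)
```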
## Model description
More information needed
## Intended uses & limitations
More information needed
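
No usage example is provided in the card. The snippet below is a minimal inference sketch, assuming the checkpoint loads with the standard `transformers` image-classification classes; `document.png` is a hypothetical input file, not something referenced by the card.

```python
# Minimal inference sketch (assumptions noted above).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "jordyvl/dit-base_tobacco-small_tobacco3482_hint"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

probs = logits.softmax(dim=-1)
predicted_label = model.config.id2label[int(probs.argmax(dim=-1))]
print(predicted_label, float(probs.max()))
```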
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
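
The training script itself is not part of this card. The snippet below is a hedged sketch of how the hyperparameters above map onto `transformers.TrainingArguments`; the output directory name is an assumption, and the Adam betas/epsilon are simply the library defaults, which match the values listed.

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-base_tobacco-small_tobacco3482_hint",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    # Adam settings below are the TrainingArguments defaults and match the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```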
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 50 | 3.2233 | 0.44 | 0.7001 | 2.8339 | 0.44 | 0.3067 | 0.2724 | 0.3661 |
No log | 2.0 | 100 | 2.3954 | 0.705 | 0.4016 | 1.5814 | 0.705 | 0.6657 | 0.2046 | 0.1093 |
No log | 3.0 | 150 | 2.1938 | 0.735 | 0.3560 | 1.5685 | 0.735 | 0.7026 | 0.1879 | 0.0858 |
No log | 4.0 | 200 | 2.0989 | 0.74 | 0.3533 | 1.5416 | 0.74 | 0.7058 | 0.2015 | 0.0896 |
No log | 5.0 | 250 | 2.0203 | 0.795 | 0.3169 | 1.5407 | 0.795 | 0.7861 | 0.1773 | 0.0919 |
No log | 6.0 | 300 | 2.1849 | 0.675 | 0.4531 | 1.6333 | 0.675 | 0.6701 | 0.2207 | 0.1166 |
No log | 7.0 | 350 | 2.2223 | 0.745 | 0.4113 | 1.4333 | 0.745 | 0.7293 | 0.2045 | 0.0980 |
No log | 8.0 | 400 | 2.1696 | 0.715 | 0.4221 | 1.6537 | 0.715 | 0.6723 | 0.2069 | 0.1040 |
No log | 9.0 | 450 | 2.4443 | 0.735 | 0.4291 | 1.5392 | 0.735 | 0.7458 | 0.2236 | 0.1323 |
1.8536 | 10.0 | 500 | 2.0474 | 0.775 | 0.3649 | 1.6156 | 0.775 | 0.7528 | 0.1915 | 0.0844 |
1.8536 | 11.0 | 550 | 2.0046 | 0.81 | 0.3170 | 1.6225 | 0.81 | 0.7920 | 0.1547 | 0.0639 |
1.8536 | 12.0 | 600 | 2.4864 | 0.725 | 0.4602 | 1.5678 | 0.7250 | 0.7308 | 0.2415 | 0.1218 |
1.8536 | 13.0 | 650 | 1.8413 | 0.83 | 0.2698 | 1.6361 | 0.83 | 0.8117 | 0.1349 | 0.0674 |
1.8536 | 14.0 | 700 | 2.1304 | 0.815 | 0.3281 | 1.5685 | 0.815 | 0.7936 | 0.1715 | 0.0703 |
1.8536 | 15.0 | 750 | 2.5075 | 0.71 | 0.4652 | 1.9297 | 0.7100 | 0.6877 | 0.2281 | 0.1099 |
1.8536 | 16.0 | 800 | 2.4854 | 0.73 | 0.4462 | 1.5241 | 0.7300 | 0.7176 | 0.2282 | 0.1097 |
1.8536 | 17.0 | 850 | 2.1252 | 0.805 | 0.3210 | 1.5685 | 0.805 | 0.7907 | 0.1650 | 0.0804 |
1.8536 | 18.0 | 900 | 1.9249 | 0.86 | 0.2473 | 1.7031 | 0.8600 | 0.8689 | 0.1244 | 0.0528 |
1.8536 | 19.0 | 950 | 2.0943 | 0.835 | 0.2840 | 1.4696 | 0.835 | 0.8267 | 0.1439 | 0.0652 |
1.0941 | 20.0 | 1000 | 1.8548 | 0.845 | 0.2566 | 1.3059 | 0.845 | 0.8403 | 0.1333 | 0.0558 |
1.0941 | 21.0 | 1050 | 2.1487 | 0.805 | 0.3362 | 1.4556 | 0.805 | 0.8051 | 0.1665 | 0.0764 |
1.0941 | 22.0 | 1100 | 2.2147 | 0.81 | 0.3149 | 1.4884 | 0.81 | 0.8081 | 0.1710 | 0.0984 |
1.0941 | 23.0 | 1150 | 2.1111 | 0.84 | 0.2898 | 1.5426 | 0.8400 | 0.8410 | 0.1489 | 0.0848 |
1.0941 | 24.0 | 1200 | 2.2432 | 0.85 | 0.2884 | 1.7273 | 0.85 | 0.8482 | 0.1532 | 0.0765 |
1.0941 | 25.0 | 1250 | 2.3105 | 0.75 | 0.4190 | 1.4648 | 0.75 | 0.7396 | 0.2177 | 0.1074 |
1.0941 | 26.0 | 1300 | 2.0587 | 0.795 | 0.3444 | 1.6181 | 0.795 | 0.7960 | 0.1641 | 0.0799 |
1.0941 | 27.0 | 1350 | 2.4465 | 0.8 | 0.3517 | 2.0076 | 0.8000 | 0.7770 | 0.1731 | 0.0849 |
1.0941 | 28.0 | 1400 | 2.1351 | 0.825 | 0.3132 | 1.5650 | 0.825 | 0.8315 | 0.1631 | 0.0553 |
1.0941 | 29.0 | 1450 | 1.9746 | 0.86 | 0.2451 | 1.5908 | 0.8600 | 0.8374 | 0.1267 | 0.0537 |
0.9575 | 30.0 | 1500 | 2.0257 | 0.855 | 0.2737 | 1.6541 | 0.855 | 0.8121 | 0.1352 | 0.0480 |
0.9575 | 31.0 | 1550 | 1.9631 | 0.84 | 0.3037 | 1.7341 | 0.8400 | 0.8201 | 0.1515 | 0.0423 |
0.9575 | 32.0 | 1600 | 2.4215 | 0.785 | 0.3909 | 1.4042 | 0.785 | 0.7740 | 0.2018 | 0.0708 |
0.9575 | 33.0 | 1650 | 2.2159 | 0.795 | 0.3492 | 1.7639 | 0.795 | 0.7716 | 0.1721 | 0.0537 |
0.9575 | 34.0 | 1700 | 2.3363 | 0.82 | 0.3132 | 1.9858 | 0.82 | 0.7993 | 0.1610 | 0.0845 |
0.9575 | 35.0 | 1750 | 2.2187 | 0.84 | 0.2884 | 1.5376 | 0.8400 | 0.8182 | 0.1523 | 0.0803 |
0.9575 | 36.0 | 1800 | 2.3407 | 0.825 | 0.3206 | 1.8292 | 0.825 | 0.8028 | 0.1588 | 0.0719 |
0.9575 | 37.0 | 1850 | 2.4302 | 0.815 | 0.3353 | 1.7611 | 0.815 | 0.8091 | 0.1654 | 0.0920 |
0.9575 | 38.0 | 1900 | 2.3307 | 0.815 | 0.3269 | 1.8263 | 0.815 | 0.8043 | 0.1675 | 0.0876 |
0.9575 | 39.0 | 1950 | 2.2905 | 0.825 | 0.3217 | 1.7612 | 0.825 | 0.8116 | 0.1639 | 0.0841 |
0.8923 | 40.0 | 2000 | 2.2699 | 0.83 | 0.3225 | 1.7537 | 0.83 | 0.8186 | 0.1655 | 0.0792 |
0.8923 | 41.0 | 2050 | 2.2327 | 0.83 | 0.3179 | 1.7534 | 0.83 | 0.8186 | 0.1559 | 0.0764 |
0.8923 | 42.0 | 2100 | 2.2852 | 0.825 | 0.3230 | 1.6737 | 0.825 | 0.8150 | 0.1611 | 0.0760 |
0.8923 | 43.0 | 2150 | 2.2597 | 0.825 | 0.3221 | 1.6727 | 0.825 | 0.8147 | 0.1610 | 0.0734 |
0.8923 | 44.0 | 2200 | 2.2492 | 0.83 | 0.3176 | 1.6692 | 0.83 | 0.8169 | 0.1619 | 0.0720 |
0.8923 | 45.0 | 2250 | 2.2208 | 0.825 | 0.3182 | 1.6737 | 0.825 | 0.8124 | 0.1627 | 0.0707 |
0.8923 | 46.0 | 2300 | 2.2192 | 0.825 | 0.3209 | 1.6771 | 0.825 | 0.8121 | 0.1650 | 0.0712 |
0.8923 | 47.0 | 2350 | 2.2127 | 0.825 | 0.3198 | 1.6187 | 0.825 | 0.8124 | 0.1636 | 0.0684 |
0.8923 | 48.0 | 2400 | 2.2079 | 0.825 | 0.3208 | 1.6760 | 0.825 | 0.8121 | 0.1632 | 0.0707 |
0.8923 | 49.0 | 2450 | 2.1995 | 0.825 | 0.3187 | 1.5377 | 0.825 | 0.8124 | 0.1656 | 0.0702 |
0.8511 | 50.0 | 2500 | 2.1877 | 0.825 | 0.3158 | 1.6098 | 0.825 | 0.8124 | 0.1600 | 0.0690 |
0.8511 | 51.0 | 2550 | 2.1698 | 0.825 | 0.3167 | 1.5353 | 0.825 | 0.8124 | 0.1607 | 0.0695 |
0.8511 | 52.0 | 2600 | 2.1667 | 0.825 | 0.3133 | 1.5303 | 0.825 | 0.8121 | 0.1596 | 0.0680 |
0.8511 | 53.0 | 2650 | 2.1791 | 0.83 | 0.3170 | 1.5332 | 0.83 | 0.8149 | 0.1608 | 0.0690 |
0.8511 | 54.0 | 2700 | 2.1621 | 0.83 | 0.3148 | 1.5274 | 0.83 | 0.8146 | 0.1551 | 0.0693 |
0.8511 | 55.0 | 2750 | 2.1572 | 0.83 | 0.3119 | 1.5318 | 0.83 | 0.8149 | 0.1532 | 0.0680 |
0.8511 | 56.0 | 2800 | 2.1587 | 0.83 | 0.3100 | 1.5232 | 0.83 | 0.8148 | 0.1524 | 0.0712 |
0.8511 | 57.0 | 2850 | 2.1596 | 0.83 | 0.3101 | 1.5234 | 0.83 | 0.8146 | 0.1560 | 0.0696 |
0.8511 | 58.0 | 2900 | 2.1048 | 0.835 | 0.3047 | 1.5231 | 0.835 | 0.8189 | 0.1442 | 0.0676 |
0.8511 | 59.0 | 2950 | 2.4279 | 0.76 | 0.4096 | 1.4535 | 0.76 | 0.7538 | 0.2078 | 0.0731 |
0.8335 | 60.0 | 3000 | 2.2098 | 0.775 | 0.4036 | 1.4180 | 0.775 | 0.7565 | 0.2010 | 0.0870 |
0.8335 | 61.0 | 3050 | 2.0122 | 0.85 | 0.2596 | 1.5903 | 0.85 | 0.8272 | 0.1349 | 0.0779 |
0.8335 | 62.0 | 3100 | 2.2465 | 0.815 | 0.3311 | 1.6852 | 0.815 | 0.7899 | 0.1672 | 0.0658 |
0.8335 | 63.0 | 3150 | 2.1239 | 0.84 | 0.2963 | 1.6390 | 0.8400 | 0.8305 | 0.1458 | 0.0878 |
0.8335 | 64.0 | 3200 | 2.1931 | 0.82 | 0.3181 | 1.7037 | 0.82 | 0.8199 | 0.1654 | 0.0719 |
0.8335 | 65.0 | 3250 | 1.8262 | 0.855 | 0.2493 | 1.4845 | 0.855 | 0.8335 | 0.1297 | 0.0456 |
0.8335 | 66.0 | 3300 | 1.9467 | 0.845 | 0.2657 | 1.4217 | 0.845 | 0.8326 | 0.1361 | 0.0498 |
0.8335 | 67.0 | 3350 | 1.9371 | 0.85 | 0.2680 | 1.4175 | 0.85 | 0.8405 | 0.1293 | 0.0506 |
0.8335 | 68.0 | 3400 | 1.9172 | 0.85 | 0.2656 | 1.4203 | 0.85 | 0.8405 | 0.1331 | 0.0503 |
0.8335 | 69.0 | 3450 | 1.8872 | 0.845 | 0.2664 | 1.4327 | 0.845 | 0.8324 | 0.1360 | 0.0493 |
0.8281 | 70.0 | 3500 | 1.9045 | 0.845 | 0.2715 | 1.4920 | 0.845 | 0.8324 | 0.1377 | 0.0496 |
0.8281 | 71.0 | 3550 | 1.8954 | 0.845 | 0.2684 | 1.4919 | 0.845 | 0.8338 | 0.1385 | 0.0499 |
0.8281 | 72.0 | 3600 | 1.9222 | 0.85 | 0.2698 | 1.4870 | 0.85 | 0.8375 | 0.1356 | 0.0499 |
0.8281 | 73.0 | 3650 | 1.9004 | 0.845 | 0.2691 | 1.4912 | 0.845 | 0.8335 | 0.1377 | 0.0484 |
0.8281 | 74.0 | 3700 | 1.9168 | 0.85 | 0.2693 | 1.4903 | 0.85 | 0.8375 | 0.1338 | 0.0495 |
0.8281 | 75.0 | 3750 | 1.8970 | 0.85 | 0.2700 | 1.4908 | 0.85 | 0.8366 | 0.1416 | 0.0477 |
0.8281 | 76.0 | 3800 | 1.9089 | 0.85 | 0.2705 | 1.4867 | 0.85 | 0.8366 | 0.1373 | 0.0480 |
0.8281 | 77.0 | 3850 | 1.8902 | 0.85 | 0.2697 | 1.4896 | 0.85 | 0.8366 | 0.1407 | 0.0464 |
0.8281 | 78.0 | 3900 | 1.8889 | 0.85 | 0.2710 | 1.4882 | 0.85 | 0.8366 | 0.1421 | 0.0472 |
0.8281 | 79.0 | 3950 | 1.9080 | 0.85 | 0.2712 | 1.4876 | 0.85 | 0.8366 | 0.1345 | 0.0476 |
0.8047 | 80.0 | 4000 | 1.9011 | 0.85 | 0.2703 | 1.4864 | 0.85 | 0.8366 | 0.1373 | 0.0472 |
0.8047 | 81.0 | 4050 | 1.9112 | 0.85 | 0.2735 | 1.4867 | 0.85 | 0.8366 | 0.1379 | 0.0465 |
0.8047 | 82.0 | 4100 | 1.8850 | 0.85 | 0.2728 | 1.4872 | 0.85 | 0.8366 | 0.1419 | 0.0462 |
0.8047 | 83.0 | 4150 | 1.9074 | 0.85 | 0.2740 | 1.4862 | 0.85 | 0.8366 | 0.1369 | 0.0463 |
0.8047 | 84.0 | 4200 | 1.8804 | 0.85 | 0.2714 | 1.4818 | 0.85 | 0.8366 | 0.1376 | 0.0461 |
0.8047 | 85.0 | 4250 | 1.9092 | 0.85 | 0.2757 | 1.4825 | 0.85 | 0.8366 | 0.1437 | 0.0463 |
0.8047 | 86.0 | 4300 | 1.8985 | 0.85 | 0.2745 | 1.4827 | 0.85 | 0.8366 | 0.1390 | 0.0460 |
0.8047 | 87.0 | 4350 | 1.9091 | 0.85 | 0.2731 | 1.4808 | 0.85 | 0.8366 | 0.1403 | 0.0466 |
0.8047 | 88.0 | 4400 | 1.9037 | 0.85 | 0.2754 | 1.4836 | 0.85 | 0.8366 | 0.1383 | 0.0459 |
0.8047 | 89.0 | 4450 | 1.8950 | 0.85 | 0.2750 | 1.4798 | 0.85 | 0.8366 | 0.1386 | 0.0452 |
0.7971 | 90.0 | 4500 | 1.9115 | 0.85 | 0.2755 | 1.4785 | 0.85 | 0.8366 | 0.1387 | 0.0461 |
0.7971 | 91.0 | 4550 | 1.9061 | 0.85 | 0.2757 | 1.4791 | 0.85 | 0.8366 | 0.1451 | 0.0460 |
0.7971 | 92.0 | 4600 | 1.9058 | 0.85 | 0.2757 | 1.4785 | 0.85 | 0.8366 | 0.1392 | 0.0464 |
0.7971 | 93.0 | 4650 | 1.9128 | 0.85 | 0.2724 | 1.4769 | 0.85 | 0.8366 | 0.1341 | 0.0468 |
0.7971 | 94.0 | 4700 | 1.9115 | 0.85 | 0.2770 | 1.4771 | 0.85 | 0.8366 | 0.1388 | 0.0463 |
0.7971 | 95.0 | 4750 | 1.9097 | 0.85 | 0.2761 | 1.4761 | 0.85 | 0.8366 | 0.1382 | 0.0462 |
0.7971 | 96.0 | 4800 | 1.9025 | 0.85 | 0.2761 | 1.4759 | 0.85 | 0.8366 | 0.1385 | 0.0460 |
0.7971 | 97.0 | 4850 | 1.9153 | 0.85 | 0.2775 | 1.4757 | 0.85 | 0.8366 | 0.1394 | 0.0463 |
0.7971 | 98.0 | 4900 | 1.9084 | 0.85 | 0.2765 | 1.4755 | 0.85 | 0.8366 | 0.1388 | 0.0460 |
0.7971 | 99.0 | 4950 | 1.9087 | 0.85 | 0.2772 | 1.4757 | 0.85 | 0.8366 | 0.1392 | 0.0460 |
0.7931 | 100.0 | 5000 | 1.9099 | 0.85 | 0.2772 | 1.4757 | 0.85 | 0.8366 | 0.1392 | 0.0460 |
### Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.2.0.dev20231112+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1