---
license: apache-2.0
base_model: WinKawaks/vit-small-patch16-224
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: dit-base_tobacco-small_tobacco3482_og_simkd
    results: []
---

# dit-base_tobacco-small_tobacco3482_og_simkd

This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unspecified dataset (the model name suggests Tobacco3482). It achieves the following results on the evaluation set (a sketch of how the calibration metrics can be computed follows the list):

- Loss: 309.8690
- Accuracy: 0.815
- Brier Loss: 0.3313
- NLL: 1.1190
- F1 Micro: 0.815
- F1 Macro: 0.7825
- ECE: 0.2569
- AURC: 0.0659
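
Brier loss, NLL, ECE, and AURC are calibration and selective-prediction metrics rather than standard `Trainer` outputs. As a rough illustration (not the exact code used for this model), here is a minimal NumPy sketch of the Brier score and ECE computed from softmax probabilities; the `probs`/`labels` inputs, the equal-width binning, and the normalization are assumptions:

```python
import numpy as np

def brier_and_ece(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10):
    """Multiclass Brier score and expected calibration error (ECE).

    probs:  (N, C) softmax probabilities; labels: (N,) integer classes.
    The trainer's exact normalization and binning may differ.
    """
    n, c = probs.shape
    onehot = np.eye(c)[labels]
    # Brier: mean squared distance between probability vector and one-hot target
    brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))

    conf = probs.max(axis=1)  # confidence of the predicted class
    correct = (probs.argmax(axis=1) == labels).astype(float)

    # ECE: bin predictions by confidence, weight |accuracy - confidence| by bin mass
    ece = 0.0
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return brier, ece
```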

## Model description

More information needed

## Intended uses & limitations

More information needed
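
Pending a fuller description, the checkpoint can be loaded like any `transformers` image-classification model. A minimal inference sketch, assuming the model lives on the Hub under `jordyvl/dit-base_tobacco-small_tobacco3482_og_simkd` (the repo id and input file are assumptions):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "jordyvl/dit-base_tobacco-small_tobacco3482_og_simkd"  # assumed Hub repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```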

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them follows the list):

- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
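
These settings map directly onto `transformers.TrainingArguments`; the Adam betas and epsilon above are the library defaults. A hedged reconstruction, not the actual training script (the `output_dir` is a placeholder, and the SimKD distillation logic itself is not reproduced here):

```python
from transformers import TrainingArguments

# Sketch of the arguments behind the list above.
training_args = TrainingArguments(
    output_dir="dit-base_tobacco-small_tobacco3482_og_simkd",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```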

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 50 | 328.2257 | 0.365 | 0.8441 | 5.5835 | 0.3650 | 0.2447 | 0.3371 | 0.4390 |
| No log | 2.0 | 100 | 325.5961 | 0.58 | 0.6442 | 1.8499 | 0.58 | 0.4727 | 0.3233 | 0.2414 |
| No log | 3.0 | 150 | 323.3813 | 0.66 | 0.4759 | 1.5815 | 0.66 | 0.5348 | 0.2704 | 0.1489 |
| No log | 4.0 | 200 | 322.5013 | 0.715 | 0.4234 | 1.6240 | 0.715 | 0.6009 | 0.2382 | 0.1142 |
| No log | 5.0 | 250 | 321.7315 | 0.755 | 0.3532 | 1.2868 | 0.755 | 0.6596 | 0.2141 | 0.0687 |
| No log | 6.0 | 300 | 320.5884 | 0.775 | 0.3668 | 1.4106 | 0.775 | 0.7233 | 0.2284 | 0.0922 |
| No log | 7.0 | 350 | 320.8456 | 0.775 | 0.3638 | 1.4833 | 0.775 | 0.7172 | 0.2487 | 0.0666 |
| No log | 8.0 | 400 | 319.6829 | 0.785 | 0.3308 | 1.3914 | 0.785 | 0.7203 | 0.1959 | 0.0674 |
| No log | 9.0 | 450 | 319.7741 | 0.815 | 0.3459 | 1.3920 | 0.815 | 0.7832 | 0.2541 | 0.0681 |
| 325.907 | 10.0 | 500 | 319.4605 | 0.775 | 0.3162 | 1.2997 | 0.775 | 0.6987 | 0.2140 | 0.0575 |
| 325.907 | 11.0 | 550 | 318.6996 | 0.81 | 0.3190 | 1.2271 | 0.81 | 0.7670 | 0.2110 | 0.0614 |
| 325.907 | 12.0 | 600 | 318.0233 | 0.81 | 0.3183 | 1.2432 | 0.81 | 0.7673 | 0.2134 | 0.0624 |
| 325.907 | 13.0 | 650 | 318.2606 | 0.79 | 0.3259 | 1.2187 | 0.79 | 0.7457 | 0.2299 | 0.0591 |
| 325.907 | 14.0 | 700 | 317.7428 | 0.83 | 0.3183 | 1.3279 | 0.83 | 0.8035 | 0.2449 | 0.0512 |
| 325.907 | 15.0 | 750 | 317.7053 | 0.81 | 0.3251 | 1.2097 | 0.81 | 0.7604 | 0.2193 | 0.0566 |
| 325.907 | 16.0 | 800 | 317.3470 | 0.84 | 0.3142 | 1.2606 | 0.8400 | 0.8132 | 0.2272 | 0.0484 |
| 325.907 | 17.0 | 850 | 316.8029 | 0.815 | 0.3202 | 1.1571 | 0.815 | 0.7748 | 0.2316 | 0.0563 |
| 325.907 | 18.0 | 900 | 316.9777 | 0.805 | 0.3442 | 1.1453 | 0.805 | 0.7645 | 0.2432 | 0.0625 |
| 325.907 | 19.0 | 950 | 316.2359 | 0.815 | 0.3219 | 1.1399 | 0.815 | 0.7717 | 0.2404 | 0.0603 |
| 320.1296 | 20.0 | 1000 | 316.1051 | 0.8 | 0.3220 | 1.1807 | 0.8000 | 0.7500 | 0.2412 | 0.0576 |
| 320.1296 | 21.0 | 1050 | 315.8117 | 0.845 | 0.3099 | 1.0976 | 0.845 | 0.8084 | 0.2530 | 0.0547 |
| 320.1296 | 22.0 | 1100 | 315.7457 | 0.82 | 0.3238 | 1.1904 | 0.82 | 0.7663 | 0.2507 | 0.0548 |
| 320.1296 | 23.0 | 1150 | 315.6591 | 0.82 | 0.3357 | 1.4044 | 0.82 | 0.7925 | 0.2639 | 0.0586 |
| 320.1296 | 24.0 | 1200 | 315.4048 | 0.82 | 0.3270 | 1.0817 | 0.82 | 0.7681 | 0.2575 | 0.0629 |
| 320.1296 | 25.0 | 1250 | 314.9790 | 0.81 | 0.3309 | 1.2002 | 0.81 | 0.7732 | 0.2279 | 0.0656 |
| 320.1296 | 26.0 | 1300 | 314.6778 | 0.79 | 0.3189 | 1.1219 | 0.79 | 0.7464 | 0.2014 | 0.0576 |
| 320.1296 | 27.0 | 1350 | 314.7844 | 0.8 | 0.3345 | 1.0655 | 0.8000 | 0.7555 | 0.2398 | 0.0661 |
| 320.1296 | 28.0 | 1400 | 314.4464 | 0.815 | 0.3175 | 1.1116 | 0.815 | 0.7636 | 0.2426 | 0.0532 |
| 320.1296 | 29.0 | 1450 | 314.3737 | 0.845 | 0.3271 | 1.1042 | 0.845 | 0.8072 | 0.2595 | 0.0531 |
| 317.1926 | 30.0 | 1500 | 313.9464 | 0.82 | 0.3225 | 1.1270 | 0.82 | 0.7841 | 0.2087 | 0.0609 |
| 317.1926 | 31.0 | 1550 | 314.0068 | 0.835 | 0.3187 | 1.1834 | 0.835 | 0.8070 | 0.2470 | 0.0522 |
| 317.1926 | 32.0 | 1600 | 313.8198 | 0.81 | 0.3271 | 1.0324 | 0.81 | 0.7642 | 0.2484 | 0.0609 |
| 317.1926 | 33.0 | 1650 | 313.7599 | 0.83 | 0.3193 | 1.0993 | 0.83 | 0.7910 | 0.2382 | 0.0536 |
| 317.1926 | 34.0 | 1700 | 313.4889 | 0.82 | 0.3224 | 1.1743 | 0.82 | 0.7823 | 0.2587 | 0.0546 |
| 317.1926 | 35.0 | 1750 | 313.2496 | 0.825 | 0.3324 | 1.1041 | 0.825 | 0.7988 | 0.2404 | 0.0652 |
| 317.1926 | 36.0 | 1800 | 313.1823 | 0.83 | 0.3207 | 1.0900 | 0.83 | 0.8007 | 0.2505 | 0.0581 |
| 317.1926 | 37.0 | 1850 | 313.1304 | 0.83 | 0.3367 | 1.2073 | 0.83 | 0.7973 | 0.2615 | 0.0571 |
| 317.1926 | 38.0 | 1900 | 313.2971 | 0.815 | 0.3398 | 1.1045 | 0.815 | 0.7709 | 0.2411 | 0.0641 |
| 317.1926 | 39.0 | 1950 | 313.0526 | 0.815 | 0.3352 | 1.1023 | 0.815 | 0.7744 | 0.2455 | 0.0616 |
| 315.1897 | 40.0 | 2000 | 312.7858 | 0.84 | 0.3231 | 1.0983 | 0.8400 | 0.8096 | 0.2619 | 0.0538 |
| 315.1897 | 41.0 | 2050 | 312.5119 | 0.815 | 0.3290 | 1.1174 | 0.815 | 0.7858 | 0.2540 | 0.0604 |
| 315.1897 | 42.0 | 2100 | 312.5961 | 0.82 | 0.3305 | 1.2144 | 0.82 | 0.7787 | 0.2480 | 0.0572 |
| 315.1897 | 43.0 | 2150 | 312.3510 | 0.825 | 0.3357 | 1.1367 | 0.825 | 0.7936 | 0.2398 | 0.0658 |
| 315.1897 | 44.0 | 2200 | 312.4015 | 0.81 | 0.3303 | 1.1015 | 0.81 | 0.7837 | 0.2488 | 0.0598 |
| 315.1897 | 45.0 | 2250 | 312.2003 | 0.825 | 0.3286 | 1.1810 | 0.825 | 0.7953 | 0.2480 | 0.0614 |
| 315.1897 | 46.0 | 2300 | 312.1683 | 0.825 | 0.3283 | 1.1112 | 0.825 | 0.7881 | 0.2414 | 0.0587 |
| 315.1897 | 47.0 | 2350 | 312.2554 | 0.815 | 0.3433 | 1.1313 | 0.815 | 0.7709 | 0.2579 | 0.0694 |
| 315.1897 | 48.0 | 2400 | 312.0919 | 0.825 | 0.3364 | 1.1074 | 0.825 | 0.7963 | 0.2471 | 0.0636 |
| 315.1897 | 49.0 | 2450 | 312.0760 | 0.82 | 0.3412 | 1.1076 | 0.82 | 0.7859 | 0.2554 | 0.0661 |
| 313.7276 | 50.0 | 2500 | 311.7450 | 0.83 | 0.3245 | 1.1723 | 0.83 | 0.7994 | 0.2512 | 0.0558 |
| 313.7276 | 51.0 | 2550 | 311.5801 | 0.835 | 0.3236 | 1.1056 | 0.835 | 0.7954 | 0.2581 | 0.0576 |
| 313.7276 | 52.0 | 2600 | 311.7016 | 0.83 | 0.3235 | 1.1182 | 0.83 | 0.7988 | 0.2462 | 0.0560 |
| 313.7276 | 53.0 | 2650 | 311.0808 | 0.81 | 0.3308 | 1.0526 | 0.81 | 0.7716 | 0.2401 | 0.0687 |
| 313.7276 | 54.0 | 2700 | 311.3835 | 0.81 | 0.3304 | 1.1210 | 0.81 | 0.7803 | 0.2442 | 0.0604 |
| 313.7276 | 55.0 | 2750 | 311.1007 | 0.825 | 0.3285 | 1.1265 | 0.825 | 0.7931 | 0.2569 | 0.0639 |
| 313.7276 | 56.0 | 2800 | 311.3446 | 0.81 | 0.3273 | 1.1896 | 0.81 | 0.7810 | 0.2342 | 0.0622 |
| 313.7276 | 57.0 | 2850 | 311.0753 | 0.825 | 0.3327 | 1.1225 | 0.825 | 0.7929 | 0.2659 | 0.0668 |
| 313.7276 | 58.0 | 2900 | 311.3600 | 0.825 | 0.3320 | 1.1142 | 0.825 | 0.8000 | 0.2524 | 0.0640 |
| 313.7276 | 59.0 | 2950 | 310.8636 | 0.83 | 0.3242 | 1.1157 | 0.83 | 0.8022 | 0.2416 | 0.0633 |
| 312.6368 | 60.0 | 3000 | 310.7809 | 0.815 | 0.3386 | 1.2166 | 0.815 | 0.7820 | 0.2571 | 0.0702 |
| 312.6368 | 61.0 | 3050 | 310.9625 | 0.825 | 0.3273 | 1.1168 | 0.825 | 0.7923 | 0.2362 | 0.0608 |
| 312.6368 | 62.0 | 3100 | 311.1122 | 0.81 | 0.3369 | 1.1021 | 0.81 | 0.7700 | 0.2433 | 0.0633 |
| 312.6368 | 63.0 | 3150 | 311.1530 | 0.82 | 0.3351 | 1.1108 | 0.82 | 0.7780 | 0.2584 | 0.0615 |
| 312.6368 | 64.0 | 3200 | 310.9366 | 0.8 | 0.3288 | 1.1112 | 0.8000 | 0.7616 | 0.2545 | 0.0609 |
| 312.6368 | 65.0 | 3250 | 310.7639 | 0.82 | 0.3379 | 1.0992 | 0.82 | 0.7898 | 0.2407 | 0.0710 |
| 312.6368 | 66.0 | 3300 | 310.5876 | 0.81 | 0.3287 | 1.1197 | 0.81 | 0.7763 | 0.2270 | 0.0654 |
| 312.6368 | 67.0 | 3350 | 310.7344 | 0.805 | 0.3387 | 1.1279 | 0.805 | 0.7646 | 0.2354 | 0.0679 |
| 312.6368 | 68.0 | 3400 | 310.2750 | 0.825 | 0.3323 | 1.1367 | 0.825 | 0.7971 | 0.2514 | 0.0673 |
| 312.6368 | 69.0 | 3450 | 310.5080 | 0.815 | 0.3298 | 1.1049 | 0.815 | 0.7845 | 0.2329 | 0.0664 |
| 311.7616 | 70.0 | 3500 | 310.6353 | 0.81 | 0.3305 | 1.1098 | 0.81 | 0.7745 | 0.2346 | 0.0633 |
| 311.7616 | 71.0 | 3550 | 310.3249 | 0.825 | 0.3286 | 1.1117 | 0.825 | 0.7951 | 0.2455 | 0.0641 |
| 311.7616 | 72.0 | 3600 | 310.5689 | 0.825 | 0.3248 | 1.1079 | 0.825 | 0.7911 | 0.2388 | 0.0586 |
| 311.7616 | 73.0 | 3650 | 310.4175 | 0.82 | 0.3298 | 1.1169 | 0.82 | 0.7859 | 0.2338 | 0.0630 |
| 311.7616 | 74.0 | 3700 | 310.1338 | 0.815 | 0.3313 | 1.1236 | 0.815 | 0.7902 | 0.2558 | 0.0677 |
| 311.7616 | 75.0 | 3750 | 310.4428 | 0.825 | 0.3310 | 1.1269 | 0.825 | 0.7972 | 0.2458 | 0.0606 |
| 311.7616 | 76.0 | 3800 | 310.3477 | 0.81 | 0.3317 | 1.1060 | 0.81 | 0.7775 | 0.2392 | 0.0654 |
| 311.7616 | 77.0 | 3850 | 310.2144 | 0.815 | 0.3294 | 1.1076 | 0.815 | 0.7857 | 0.2387 | 0.0627 |
| 311.7616 | 78.0 | 3900 | 310.1073 | 0.82 | 0.3296 | 1.1246 | 0.82 | 0.7993 | 0.2496 | 0.0634 |
| 311.7616 | 79.0 | 3950 | 310.1449 | 0.805 | 0.3246 | 1.1134 | 0.805 | 0.7734 | 0.2277 | 0.0627 |
| 311.1587 | 80.0 | 4000 | 310.1684 | 0.81 | 0.3327 | 1.1094 | 0.81 | 0.7781 | 0.2493 | 0.0660 |
| 311.1587 | 81.0 | 4050 | 310.1772 | 0.815 | 0.3311 | 1.1129 | 0.815 | 0.7876 | 0.2447 | 0.0668 |
| 311.1587 | 82.0 | 4100 | 309.9326 | 0.805 | 0.3295 | 1.1172 | 0.805 | 0.7716 | 0.2508 | 0.0666 |
| 311.1587 | 83.0 | 4150 | 310.1067 | 0.805 | 0.3330 | 1.1209 | 0.805 | 0.7756 | 0.2252 | 0.0653 |
| 311.1587 | 84.0 | 4200 | 309.9362 | 0.825 | 0.3288 | 1.1150 | 0.825 | 0.8024 | 0.2500 | 0.0637 |
| 311.1587 | 85.0 | 4250 | 309.6593 | 0.81 | 0.3292 | 1.1226 | 0.81 | 0.7723 | 0.2306 | 0.0680 |
| 311.1587 | 86.0 | 4300 | 309.9828 | 0.8 | 0.3310 | 1.1252 | 0.8000 | 0.7643 | 0.2474 | 0.0662 |
| 311.1587 | 87.0 | 4350 | 310.0325 | 0.825 | 0.3322 | 1.1136 | 0.825 | 0.7935 | 0.2633 | 0.0634 |
| 311.1587 | 88.0 | 4400 | 309.8688 | 0.815 | 0.3320 | 1.1145 | 0.815 | 0.7824 | 0.2478 | 0.0675 |
| 311.1587 | 89.0 | 4450 | 310.0577 | 0.81 | 0.3324 | 1.1160 | 0.81 | 0.7810 | 0.2475 | 0.0648 |
| 310.732 | 90.0 | 4500 | 309.8999 | 0.81 | 0.3273 | 1.1120 | 0.81 | 0.7720 | 0.2356 | 0.0624 |
| 310.732 | 91.0 | 4550 | 309.7399 | 0.815 | 0.3256 | 1.1164 | 0.815 | 0.7824 | 0.2502 | 0.0649 |
| 310.732 | 92.0 | 4600 | 309.9419 | 0.805 | 0.3287 | 1.1183 | 0.805 | 0.7751 | 0.2353 | 0.0640 |
| 310.732 | 93.0 | 4650 | 309.9055 | 0.81 | 0.3268 | 1.1194 | 0.81 | 0.7761 | 0.2429 | 0.0613 |
| 310.732 | 94.0 | 4700 | 309.7320 | 0.82 | 0.3275 | 1.1117 | 0.82 | 0.7914 | 0.2408 | 0.0654 |
| 310.732 | 95.0 | 4750 | 309.9635 | 0.81 | 0.3334 | 1.1067 | 0.81 | 0.7747 | 0.2317 | 0.0637 |
| 310.732 | 96.0 | 4800 | 309.9630 | 0.805 | 0.3304 | 1.1165 | 0.805 | 0.7712 | 0.2316 | 0.0631 |
| 310.732 | 97.0 | 4850 | 309.8564 | 0.815 | 0.3263 | 1.1130 | 0.815 | 0.7870 | 0.2355 | 0.0619 |
| 310.732 | 98.0 | 4900 | 309.7815 | 0.815 | 0.3298 | 1.1198 | 0.815 | 0.7857 | 0.2386 | 0.0634 |
| 310.732 | 99.0 | 4950 | 309.8337 | 0.81 | 0.3354 | 1.0806 | 0.81 | 0.7818 | 0.2480 | 0.0672 |
| 310.5225 | 100.0 | 5000 | 309.8690 | 0.815 | 0.3313 | 1.1190 | 0.815 | 0.7825 | 0.2569 | 0.0659 |

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.2.0.dev20231112+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1