
dit-base_tobacco-tiny_tobacco3482_kd_MSE

This model is a fine-tuned version of WinKawaks/vit-tiny-patch16-224 on the Tobacco3482 dataset (per the model name). It achieves the following results on the evaluation set:

  • Loss: 1.0108
  • Accuracy: 0.815
  • Brier Loss: 0.2593
  • NLL: 1.1011
  • F1 Micro: 0.815
  • F1 Macro: 0.8014
  • ECE: 0.1462
  • AURC: 0.0442
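
For reference, below is a minimal sketch of loading this checkpoint for document-image classification with the transformers Auto classes. The repo id (taken from this card) and the input path are assumptions for illustration, and the preprocessing may differ from what was used in training.

```python
# Hedged sketch: load the checkpoint and classify one document image.
# The repo id and "document.png" are assumptions, not verified usage.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "jordyvl/dit-base_tobacco-tiny_tobacco3482_kd_MSE"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label.get(pred, pred))
```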

Model description

The model name suggests a ViT-tiny student (WinKawaks/vit-tiny-patch16-224) trained with MSE-based knowledge distillation from a DiT-base teacher on Tobacco3482 document images. Beyond what the name implies, more information is needed.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure
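
The card does not document the distillation setup, but the `kd_MSE` suffix suggests knowledge distillation with a mean-squared-error objective on the logits. The sketch below shows one plausible form of such a loss; the blending weight `alpha` and all names are hypothetical, not taken from this card.

```python
# Hedged sketch of an MSE-based knowledge-distillation objective;
# `alpha` and the function name are assumptions for illustration.
import torch
import torch.nn.functional as F

def kd_mse_loss(student_logits: torch.Tensor,
                teacher_logits: torch.Tensor,
                labels: torch.Tensor,
                alpha: float = 0.5) -> torch.Tensor:
    """Blend hard-label cross-entropy with MSE to the teacher's logits."""
    ce = F.cross_entropy(student_logits, labels)
    mse = F.mse_loss(student_logits, teacher_logits.detach())
    return alpha * ce + (1.0 - alpha) * mse
```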

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
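
Expressed as transformers TrainingArguments, the listed settings would look roughly like the sketch below; output_dir is a placeholder, and the Adam betas/epsilon above match the Trainer defaults, so they need no explicit arguments.

```python
# Sketch of the hyperparameters above as TrainingArguments; output_dir is a
# placeholder. Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="dit-base_tobacco-tiny_tobacco3482_kd_MSE",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```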

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 7.1562 | 0.195 | 0.9158 | 7.6908 | 0.195 | 0.1043 | 0.2991 | 0.7927 |
| No log | 2.0 | 14 | 6.3485 | 0.245 | 0.8600 | 4.2261 | 0.245 | 0.1794 | 0.2960 | 0.7445 |
| No log | 3.0 | 21 | 5.2488 | 0.41 | 0.7090 | 3.6293 | 0.41 | 0.3494 | 0.2826 | 0.3786 |
| No log | 4.0 | 28 | 4.0484 | 0.565 | 0.5698 | 1.7703 | 0.565 | 0.5209 | 0.2622 | 0.2372 |
| No log | 5.0 | 35 | 3.0368 | 0.655 | 0.4710 | 1.5921 | 0.655 | 0.6368 | 0.2222 | 0.1598 |
| No log | 6.0 | 42 | 2.6191 | 0.695 | 0.4219 | 1.6919 | 0.695 | 0.6535 | 0.2041 | 0.1200 |
| No log | 7.0 | 49 | 2.0941 | 0.725 | 0.3913 | 1.3852 | 0.7250 | 0.6844 | 0.2046 | 0.0966 |
| No log | 8.0 | 56 | 2.0668 | 0.725 | 0.4119 | 1.3829 | 0.7250 | 0.6811 | 0.1890 | 0.1045 |
| No log | 9.0 | 63 | 1.7456 | 0.79 | 0.3138 | 1.5258 | 0.79 | 0.7539 | 0.1521 | 0.0651 |
| No log | 10.0 | 70 | 1.5815 | 0.77 | 0.3391 | 1.2461 | 0.7700 | 0.7323 | 0.1593 | 0.0725 |
| No log | 11.0 | 77 | 1.5720 | 0.785 | 0.2895 | 1.3282 | 0.785 | 0.7659 | 0.1408 | 0.0522 |
| No log | 12.0 | 84 | 1.8886 | 0.78 | 0.3692 | 1.6238 | 0.78 | 0.7717 | 0.2015 | 0.0917 |
| No log | 13.0 | 91 | 1.6164 | 0.785 | 0.2918 | 1.6303 | 0.785 | 0.7925 | 0.1545 | 0.0564 |
| No log | 14.0 | 98 | 1.4318 | 0.785 | 0.3220 | 1.3070 | 0.785 | 0.7606 | 0.1430 | 0.0639 |
| No log | 15.0 | 105 | 1.2774 | 0.81 | 0.2807 | 1.2877 | 0.81 | 0.7939 | 0.1532 | 0.0595 |
| No log | 16.0 | 112 | 1.3797 | 0.8 | 0.2993 | 1.2409 | 0.8000 | 0.7759 | 0.1565 | 0.0700 |
| No log | 17.0 | 119 | 1.3629 | 0.795 | 0.3091 | 1.1781 | 0.795 | 0.7670 | 0.1712 | 0.0567 |
| No log | 18.0 | 126 | 1.5101 | 0.8 | 0.3192 | 1.3586 | 0.8000 | 0.7878 | 0.1919 | 0.0665 |
| No log | 19.0 | 133 | 1.3897 | 0.805 | 0.2857 | 1.4983 | 0.805 | 0.7851 | 0.1356 | 0.0516 |
| No log | 20.0 | 140 | 1.3821 | 0.795 | 0.3204 | 1.0916 | 0.795 | 0.7745 | 0.1678 | 0.0651 |
| No log | 21.0 | 147 | 1.2852 | 0.83 | 0.2621 | 1.5182 | 0.83 | 0.8246 | 0.1483 | 0.0486 |
| No log | 22.0 | 154 | 1.2080 | 0.815 | 0.2744 | 1.1921 | 0.815 | 0.7957 | 0.1319 | 0.0500 |
| No log | 23.0 | 161 | 1.4016 | 0.805 | 0.3165 | 1.3364 | 0.805 | 0.7844 | 0.1534 | 0.0624 |
| No log | 24.0 | 168 | 1.2883 | 0.825 | 0.2592 | 1.4946 | 0.825 | 0.8119 | 0.1549 | 0.0481 |
| No log | 25.0 | 175 | 1.1715 | 0.815 | 0.2676 | 1.3363 | 0.815 | 0.8054 | 0.1464 | 0.0494 |
| No log | 26.0 | 182 | 1.1844 | 0.825 | 0.2585 | 1.4938 | 0.825 | 0.8045 | 0.1572 | 0.0469 |
| No log | 27.0 | 189 | 1.1739 | 0.81 | 0.2959 | 1.0692 | 0.81 | 0.7909 | 0.1625 | 0.0550 |
| No log | 28.0 | 196 | 1.1944 | 0.815 | 0.2891 | 1.1811 | 0.815 | 0.7971 | 0.1430 | 0.0572 |
| No log | 29.0 | 203 | 1.2115 | 0.83 | 0.2597 | 1.4809 | 0.83 | 0.8101 | 0.1289 | 0.0469 |
| No log | 30.0 | 210 | 1.1622 | 0.81 | 0.2825 | 1.1104 | 0.81 | 0.7931 | 0.1463 | 0.0511 |
| No log | 31.0 | 217 | 1.2591 | 0.8 | 0.3096 | 1.2310 | 0.8000 | 0.7789 | 0.1719 | 0.0591 |
| No log | 32.0 | 224 | 1.1752 | 0.82 | 0.2687 | 1.4091 | 0.82 | 0.7959 | 0.1581 | 0.0504 |
| No log | 33.0 | 231 | 1.1114 | 0.815 | 0.2719 | 1.0945 | 0.815 | 0.7885 | 0.1492 | 0.0485 |
| No log | 34.0 | 238 | 1.1105 | 0.815 | 0.2727 | 1.1239 | 0.815 | 0.7962 | 0.1300 | 0.0479 |
| No log | 35.0 | 245 | 1.1662 | 0.825 | 0.2748 | 1.3396 | 0.825 | 0.8100 | 0.1571 | 0.0554 |
| No log | 36.0 | 252 | 1.1023 | 0.815 | 0.2757 | 1.1805 | 0.815 | 0.8031 | 0.1428 | 0.0504 |
| No log | 37.0 | 259 | 1.1060 | 0.84 | 0.2604 | 1.3305 | 0.8400 | 0.8319 | 0.1596 | 0.0487 |
| No log | 38.0 | 266 | 1.1123 | 0.81 | 0.2682 | 1.1122 | 0.81 | 0.7922 | 0.1310 | 0.0482 |
| No log | 39.0 | 273 | 1.0820 | 0.815 | 0.2669 | 1.1629 | 0.815 | 0.7955 | 0.1479 | 0.0490 |
| No log | 40.0 | 280 | 1.0972 | 0.805 | 0.2784 | 1.2442 | 0.805 | 0.7858 | 0.1576 | 0.0483 |
| No log | 41.0 | 287 | 1.0845 | 0.83 | 0.2705 | 1.1180 | 0.83 | 0.8221 | 0.1504 | 0.0468 |
| No log | 42.0 | 294 | 1.0769 | 0.82 | 0.2602 | 1.1173 | 0.82 | 0.8066 | 0.1458 | 0.0451 |
| No log | 43.0 | 301 | 1.1366 | 0.81 | 0.2939 | 1.0722 | 0.81 | 0.7958 | 0.1532 | 0.0526 |
| No log | 44.0 | 308 | 1.0716 | 0.82 | 0.2635 | 1.1839 | 0.82 | 0.8043 | 0.1403 | 0.0451 |
| No log | 45.0 | 315 | 1.0865 | 0.81 | 0.2770 | 1.3595 | 0.81 | 0.7929 | 0.1501 | 0.0528 |
| No log | 46.0 | 322 | 1.0768 | 0.82 | 0.2638 | 1.1161 | 0.82 | 0.8067 | 0.1462 | 0.0457 |
| No log | 47.0 | 329 | 1.0644 | 0.825 | 0.2552 | 1.2086 | 0.825 | 0.8098 | 0.1579 | 0.0439 |
| No log | 48.0 | 336 | 1.0511 | 0.815 | 0.2656 | 1.1019 | 0.815 | 0.8014 | 0.1518 | 0.0471 |
| No log | 49.0 | 343 | 1.0517 | 0.82 | 0.2717 | 1.0881 | 0.82 | 0.8044 | 0.1559 | 0.0473 |
| No log | 50.0 | 350 | 1.0824 | 0.81 | 0.2813 | 1.1022 | 0.81 | 0.7968 | 0.1538 | 0.0505 |
| No log | 51.0 | 357 | 1.1439 | 0.835 | 0.2634 | 1.3483 | 0.835 | 0.8206 | 0.1471 | 0.0496 |
| No log | 52.0 | 364 | 1.0444 | 0.83 | 0.2500 | 1.0999 | 0.83 | 0.8156 | 0.1310 | 0.0423 |
| No log | 53.0 | 371 | 1.0426 | 0.825 | 0.2644 | 1.1112 | 0.825 | 0.8053 | 0.1295 | 0.0474 |
| No log | 54.0 | 378 | 1.0341 | 0.825 | 0.2635 | 1.1053 | 0.825 | 0.8092 | 0.1467 | 0.0465 |
| No log | 55.0 | 385 | 1.0900 | 0.815 | 0.2762 | 1.1021 | 0.815 | 0.7990 | 0.1439 | 0.0480 |
| No log | 56.0 | 392 | 1.0423 | 0.845 | 0.2517 | 1.2594 | 0.845 | 0.8444 | 0.1497 | 0.0428 |
| No log | 57.0 | 399 | 1.0246 | 0.825 | 0.2634 | 1.0927 | 0.825 | 0.8130 | 0.1260 | 0.0454 |
| No log | 58.0 | 406 | 1.0365 | 0.835 | 0.2649 | 1.0825 | 0.835 | 0.8232 | 0.1291 | 0.0448 |
| No log | 59.0 | 413 | 1.0394 | 0.82 | 0.2668 | 1.0968 | 0.82 | 0.8045 | 0.1458 | 0.0460 |
| No log | 60.0 | 420 | 1.0261 | 0.815 | 0.2720 | 1.0883 | 0.815 | 0.8011 | 0.1409 | 0.0472 |
| No log | 61.0 | 427 | 1.0503 | 0.83 | 0.2543 | 1.3230 | 0.83 | 0.8132 | 0.1378 | 0.0455 |
| No log | 62.0 | 434 | 1.0400 | 0.82 | 0.2637 | 1.0958 | 0.82 | 0.8043 | 0.1397 | 0.0456 |
| No log | 63.0 | 441 | 1.0338 | 0.82 | 0.2629 | 1.0960 | 0.82 | 0.8042 | 0.1338 | 0.0435 |
| No log | 64.0 | 448 | 1.0373 | 0.84 | 0.2508 | 1.2817 | 0.8400 | 0.8260 | 0.1325 | 0.0433 |
| No log | 65.0 | 455 | 1.0266 | 0.83 | 0.2663 | 1.1057 | 0.83 | 0.8163 | 0.1383 | 0.0460 |
| No log | 66.0 | 462 | 1.0303 | 0.825 | 0.2549 | 1.1906 | 0.825 | 0.8098 | 0.1399 | 0.0450 |
| No log | 67.0 | 469 | 1.0224 | 0.82 | 0.2668 | 1.0920 | 0.82 | 0.8042 | 0.1252 | 0.0433 |
| No log | 68.0 | 476 | 1.0274 | 0.845 | 0.2526 | 1.1948 | 0.845 | 0.8368 | 0.1423 | 0.0442 |
| No log | 69.0 | 483 | 1.0145 | 0.82 | 0.2647 | 1.0884 | 0.82 | 0.8070 | 0.1345 | 0.0449 |
| No log | 70.0 | 490 | 1.0194 | 0.815 | 0.2606 | 1.1076 | 0.815 | 0.8014 | 0.1529 | 0.0446 |
| No log | 71.0 | 497 | 1.0153 | 0.825 | 0.2572 | 1.2484 | 0.825 | 0.8142 | 0.1425 | 0.0445 |
| 0.6377 | 72.0 | 504 | 1.0265 | 0.815 | 0.2607 | 1.1109 | 0.815 | 0.8039 | 0.1457 | 0.0445 |
| 0.6377 | 73.0 | 511 | 1.0081 | 0.82 | 0.2567 | 1.1031 | 0.82 | 0.8040 | 0.1321 | 0.0440 |
| 0.6377 | 74.0 | 518 | 1.0135 | 0.825 | 0.2600 | 1.1036 | 0.825 | 0.8074 | 0.1477 | 0.0450 |
| 0.6377 | 75.0 | 525 | 1.0053 | 0.82 | 0.2616 | 1.1012 | 0.82 | 0.8044 | 0.1542 | 0.0442 |
| 0.6377 | 76.0 | 532 | 1.0187 | 0.82 | 0.2598 | 1.1115 | 0.82 | 0.8069 | 0.1566 | 0.0445 |
| 0.6377 | 77.0 | 539 | 1.0127 | 0.82 | 0.2610 | 1.1024 | 0.82 | 0.8097 | 0.1489 | 0.0443 |
| 0.6377 | 78.0 | 546 | 1.0079 | 0.82 | 0.2581 | 1.1034 | 0.82 | 0.8069 | 0.1463 | 0.0434 |
| 0.6377 | 79.0 | 553 | 1.0097 | 0.815 | 0.2592 | 1.1030 | 0.815 | 0.8014 | 0.1478 | 0.0438 |
| 0.6377 | 80.0 | 560 | 1.0131 | 0.835 | 0.2556 | 1.1048 | 0.835 | 0.8281 | 0.1508 | 0.0441 |
| 0.6377 | 81.0 | 567 | 1.0183 | 0.82 | 0.2602 | 1.1057 | 0.82 | 0.8044 | 0.1417 | 0.0446 |
| 0.6377 | 82.0 | 574 | 1.0190 | 0.815 | 0.2665 | 1.0966 | 0.815 | 0.7987 | 0.1370 | 0.0462 |
| 0.6377 | 83.0 | 581 | 1.0117 | 0.815 | 0.2619 | 1.0974 | 0.815 | 0.8014 | 0.1614 | 0.0442 |
| 0.6377 | 84.0 | 588 | 1.0099 | 0.82 | 0.2557 | 1.1070 | 0.82 | 0.8044 | 0.1327 | 0.0436 |
| 0.6377 | 85.0 | 595 | 1.0088 | 0.82 | 0.2569 | 1.1037 | 0.82 | 0.8044 | 0.1446 | 0.0437 |
| 0.6377 | 86.0 | 602 | 1.0110 | 0.82 | 0.2596 | 1.0945 | 0.82 | 0.8043 | 0.1505 | 0.0442 |
| 0.6377 | 87.0 | 609 | 1.0151 | 0.815 | 0.2606 | 1.1046 | 0.815 | 0.8014 | 0.1416 | 0.0451 |
| 0.6377 | 88.0 | 616 | 1.0101 | 0.815 | 0.2587 | 1.1025 | 0.815 | 0.8014 | 0.1435 | 0.0440 |
| 0.6377 | 89.0 | 623 | 1.0106 | 0.815 | 0.2613 | 1.0976 | 0.815 | 0.8014 | 0.1489 | 0.0443 |
| 0.6377 | 90.0 | 630 | 1.0097 | 0.815 | 0.2590 | 1.0993 | 0.815 | 0.8014 | 0.1490 | 0.0439 |
| 0.6377 | 91.0 | 637 | 1.0098 | 0.815 | 0.2593 | 1.1024 | 0.815 | 0.8014 | 0.1510 | 0.0440 |
| 0.6377 | 92.0 | 644 | 1.0116 | 0.815 | 0.2600 | 1.1004 | 0.815 | 0.8014 | 0.1465 | 0.0442 |
| 0.6377 | 93.0 | 651 | 1.0107 | 0.815 | 0.2596 | 1.1005 | 0.815 | 0.8014 | 0.1548 | 0.0442 |
| 0.6377 | 94.0 | 658 | 1.0110 | 0.815 | 0.2599 | 1.0993 | 0.815 | 0.8014 | 0.1463 | 0.0440 |
| 0.6377 | 95.0 | 665 | 1.0106 | 0.815 | 0.2593 | 1.1011 | 0.815 | 0.8014 | 0.1409 | 0.0441 |
| 0.6377 | 96.0 | 672 | 1.0106 | 0.815 | 0.2596 | 1.1011 | 0.815 | 0.8014 | 0.1496 | 0.0442 |
| 0.6377 | 97.0 | 679 | 1.0109 | 0.815 | 0.2595 | 1.1007 | 0.815 | 0.8014 | 0.1462 | 0.0442 |
| 0.6377 | 98.0 | 686 | 1.0107 | 0.815 | 0.2593 | 1.1013 | 0.815 | 0.8014 | 0.1409 | 0.0441 |
| 0.6377 | 99.0 | 693 | 1.0107 | 0.815 | 0.2594 | 1.1009 | 0.815 | 0.8014 | 0.1462 | 0.0441 |
| 0.6377 | 100.0 | 700 | 1.0108 | 0.815 | 0.2593 | 1.1011 | 0.815 | 0.8014 | 0.1462 | 0.0442 |

Framework versions

  • Transformers 4.36.0.dev0
  • Pytorch 2.2.0.dev20231112+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1
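
When reproducing results, it helps to match these versions; note that the Transformers and PyTorch builds above are dev snapshots that may not be installable from PyPI. A quick environment check:

```python
# Compare the local environment against the versions reported above.
import transformers, torch, datasets, tokenizers

print(transformers.__version__)  # card reports 4.36.0.dev0
print(torch.__version__)         # card reports 2.2.0.dev20231112+cu118
print(datasets.__version__)      # card reports 2.14.5
print(tokenizers.__version__)    # card reports 0.14.1
```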