---
library_name: transformers
license: other
base_model: apple/mobilevit-small
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: MobileViT_Food_80epoch
  results: []
---
# MobileViT_Food_80epoch

This model is a fine-tuned version of [apple/mobilevit-small](https://huggingface.co/apple/mobilevit-small) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.7769
- Accuracy: 0.8053
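
A minimal inference sketch using the `transformers` `pipeline` API. The repository id below is a placeholder — this card does not state where the checkpoint is hosted, so substitute the actual repo id or a local path:

```python
from transformers import pipeline

# "your-username/MobileViT_Food_80epoch" is a PLACEHOLDER repo id;
# replace it with the actual location of this checkpoint (or a local path).
def classify_food(image_path: str,
                  model_id: str = "your-username/MobileViT_Food_80epoch"):
    """Classify a single food image with the fine-tuned MobileViT model."""
    classifier = pipeline("image-classification", model=model_id)
    # Returns a list of {"label": ..., "score": ...} dicts, best first.
    return classifier(image_path)

# Example (requires the checkpoint to be downloadable):
# classify_food("dish.jpg")
```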
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 80
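
As a sanity check on these settings, the linear schedule with `lr_scheduler_warmup_ratio: 0.1` can be sketched in plain Python, mirroring the behavior of `get_linear_schedule_with_warmup` in `transformers` (the total step count of 94,640 is taken from the last row of the results table below):

```python
# Linear warmup/decay schedule implied by the hyperparameters above.
PEAK_LR = 5e-5
TOTAL_STEPS = 94640                    # final step in the results table
WARMUP_STEPS = int(TOTAL_STEPS * 0.1)  # warmup_ratio 0.1 -> 9464 steps

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer updates."""
    if step < WARMUP_STEPS:
        # Ramp linearly from 0 up to the peak over the warmup phase.
        return PEAK_LR * step / WARMUP_STEPS
    # Then decay linearly from the peak back to 0 at the final step.
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

print(lr_at(0), lr_at(WARMUP_STEPS), lr_at(TOTAL_STEPS))  # 0.0 5e-05 0.0
```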
### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|
4.5902 | 0.9994 | 1183 | 4.5818 | 0.0286 |
4.2708 | 1.9996 | 2367 | 4.2247 | 0.1690 |
3.7077 | 2.9998 | 3551 | 3.5174 | 0.2602 |
3.271 | 4.0 | 4735 | 2.9216 | 0.3432 |
2.8193 | 4.9994 | 5918 | 2.4241 | 0.4276 |
2.4733 | 5.9996 | 7102 | 2.0284 | 0.5017 |
2.1674 | 6.9998 | 8286 | 1.7180 | 0.5674 |
1.9884 | 8.0 | 9470 | 1.5144 | 0.6122 |
1.7582 | 8.9994 | 10653 | 1.3711 | 0.6450 |
1.4781 | 9.9996 | 11837 | 1.2530 | 0.6689 |
1.6275 | 10.9998 | 13021 | 1.1598 | 0.6924 |
1.5292 | 12.0 | 14205 | 1.1260 | 0.7046 |
1.3675 | 12.9994 | 15388 | 1.0912 | 0.7122 |
1.3782 | 13.9996 | 16572 | 1.0276 | 0.7255 |
1.3084 | 14.9998 | 17756 | 1.0042 | 0.7345 |
1.1715 | 16.0 | 18940 | 0.9771 | 0.7427 |
1.2386 | 16.9994 | 20123 | 0.9601 | 0.7461 |
1.1787 | 17.9996 | 21307 | 0.9489 | 0.7472 |
1.1716 | 18.9998 | 22491 | 0.9360 | 0.7516 |
1.1363 | 20.0 | 23675 | 0.9129 | 0.7595 |
1.2677 | 20.9994 | 24858 | 0.9007 | 0.7633 |
1.2019 | 21.9996 | 26042 | 0.8869 | 0.7657 |
1.0633 | 22.9998 | 27226 | 0.8835 | 0.7656 |
1.0393 | 24.0 | 28410 | 0.8742 | 0.7693 |
0.9558 | 24.9994 | 29593 | 0.8704 | 0.7705 |
1.0596 | 25.9996 | 30777 | 0.8455 | 0.7764 |
1.0749 | 26.9998 | 31961 | 0.8431 | 0.7793 |
0.9913 | 28.0 | 33145 | 0.8332 | 0.7795 |
0.9477 | 28.9994 | 34328 | 0.8434 | 0.7777 |
0.9681 | 29.9996 | 35512 | 0.8215 | 0.7840 |
0.9356 | 30.9998 | 36696 | 0.8050 | 0.7888 |
0.806 | 32.0 | 37880 | 0.8152 | 0.7870 |
1.0011 | 32.9994 | 39063 | 0.8089 | 0.7843 |
0.9268 | 33.9996 | 40247 | 0.8018 | 0.7884 |
0.8209 | 34.9998 | 41431 | 0.8147 | 0.7876 |
0.8193 | 36.0 | 42615 | 0.8043 | 0.7893 |
0.8523 | 36.9994 | 43798 | 0.8014 | 0.7893 |
0.9134 | 37.9996 | 44982 | 0.7995 | 0.7895 |
0.9263 | 38.9998 | 46166 | 0.7928 | 0.7896 |
0.9393 | 40.0 | 47350 | 0.7951 | 0.7952 |
0.8028 | 40.9994 | 48533 | 0.7840 | 0.7967 |
0.8299 | 41.9996 | 49717 | 0.7994 | 0.7929 |
0.791 | 42.9998 | 50901 | 0.7873 | 0.7921 |
0.8739 | 44.0 | 52085 | 0.7869 | 0.7956 |
0.8777 | 44.9994 | 53268 | 0.7835 | 0.7952 |
0.8077 | 45.9996 | 54452 | 0.7815 | 0.7957 |
0.9119 | 46.9998 | 55636 | 0.7753 | 0.7984 |
0.9867 | 48.0 | 56820 | 0.7824 | 0.7969 |
0.8115 | 48.9994 | 58003 | 0.7852 | 0.7975 |
0.779 | 49.9996 | 59187 | 0.7815 | 0.7992 |
0.755 | 50.9998 | 60371 | 0.7796 | 0.8011 |
0.7529 | 52.0 | 61555 | 0.7739 | 0.8014 |
0.6878 | 52.9994 | 62738 | 0.7914 | 0.7989 |
0.744 | 53.9996 | 63922 | 0.7774 | 0.8002 |
0.7346 | 54.9998 | 65106 | 0.7679 | 0.8012 |
0.7672 | 56.0 | 66290 | 0.7696 | 0.7998 |
0.8018 | 56.9994 | 67473 | 0.7877 | 0.7987 |
0.7507 | 57.9996 | 68657 | 0.7903 | 0.7979 |
0.7632 | 58.9998 | 69841 | 0.7831 | 0.8010 |
0.7013 | 60.0 | 71025 | 0.7799 | 0.7985 |
0.7364 | 60.9994 | 72208 | 0.7527 | 0.8079 |
0.8036 | 61.9996 | 73392 | 0.7664 | 0.8010 |
0.74 | 62.9998 | 74576 | 0.7683 | 0.8022 |
0.6531 | 64.0 | 75760 | 0.7548 | 0.8021 |
0.7375 | 64.9994 | 76943 | 0.7623 | 0.8022 |
0.7228 | 65.9996 | 78127 | 0.7820 | 0.8028 |
0.7318 | 66.9998 | 79311 | 0.7625 | 0.8008 |
0.6529 | 68.0 | 80495 | 0.7693 | 0.8036 |
0.68 | 68.9994 | 81678 | 0.7371 | 0.8093 |
0.7396 | 69.9996 | 82862 | 0.7699 | 0.8040 |
0.7388 | 70.9998 | 84046 | 0.7596 | 0.8038 |
0.7135 | 72.0 | 85230 | 0.7607 | 0.8043 |
0.6667 | 72.9994 | 86413 | 0.7666 | 0.8034 |
0.6866 | 73.9996 | 87597 | 0.7640 | 0.8046 |
0.6601 | 74.9998 | 88781 | 0.7573 | 0.8037 |
0.7305 | 76.0 | 89965 | 0.7443 | 0.8094 |
0.7507 | 76.9994 | 91148 | 0.7636 | 0.8053 |
0.7073 | 77.9996 | 92332 | 0.7692 | 0.8033 |
0.688 | 78.9998 | 93516 | 0.7609 | 0.8044 |
0.6694 | 79.9493 | 94640 | 0.7769 | 0.8053 |
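
The step counts in the table also pin down the effective batch size and, approximately, the size of the training set — a back-of-the-envelope check derived from the logs, not a fact stated elsewhere in this card:

```python
# Effective batch size: 16 per device x 4 gradient-accumulation steps.
train_batch_size = 16
gradient_accumulation_steps = 4
effective_batch_size = train_batch_size * gradient_accumulation_steps

# The table logs step 4735 at exactly epoch 4.0, i.e. 1183.75 optimizer
# steps per epoch; each step consumes one effective batch of examples.
steps_per_epoch = 4735 / 4
approx_train_examples = steps_per_epoch * effective_batch_size

print(effective_batch_size, round(approx_train_examples))  # 64 75760
```

The fractional epoch values in the table (e.g. 0.9994) are a side effect of this rounding: the dataset size is not an exact multiple of the effective batch size.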
## Framework versions

- Transformers 4.45.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1