itsLeen committed on
Commit 3c7a67c
1 Parent(s): 1450f88

Model save

README.md CHANGED
@@ -3,7 +3,6 @@ library_name: transformers
 license: apache-2.0
 base_model: google/vit-base-patch16-224-in21k
 tags:
- - image-classification
 - generated_from_trainer
 metrics:
 - accuracy
@@ -17,10 +16,10 @@ should probably proofread and complete it, then remove this comment. -->

 # finetuned-fake-food

- This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the indian_food_images dataset.
+ This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.3199
- - Accuracy: 0.8720
+ - Loss: 0.2602
+ - Accuracy: 0.8879

 ## Model description

@@ -45,38 +44,107 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - training_steps: 2000
+ - training_steps: 9000
 - mixed_precision_training: Native AMP

 ### Training results

 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:------:|:----:|:---------------:|:--------:|
- | 0.5416 | 0.1264 | 100 | 0.5593 | 0.7081 |
- | 0.5299 | 0.2528 | 200 | 0.5342 | 0.7422 |
- | 0.5503 | 0.3793 | 300 | 0.4875 | 0.7717 |
- | 0.5561 | 0.5057 | 400 | 0.4622 | 0.7941 |
- | 0.5581 | 0.6321 | 500 | 0.5501 | 0.7457 |
- | 0.5845 | 0.7585 | 600 | 0.5088 | 0.7475 |
- | 0.5695 | 0.8850 | 700 | 0.4740 | 0.7860 |
- | 0.5406 | 1.0114 | 800 | 0.4856 | 0.7816 |
- | 0.5353 | 1.1378 | 900 | 0.4252 | 0.8156 |
- | 0.5345 | 1.2642 | 1000 | 0.5014 | 0.7762 |
- | 0.5105 | 1.3906 | 1100 | 0.4800 | 0.7860 |
- | 0.5266 | 1.5171 | 1200 | 0.4618 | 0.7959 |
- | 0.4709 | 1.6435 | 1300 | 0.3906 | 0.8281 |
- | 0.4624 | 1.7699 | 1400 | 0.4208 | 0.8129 |
- | 0.4677 | 1.8963 | 1500 | 0.4207 | 0.8174 |
- | 0.4478 | 2.0228 | 1600 | 0.3557 | 0.8478 |
- | 0.4451 | 2.1492 | 1700 | 0.3546 | 0.8442 |
- | 0.3796 | 2.2756 | 1800 | 0.3199 | 0.8720 |
- | 0.4358 | 2.4020 | 1900 | 0.3308 | 0.8603 |
- | 0.3373 | 2.5284 | 2000 | 0.3455 | 0.8541 |
+ | 0.5991 | 0.0505 | 100 | 0.6129 | 0.7028 |
+ | 0.6593 | 0.1011 | 200 | 0.4338 | 0.8364 |
+ | 0.4908 | 0.1516 | 300 | 0.4490 | 0.8099 |
+ | 0.4756 | 0.2021 | 400 | 0.7639 | 0.7003 |
+ | 0.547 | 0.2527 | 500 | 0.4253 | 0.8335 |
+ | 0.4702 | 0.3032 | 600 | 0.3864 | 0.8446 |
+ | 0.5099 | 0.3537 | 700 | 0.4819 | 0.7755 |
+ | 0.5484 | 0.4042 | 800 | 0.3940 | 0.8264 |
+ | 0.6263 | 0.4548 | 900 | 0.6219 | 0.7118 |
+ | 0.5453 | 0.5053 | 1000 | 0.4548 | 0.7888 |
+ | 0.5431 | 0.5558 | 1100 | 0.4210 | 0.8084 |
+ | 0.5678 | 0.6064 | 1200 | 0.4946 | 0.8038 |
+ | 0.3266 | 0.6569 | 1300 | 0.4538 | 0.8264 |
+ | 0.4225 | 0.7074 | 1400 | 0.4366 | 0.8088 |
+ | 0.32 | 0.7580 | 1500 | 0.5586 | 0.7884 |
+ | 0.473 | 0.8085 | 1600 | 0.4805 | 0.7974 |
+ | 0.4557 | 0.8590 | 1700 | 0.3707 | 0.8371 |
+ | 0.408 | 0.9096 | 1800 | 0.4968 | 0.7999 |
+ | 0.4979 | 0.9601 | 1900 | 0.4432 | 0.7898 |
+ | 0.4115 | 1.0106 | 2000 | 0.3722 | 0.8392 |
+ | 0.3421 | 1.0611 | 2100 | 0.5450 | 0.7401 |
+ | 0.5165 | 1.1117 | 2200 | 0.4611 | 0.7988 |
+ | 0.4066 | 1.1622 | 2300 | 0.3226 | 0.8725 |
+ | 0.5085 | 1.2127 | 2400 | 0.5858 | 0.7762 |
+ | 0.4814 | 1.2633 | 2500 | 0.3981 | 0.7766 |
+ | 0.4554 | 1.3138 | 2600 | 0.5076 | 0.7816 |
+ | 0.2816 | 1.3643 | 2700 | 0.4732 | 0.8127 |
+ | 0.2516 | 1.4149 | 2800 | 0.4315 | 0.8074 |
+ | 0.2903 | 1.4654 | 2900 | 0.3845 | 0.8557 |
+ | 0.3493 | 1.5159 | 3000 | 0.4921 | 0.7977 |
+ | 0.4251 | 1.5664 | 3100 | 0.3855 | 0.8231 |
+ | 0.3356 | 1.6170 | 3200 | 0.4012 | 0.8328 |
+ | 0.3597 | 1.6675 | 3300 | 0.3308 | 0.8496 |
+ | 0.257 | 1.7180 | 3400 | 0.4104 | 0.8138 |
+ | 0.3709 | 1.7686 | 3500 | 0.2769 | 0.8879 |
+ | 0.3393 | 1.8191 | 3600 | 0.3412 | 0.8643 |
+ | 0.4151 | 1.8696 | 3700 | 0.3078 | 0.8747 |
+ | 0.3043 | 1.9202 | 3800 | 0.3424 | 0.8650 |
+ | 0.3302 | 1.9707 | 3900 | 0.3513 | 0.8335 |
+ | 0.4033 | 2.0212 | 4000 | 0.3371 | 0.8511 |
+ | 0.3386 | 2.0718 | 4100 | 0.3402 | 0.8396 |
+ | 0.3661 | 2.1223 | 4200 | 0.3277 | 0.8561 |
+ | 0.2914 | 2.1728 | 4300 | 0.3065 | 0.8650 |
+ | 0.4444 | 2.2233 | 4400 | 0.3207 | 0.8493 |
+ | 0.2922 | 2.2739 | 4500 | 0.2968 | 0.8686 |
+ | 0.3464 | 2.3244 | 4600 | 0.4151 | 0.8070 |
+ | 0.2684 | 2.3749 | 4700 | 0.3810 | 0.8385 |
+ | 0.3779 | 2.4255 | 4800 | 0.3368 | 0.8514 |
+ | 0.4462 | 2.4760 | 4900 | 0.2677 | 0.8965 |
+ | 0.3766 | 2.5265 | 5000 | 0.3732 | 0.8439 |
+ | 0.4971 | 2.5771 | 5100 | 0.3266 | 0.8618 |
+ | 0.3795 | 2.6276 | 5200 | 0.3380 | 0.8607 |
+ | 0.4205 | 2.6781 | 5300 | 0.3436 | 0.8618 |
+ | 0.3652 | 2.7287 | 5400 | 0.3483 | 0.8518 |
+ | 0.3999 | 2.7792 | 5500 | 0.2603 | 0.8908 |
+ | 0.2909 | 2.8297 | 5600 | 0.3080 | 0.8693 |
+ | 0.3703 | 2.8802 | 5700 | 0.2950 | 0.8808 |
+ | 0.4048 | 2.9308 | 5800 | 0.3191 | 0.8500 |
+ | 0.3333 | 2.9813 | 5900 | 0.3773 | 0.8443 |
+ | 0.2917 | 3.0318 | 6000 | 0.3731 | 0.8432 |
+ | 0.4204 | 3.0824 | 6100 | 0.3783 | 0.8528 |
+ | 0.3832 | 3.1329 | 6200 | 0.3009 | 0.8693 |
+ | 0.32 | 3.1834 | 6300 | 0.3690 | 0.8367 |
+ | 0.3761 | 3.2340 | 6400 | 0.3398 | 0.8392 |
+ | 0.4041 | 3.2845 | 6500 | 0.2726 | 0.8761 |
+ | 0.3373 | 3.3350 | 6600 | 0.3735 | 0.8285 |
+ | 0.2869 | 3.3855 | 6700 | 0.2326 | 0.8987 |
+ | 0.3381 | 3.4361 | 6800 | 0.2562 | 0.8933 |
+ | 0.2193 | 3.4866 | 6900 | 0.2605 | 0.8912 |
+ | 0.2685 | 3.5371 | 7000 | 0.2592 | 0.8822 |
+ | 0.2867 | 3.5877 | 7100 | 0.3182 | 0.8636 |
+ | 0.318 | 3.6382 | 7200 | 0.2988 | 0.8743 |
+ | 0.3088 | 3.6887 | 7300 | 0.2870 | 0.8768 |
+ | 0.3531 | 3.7393 | 7400 | 0.2924 | 0.8697 |
+ | 0.2605 | 3.7898 | 7500 | 0.2942 | 0.8704 |
+ | 0.419 | 3.8403 | 7600 | 0.3634 | 0.8485 |
+ | 0.264 | 3.8909 | 7700 | 0.2996 | 0.8629 |
+ | 0.2349 | 3.9414 | 7800 | 0.2417 | 0.8937 |
+ | 0.2726 | 3.9919 | 7900 | 0.3228 | 0.8518 |
+ | 0.3398 | 4.0424 | 8000 | 0.2684 | 0.8897 |
+ | 0.1933 | 4.0930 | 8100 | 0.2657 | 0.8919 |
+ | 0.435 | 4.1435 | 8200 | 0.2455 | 0.8972 |
+ | 0.2373 | 4.1940 | 8300 | 0.2929 | 0.8690 |
+ | 0.3151 | 4.2446 | 8400 | 0.2745 | 0.8761 |
+ | 0.2258 | 4.2951 | 8500 | 0.2486 | 0.8922 |
+ | 0.2592 | 4.3456 | 8600 | 0.2696 | 0.8801 |
+ | 0.2301 | 4.3962 | 8700 | 0.2719 | 0.8811 |
+ | 0.1388 | 4.4467 | 8800 | 0.2617 | 0.8879 |
+ | 0.3242 | 4.4972 | 8900 | 0.2543 | 0.8915 |
+ | 0.1693 | 4.5478 | 9000 | 0.2602 | 0.8879 |


 ### Framework versions

 - Transformers 4.44.2
 - Pytorch 2.4.1+cu121
- - Datasets 3.0.1
 - Tokenizers 0.19.1
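For reference, a minimal sketch of running inference with the updated checkpoint through the `image-classification` pipeline. The repo id `itsLeen/finetuned-fake-food` is an assumption inferred from the commit author and model name, and the image path is a placeholder.

```python
from PIL import Image
from transformers import pipeline

# Repo id is an assumption (commit author + model name); adjust if the
# checkpoint lives under a different Hub id.
classifier = pipeline("image-classification", model="itsLeen/finetuned-fake-food")

image = Image.open("food_photo.jpg")  # placeholder path to any RGB food image
print(classifier(image, top_k=2))     # e.g. [{'label': ..., 'score': ...}, ...]
```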
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:2ea877a6b7f6eda3773b42ff1dd9ba49292e95c4fef1339a50afaee39b996af1
+ oid sha256:3aa159d2ec0a58ac1d8b1be8a0ed08a8645bf658c4f793a0c8c4381420b28845
 size 343223968
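The Git LFS pointer above records the SHA-256 (`oid`) and byte size of the new weights blob. A small sketch, assuming the same hypothetical repo id, that downloads `model.safetensors` and checks it against the pointer:

```python
import hashlib
from huggingface_hub import hf_hub_download

# Hypothetical repo id; per the pointer, the blob should be 343,223,968 bytes and
# hash to 3aa159d2ec0a58ac1d8b1be8a0ed08a8645bf658c4f793a0c8c4381420b28845.
path = hf_hub_download(repo_id="itsLeen/finetuned-fake-food", filename="model.safetensors")

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        digest.update(chunk)

print(digest.hexdigest())  # should match the oid in the LFS pointer above
```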
runs/Oct01_19-23-59_14a7e8fdf710/events.out.tfevents.1727810641.14a7e8fdf710.748.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c73fc79a1a8295065b2c79b417839322a27a2ad97a63d30e13bd96006b97ef62
+ size 224144
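The added `events.out.tfevents.*` file holds the TensorBoard scalars logged during this run. A sketch of reading them back with TensorBoard's `EventAccumulator`; the local path and the `eval/accuracy` tag name are assumptions about what the `Trainer` callback logged.

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Path is an assumption: point it at the downloaded runs/ directory or event file.
ea = EventAccumulator("runs/Oct01_19-23-59_14a7e8fdf710")
ea.Reload()

print(ea.Tags()["scalars"])                # list the scalar tags actually present
for event in ea.Scalars("eval/accuracy"):  # tag name assumed; pick one from the list above
    print(event.step, event.value)
```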
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:dff082020ef725b95702987c72700717daa69dc65c543eca57e94965c646aa58
+ oid sha256:0f8c9a00a8470ced0592c9a87afdf59dc201072b42d40415e89caf841c7dc7f7
 size 5176
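`training_args.bin` is the serialized `TrainingArguments` for this run. A minimal sketch of arguments consistent with the hyperparameters shown in the card diff (steps, seed, scheduler, AMP); the output directory, evaluation cadence, and anything not listed in the diff (learning rate, batch sizes) are assumptions.

```python
from transformers import TrainingArguments

# Sketch only -- values not shown in the card diff are guessed or left at defaults.
training_args = TrainingArguments(
    output_dir="finetuned-fake-food",  # hypothetical local output directory
    max_steps=9000,                    # training_steps: 9000
    seed=42,                           # seed: 42
    lr_scheduler_type="linear",        # lr_scheduler_type: linear
    fp16=True,                         # mixed_precision_training: Native AMP
    eval_strategy="steps",             # evaluation every 100 steps, matching the results table
    eval_steps=100,
    logging_steps=100,
)
```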