Model save
- README.md +27 -109
- model.safetensors +1 -1
- runs/Oct01_10-26-33_a59caee1d103/events.out.tfevents.1727778405.a59caee1d103.4704.0 +3 -0
- runs/Oct01_10-27-03_a59caee1d103/events.out.tfevents.1727778433.a59caee1d103.4704.1 +3 -0
- runs/Oct01_10-28-36_a59caee1d103/events.out.tfevents.1727778524.a59caee1d103.4704.2 +3 -0
- runs/Oct01_10-37-37_a59caee1d103/events.out.tfevents.1727779068.a59caee1d103.4704.3 +3 -0
- runs/Oct01_10-42-26_a59caee1d103/events.out.tfevents.1727779356.a59caee1d103.4704.4 +3 -0
- training_args.bin +1 -1
README.md
CHANGED
@@ -3,13 +3,27 @@ library_name: transformers
 license: apache-2.0
 base_model: google/vit-base-patch16-224-in21k
 tags:
-- image-classification
 - generated_from_trainer
+datasets:
+- imagefolder
 metrics:
 - accuracy
 model-index:
 - name: finetuned-fake-food
-  results:
+  results:
+  - task:
+      name: Image Classification
+      type: image-classification
+      dataset:
+        name: imagefolder
+        type: imagefolder
+        config: default
+        split: train
+        args: default
+      metrics:
+      - name: Accuracy
+        type: accuracy
+        value: 0.9523809523809523
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -17,10 +31,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # finetuned-fake-food
 
-This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the
+This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Accuracy: 0.
+- Loss: 0.2076
+- Accuracy: 0.9524
 
 ## Model description
 
@@ -40,122 +54,26 @@
 
 The following hyperparameters were used during training:
 - learning_rate: 0.0002
-- train_batch_size:
+- train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs:
+- num_epochs: 20
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch
-|
-| 0.
-| 0.
-| 0.
-| 0.214 | 0.4040 | 400 | 0.2690 | 0.8901 |
-| 0.1947 | 0.5051 | 500 | 0.2533 | 0.9112 |
-| 0.3618 | 0.6061 | 600 | 0.3738 | 0.8571 |
-| 0.2065 | 0.7071 | 700 | 0.2919 | 0.8919 |
-| 0.3103 | 0.8081 | 800 | 0.2165 | 0.9169 |
-| 0.1479 | 0.9091 | 900 | 0.2135 | 0.9173 |
-| 0.2421 | 1.0101 | 1000 | 0.2187 | 0.9184 |
-| 0.2264 | 1.1111 | 1100 | 0.1888 | 0.9205 |
-| 0.1664 | 1.2121 | 1200 | 0.2607 | 0.8876 |
-| 0.2049 | 1.3131 | 1300 | 0.2502 | 0.9005 |
-| 0.1503 | 1.4141 | 1400 | 0.2305 | 0.9177 |
-| 0.1846 | 1.5152 | 1500 | 0.1881 | 0.9219 |
-| 0.1571 | 1.6162 | 1600 | 0.1788 | 0.9284 |
-| 0.4091 | 1.7172 | 1700 | 0.2228 | 0.9216 |
-| 0.2954 | 1.8182 | 1800 | 0.1653 | 0.9366 |
-| 0.1366 | 1.9192 | 1900 | 0.1529 | 0.9420 |
-| 0.1657 | 2.0202 | 2000 | 0.1745 | 0.9255 |
-| 0.2531 | 2.1212 | 2100 | 0.1744 | 0.9381 |
-| 0.152 | 2.2222 | 2200 | 0.2513 | 0.8951 |
-| 0.145 | 2.3232 | 2300 | 0.1718 | 0.9302 |
-| 0.202 | 2.4242 | 2400 | 0.2436 | 0.9033 |
-| 0.1346 | 2.5253 | 2500 | 0.1839 | 0.9234 |
-| 0.1554 | 2.6263 | 2600 | 0.1447 | 0.9463 |
-| 0.183 | 2.7273 | 2700 | 0.2474 | 0.8822 |
-| 0.0972 | 2.8283 | 2800 | 0.2223 | 0.9205 |
-| 0.1073 | 2.9293 | 2900 | 0.1860 | 0.9345 |
-| 0.1824 | 3.0303 | 3000 | 0.2324 | 0.9194 |
-| 0.1221 | 3.1313 | 3100 | 0.1475 | 0.9449 |
-| 0.1039 | 3.2323 | 3200 | 0.1480 | 0.9427 |
-| 0.276 | 3.3333 | 3300 | 0.1591 | 0.9402 |
-| 0.2498 | 3.4343 | 3400 | 0.2447 | 0.9098 |
-| 0.1453 | 3.5354 | 3500 | 0.1556 | 0.9416 |
-| 0.1794 | 3.6364 | 3600 | 0.2272 | 0.9083 |
-| 0.1467 | 3.7374 | 3700 | 0.1673 | 0.9413 |
-| 0.1372 | 3.8384 | 3800 | 0.1763 | 0.9341 |
-| 0.2283 | 3.9394 | 3900 | 0.1671 | 0.9373 |
-| 0.164 | 4.0404 | 4000 | 0.1490 | 0.9477 |
-| 0.1513 | 4.1414 | 4100 | 0.1547 | 0.9488 |
-| 0.0991 | 4.2424 | 4200 | 0.1536 | 0.9431 |
-| 0.1419 | 4.3434 | 4300 | 0.1568 | 0.9445 |
-| 0.1452 | 4.4444 | 4400 | 0.2328 | 0.9320 |
-| 0.1445 | 4.5455 | 4500 | 0.1351 | 0.9513 |
-| 0.1366 | 4.6465 | 4600 | 0.1571 | 0.9416 |
-| 0.097 | 4.7475 | 4700 | 0.1506 | 0.9424 |
-| 0.0603 | 4.8485 | 4800 | 0.1435 | 0.9499 |
-| 0.1179 | 4.9495 | 4900 | 0.1754 | 0.9363 |
-| 0.1948 | 5.0505 | 5000 | 0.1609 | 0.9402 |
-| 0.1021 | 5.1515 | 5100 | 0.1566 | 0.9459 |
-| 0.0652 | 5.2525 | 5200 | 0.1564 | 0.9481 |
-| 0.1029 | 5.3535 | 5300 | 0.1410 | 0.9492 |
-| 0.1014 | 5.4545 | 5400 | 0.1490 | 0.9531 |
-| 0.1338 | 5.5556 | 5500 | 0.1865 | 0.9406 |
-| 0.0844 | 5.6566 | 5600 | 0.1631 | 0.9456 |
-| 0.1059 | 5.7576 | 5700 | 0.1738 | 0.9409 |
-| 0.0788 | 5.8586 | 5800 | 0.1801 | 0.9370 |
-| 0.0941 | 5.9596 | 5900 | 0.1575 | 0.9495 |
-| 0.112 | 6.0606 | 6000 | 0.1796 | 0.9470 |
-| 0.0691 | 6.1616 | 6100 | 0.1697 | 0.9499 |
-| 0.1385 | 6.2626 | 6200 | 0.1348 | 0.9563 |
-| 0.1173 | 6.3636 | 6300 | 0.1522 | 0.9502 |
-| 0.046 | 6.4646 | 6400 | 0.2114 | 0.9391 |
-| 0.0319 | 6.5657 | 6500 | 0.1723 | 0.9477 |
-| 0.0757 | 6.6667 | 6600 | 0.1561 | 0.9527 |
-| 0.0744 | 6.7677 | 6700 | 0.1587 | 0.9567 |
-| 0.0341 | 6.8687 | 6800 | 0.1458 | 0.9578 |
-| 0.1512 | 6.9697 | 6900 | 0.1572 | 0.9531 |
-| 0.0153 | 7.0707 | 7000 | 0.1402 | 0.9617 |
-| 0.0711 | 7.1717 | 7100 | 0.1527 | 0.9610 |
-| 0.0453 | 7.2727 | 7200 | 0.1512 | 0.9570 |
-| 0.0052 | 7.3737 | 7300 | 0.1936 | 0.9520 |
-| 0.0477 | 7.4747 | 7400 | 0.1699 | 0.9513 |
-| 0.091 | 7.5758 | 7500 | 0.1628 | 0.9513 |
-| 0.063 | 7.6768 | 7600 | 0.1474 | 0.9578 |
-| 0.0497 | 7.7778 | 7700 | 0.1389 | 0.9613 |
-| 0.0552 | 7.8788 | 7800 | 0.2587 | 0.9381 |
-| 0.0364 | 7.9798 | 7900 | 0.1361 | 0.9603 |
-| 0.0124 | 8.0808 | 8000 | 0.1438 | 0.9606 |
-| 0.0703 | 8.1818 | 8100 | 0.1577 | 0.9585 |
-| 0.025 | 8.2828 | 8200 | 0.1943 | 0.9484 |
-| 0.0259 | 8.3838 | 8300 | 0.1590 | 0.9613 |
-| 0.0049 | 8.4848 | 8400 | 0.1521 | 0.9581 |
-| 0.0174 | 8.5859 | 8500 | 0.1522 | 0.9599 |
-| 0.0194 | 8.6869 | 8600 | 0.1456 | 0.9606 |
-| 0.0315 | 8.7879 | 8700 | 0.1411 | 0.9599 |
-| 0.0419 | 8.8889 | 8800 | 0.1426 | 0.9592 |
-| 0.0193 | 8.9899 | 8900 | 0.1375 | 0.9642 |
-| 0.0027 | 9.0909 | 9000 | 0.1379 | 0.9635 |
-| 0.0345 | 9.1919 | 9100 | 0.1444 | 0.9631 |
-| 0.0291 | 9.2929 | 9200 | 0.1492 | 0.9624 |
-| 0.017 | 9.3939 | 9300 | 0.1466 | 0.9635 |
-| 0.0269 | 9.4949 | 9400 | 0.1523 | 0.9631 |
-| 0.003 | 9.5960 | 9500 | 0.1445 | 0.9628 |
-| 0.0471 | 9.6970 | 9600 | 0.1454 | 0.9617 |
-| 0.0356 | 9.7980 | 9700 | 0.1452 | 0.9620 |
-| 0.0034 | 9.8990 | 9800 | 0.1445 | 0.9624 |
-| 0.0162 | 10.0 | 9900 | 0.1451 | 0.9628 |
+| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
+|:-------------:|:-------:|:----:|:---------------:|:--------:|
+| 0.328         | 6.6667  | 100  | 0.3854          | 0.8571   |
+| 0.1729        | 13.3333 | 200  | 0.1446          | 0.9524   |
+| 0.0508        | 20.0    | 300  | 0.2076          | 0.9524   |
 
 
 ### Framework versions
 
 - Transformers 4.44.2
 - Pytorch 2.4.1+cu121
-- Datasets 3.0.
+- Datasets 3.0.1
 - Tokenizers 0.19.1
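The hyperparameters recorded in the updated card map directly onto `transformers.TrainingArguments`. The following is a minimal sketch of that configuration; the output directory and the 100-step evaluation/logging cadence are assumptions inferred from the results table, not values stored in this commit.

```python
# Sketch only: mirrors the hyperparameters listed in the model card.
# output_dir, eval_steps and logging_steps are assumptions (the results table evaluates every 100 steps).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finetuned-fake-food",   # assumed output directory
    learning_rate=2e-4,                 # learning_rate: 0.0002
    per_device_train_batch_size=8,      # train_batch_size: 8
    per_device_eval_batch_size=8,       # eval_batch_size: 8
    num_train_epochs=20,                # num_epochs: 20
    lr_scheduler_type="linear",         # lr_scheduler_type: linear
    seed=42,                            # seed: 42
    fp16=True,                          # mixed_precision_training: Native AMP
    eval_strategy="steps",              # assumed: evaluate every 100 steps
    eval_steps=100,
    logging_steps=100,
    remove_unused_columns=False,        # keeps pixel_values when training on image datasets
)
```

The Adam betas and epsilon listed in the card are the defaults of the Trainer's standard optimizer, so they are not set explicitly in the sketch.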
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:57a0f42eb555cf7f84b8068d7a5570c26bb79050ac4b62a86c16f177804068bc
 size 343223968
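The updated `model.safetensors` holds the fine-tuned ViT weights referenced by the card. A minimal usage sketch, assuming the model lives at the hypothetical repo id `your-username/finetuned-fake-food` (the namespace is not part of this commit):

```python
# Sketch only: "your-username/finetuned-fake-food" is a placeholder repo id.
from transformers import pipeline

classifier = pipeline("image-classification", model="your-username/finetuned-fake-food")

# Accepts a local path, URL, or PIL.Image; returns a list of {"label": ..., "score": ...} dicts.
predictions = classifier("example_food.jpg")
print(predictions)
```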
runs/Oct01_10-26-33_a59caee1d103/events.out.tfevents.1727778405.a59caee1d103.4704.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:eb777a4c786af5790113439986d60b2689d1bf21618e8bd17d204820e00498bc
+size 4872
runs/Oct01_10-27-03_a59caee1d103/events.out.tfevents.1727778433.a59caee1d103.4704.1
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:dd2d23462d09d2ce67f4b073155d0e2435e043fa84e67238b805d5f113d52849
+size 4922
runs/Oct01_10-28-36_a59caee1d103/events.out.tfevents.1727778524.a59caee1d103.4704.2
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0af3742776c8c211bb89933259995081b03a972a80d33760d0f0384416dd02e8
+size 5397
runs/Oct01_10-37-37_a59caee1d103/events.out.tfevents.1727779068.a59caee1d103.4704.3
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5ceb2ff7184d21fd36f61494604284ebdfd585e788326aa03247713aeb045f75
+size 4873
runs/Oct01_10-42-26_a59caee1d103/events.out.tfevents.1727779356.a59caee1d103.4704.4
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:98bb5446fa48a58953318608717be01fd001607632ef608446c1c1024897972d
+size 6819
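The five `events.out.tfevents.*` files added under `runs/` are TensorBoard logs from separate Trainer launches in this session. A sketch of reading one of them back with TensorBoard's event accumulator; the `eval/accuracy` tag follows the Trainer's usual naming but is an assumption here, so the available tags are listed first:

```python
# Sketch only: reads scalar summaries from one of the added run directories.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator("runs/Oct01_10-42-26_a59caee1d103")  # directory containing the .tfevents file
ea.Reload()                                                # parse the event file from disk
print(ea.Tags()["scalars"])                                # see which scalar tags were logged

for event in ea.Scalars("eval/accuracy"):                  # assumed tag name; check the list above
    print(event.step, event.value)
```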
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:491f5ee4f5d28aebb4c244d7745661d13229ae30aa917a57c1397ff9a0e96751
 size 5176
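`training_args.bin` is the pickled `TrainingArguments` object the Trainer saves next to the weights. A sketch of inspecting it; with the PyTorch 2.4.1 listed in the card a plain `torch.load` works, while newer releases default to `weights_only=True`, so the flag is passed explicitly here:

```python
# Sketch only: unpickling requires a compatible transformers package to be importable.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.per_device_train_batch_size, args.num_train_epochs)
```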