---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: swin-tiny-patch4-window7-224-finetuned-eurosat
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.7663551401869159
---
# swin-tiny-patch4-window7-224-finetuned-eurosat

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.8360
- Accuracy: 0.7664
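
As a quick usage note, the fine-tuned checkpoint can be loaded with the `image-classification` pipeline. This is a minimal sketch only; the repo id `your-username/swin-tiny-patch4-window7-224-finetuned-eurosat` and the image path are placeholders, not values documented in this card.

```python
from transformers import pipeline

# Placeholder repo id; replace with the actual Hub path or a local checkpoint directory.
classifier = pipeline(
    "image-classification",
    model="your-username/swin-tiny-patch4-window7-224-finetuned-eurosat",
)

# The pipeline accepts a local file path, a URL, or a PIL.Image; "example.jpg" is a placeholder.
predictions = classifier("example.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts, highest score first
```
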
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
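
The card does not document the data beyond the `imagefolder` loader named in the metadata. As a hedged illustration only, such a dataset is typically built from a directory of class-named subfolders; the `data_dir` path and the train/validation split below are assumptions, not details from this run.

```python
from datasets import load_dataset

# Assumed layout: path/to/images/<class_name>/<image files>; path and split are placeholders.
dataset = load_dataset("imagefolder", data_dir="path/to/images")
splits = dataset["train"].train_test_split(test_size=0.1, seed=42)
train_ds, eval_ds = splits["train"], splits["test"]
print(train_ds.features["label"].names)  # class names are inferred from the folder names
```
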
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
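
Below is a hedged `TrainingArguments` sketch that mirrors the list above. The `output_dir`, the evaluation/save cadence, and the best-model settings are assumptions, and the Adam betas/epsilon shown in the list match the `Trainer` optimizer defaults, so they need no explicit arguments here.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-eurosat",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # The settings below are assumptions about the run, not listed in this card.
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)
```
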
### Training results
| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
No log | 0.9333 | 7 | 3.8894 | 0.0841 |
3.897 | 2.0 | 15 | 3.8185 | 0.0841 |
3.8553 | 2.9333 | 22 | 3.7402 | 0.0748 |
3.7568 | 4.0 | 30 | 3.6372 | 0.0748 |
3.7568 | 4.9333 | 37 | 3.5482 | 0.0841 |
3.5912 | 6.0 | 45 | 3.4069 | 0.1121 |
3.4342 | 6.9333 | 52 | 3.2939 | 0.1308 |
3.2601 | 8.0 | 60 | 3.1786 | 0.2150 |
3.2601 | 8.9333 | 67 | 3.0323 | 0.2336 |
3.0498 | 10.0 | 75 | 2.8695 | 0.2617 |
2.849 | 10.9333 | 82 | 2.8505 | 0.2523 |
2.6452 | 12.0 | 90 | 2.6319 | 0.2804 |
2.6452 | 12.9333 | 97 | 2.4654 | 0.3271 |
2.4123 | 14.0 | 105 | 2.3995 | 0.3364 |
2.2561 | 14.9333 | 112 | 2.2584 | 0.4019 |
2.0447 | 16.0 | 120 | 2.2000 | 0.4299 |
2.0447 | 16.9333 | 127 | 2.0806 | 0.4393 |
1.8569 | 18.0 | 135 | 2.0593 | 0.4393 |
1.7447 | 18.9333 | 142 | 1.8832 | 0.4673 |
1.5821 | 20.0 | 150 | 1.8218 | 0.5047 |
1.5821 | 20.9333 | 157 | 1.7334 | 0.5421 |
1.3999 | 22.0 | 165 | 1.6213 | 0.5514 |
1.2901 | 22.9333 | 172 | 1.5932 | 0.5234 |
1.1569 | 24.0 | 180 | 1.5256 | 0.5701 |
1.1569 | 24.9333 | 187 | 1.4281 | 0.5888 |
1.0903 | 26.0 | 195 | 1.3997 | 0.5794 |
0.9674 | 26.9333 | 202 | 1.4017 | 0.5888 |
0.98 | 28.0 | 210 | 1.2916 | 0.5981 |
0.98 | 28.9333 | 217 | 1.3018 | 0.5981 |
0.8772 | 30.0 | 225 | 1.2552 | 0.6355 |
0.7842 | 30.9333 | 232 | 1.2372 | 0.6075 |
0.7438 | 32.0 | 240 | 1.1908 | 0.6168 |
0.7438 | 32.9333 | 247 | 1.1567 | 0.6636 |
0.725 | 34.0 | 255 | 1.1542 | 0.6262 |
0.6709 | 34.9333 | 262 | 1.1377 | 0.6262 |
0.6898 | 36.0 | 270 | 1.0524 | 0.6636 |
0.6898 | 36.9333 | 277 | 1.0272 | 0.6729 |
0.6125 | 38.0 | 285 | 1.0399 | 0.6355 |
0.6153 | 38.9333 | 292 | 1.0308 | 0.6822 |
0.5898 | 40.0 | 300 | 1.0151 | 0.7009 |
0.5898 | 40.9333 | 307 | 1.0483 | 0.6542 |
0.5881 | 42.0 | 315 | 0.9926 | 0.7009 |
0.54 | 42.9333 | 322 | 1.0300 | 0.6916 |
0.4515 | 44.0 | 330 | 0.9262 | 0.7383 |
0.4515 | 44.9333 | 337 | 0.9486 | 0.7290 |
0.5057 | 46.0 | 345 | 0.9219 | 0.7103 |
0.4905 | 46.9333 | 352 | 1.0184 | 0.6822 |
0.4669 | 48.0 | 360 | 0.9337 | 0.7290 |
0.4669 | 48.9333 | 367 | 0.9431 | 0.7103 |
0.4437 | 50.0 | 375 | 0.9312 | 0.7009 |
0.4754 | 50.9333 | 382 | 0.9245 | 0.7196 |
0.4119 | 52.0 | 390 | 0.8826 | 0.7383 |
0.4119 | 52.9333 | 397 | 0.9262 | 0.7196 |
0.4087 | 54.0 | 405 | 0.8882 | 0.7477 |
0.3987 | 54.9333 | 412 | 0.9282 | 0.7290 |
0.4253 | 56.0 | 420 | 0.9004 | 0.7477 |
0.4253 | 56.9333 | 427 | 0.8783 | 0.7477 |
0.4134 | 58.0 | 435 | 0.8360 | 0.7664 |
0.4024 | 58.9333 | 442 | 0.9016 | 0.7196 |
0.3688 | 60.0 | 450 | 0.9251 | 0.6822 |
0.3688 | 60.9333 | 457 | 0.9086 | 0.7103 |
0.3833 | 62.0 | 465 | 0.8494 | 0.7383 |
0.3614 | 62.9333 | 472 | 0.8299 | 0.7290 |
0.3792 | 64.0 | 480 | 0.9015 | 0.7383 |
0.3792 | 64.9333 | 487 | 0.8802 | 0.7196 |
0.3632 | 66.0 | 495 | 0.8881 | 0.7009 |
0.3405 | 66.9333 | 502 | 0.8578 | 0.7383 |
0.3673 | 68.0 | 510 | 0.8540 | 0.7570 |
0.3673 | 68.9333 | 517 | 0.8345 | 0.7383 |
0.3379 | 70.0 | 525 | 0.7919 | 0.7383 |
0.3389 | 70.9333 | 532 | 0.8384 | 0.7290 |
0.3363 | 72.0 | 540 | 0.8306 | 0.7383 |
0.3363 | 72.9333 | 547 | 0.8875 | 0.7477 |
0.3494 | 74.0 | 555 | 0.9151 | 0.7009 |
0.2989 | 74.9333 | 562 | 0.8606 | 0.7103 |
0.3157 | 76.0 | 570 | 0.8640 | 0.7383 |
0.3157 | 76.9333 | 577 | 0.8532 | 0.7290 |
0.3013 | 78.0 | 585 | 0.8479 | 0.7103 |
0.2968 | 78.9333 | 592 | 0.8839 | 0.7383 |
0.3013 | 80.0 | 600 | 0.8837 | 0.7196 |
0.3013 | 80.9333 | 607 | 0.8694 | 0.7103 |
0.3247 | 82.0 | 615 | 0.8721 | 0.7290 |
0.2515 | 82.9333 | 622 | 0.8605 | 0.7290 |
0.3175 | 84.0 | 630 | 0.8505 | 0.7290 |
0.3175 | 84.9333 | 637 | 0.8488 | 0.7290 |
0.3015 | 86.0 | 645 | 0.8554 | 0.7383 |
0.2989 | 86.9333 | 652 | 0.8707 | 0.7290 |
0.3155 | 88.0 | 660 | 0.8712 | 0.7290 |
0.3155 | 88.9333 | 667 | 0.8659 | 0.7290 |
0.2871 | 90.0 | 675 | 0.8573 | 0.7290 |
0.2872 | 90.9333 | 682 | 0.8530 | 0.7290 |
0.2587 | 92.0 | 690 | 0.8516 | 0.7383 |
0.2587 | 92.9333 | 697 | 0.8502 | 0.7383 |
0.3133 | 93.3333 | 700 | 0.8501 | 0.7383 |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1