
swin-tiny-patch4-window7-224-finetuned-swin-tiny

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5222
  • Accuracy: 0.5559
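For quick experimentation, the fine-tuned checkpoint can be loaded with the Transformers `pipeline` API. This is a minimal sketch, not part of the original card; the placeholder blank image is only there to make the snippet self-contained, and in practice you would pass a real image path or PIL image:

```python
from PIL import Image
from transformers import pipeline

# Image-classification pipeline for this card's checkpoint.
classifier = pipeline(
    "image-classification",
    model="huyentls1114/swin-tiny-patch4-window7-224-finetuned-swin-tiny",
)

# Placeholder input so the snippet runs end to end; substitute a real image.
image = Image.new("RGB", (224, 224))
preds = classifier(image)
print(preds)  # list of {"label": ..., "score": ...} dicts, sorted by score
```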

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.5958 | 0.96 | 20 | 3.5209 | 0.0937 |
| 3.2466 | 1.98 | 41 | 2.9994 | 0.2387 |
| 2.4246 | 2.99 | 62 | 2.0341 | 0.4169 |
| 1.8599 | 4.0 | 83 | 1.6747 | 0.4955 |
| 1.531 | 4.96 | 103 | 1.5218 | 0.4773 |
| 1.3292 | 5.98 | 124 | 1.3834 | 0.5317 |
| 1.2063 | 6.99 | 145 | 1.3381 | 0.5468 |
| 1.0806 | 8.0 | 166 | 1.2748 | 0.5710 |
| 0.9638 | 8.96 | 186 | 1.3062 | 0.5559 |
| 0.8441 | 9.98 | 207 | 1.3322 | 0.5498 |
| 0.7868 | 10.99 | 228 | 1.2873 | 0.5710 |
| 0.7485 | 12.0 | 249 | 1.2012 | 0.5619 |
| 0.6522 | 12.96 | 269 | 1.2264 | 0.5861 |
| 0.6362 | 13.98 | 290 | 1.2796 | 0.5589 |
| 0.6214 | 14.99 | 311 | 1.3406 | 0.5529 |
| 0.5793 | 16.0 | 332 | 1.2479 | 0.5740 |
| 0.5187 | 16.96 | 352 | 1.3203 | 0.5891 |
| 0.4965 | 17.98 | 373 | 1.3429 | 0.5619 |
| 0.4809 | 18.99 | 394 | 1.3453 | 0.5831 |
| 0.4243 | 20.0 | 415 | 1.3759 | 0.5498 |
| 0.4447 | 20.96 | 435 | 1.4275 | 0.5196 |
| 0.3839 | 21.98 | 456 | 1.4660 | 0.5589 |
| 0.414 | 22.99 | 477 | 1.4465 | 0.5408 |
| 0.3741 | 24.0 | 498 | 1.3944 | 0.5650 |
| 0.3802 | 24.96 | 518 | 1.4272 | 0.5650 |
| 0.3733 | 25.98 | 539 | 1.3341 | 0.5589 |
| 0.3558 | 26.99 | 560 | 1.3864 | 0.5589 |
| 0.3448 | 28.0 | 581 | 1.4027 | 0.5589 |
| 0.3373 | 28.96 | 601 | 1.4452 | 0.5589 |
| 0.311 | 29.98 | 622 | 1.4021 | 0.5740 |
| 0.3218 | 30.99 | 643 | 1.4015 | 0.5680 |
| 0.3082 | 32.0 | 664 | 1.4159 | 0.5619 |
| 0.3173 | 32.96 | 684 | 1.4290 | 0.5498 |
| 0.2551 | 33.98 | 705 | 1.4268 | 0.5619 |
| 0.2739 | 34.99 | 726 | 1.4546 | 0.5559 |
| 0.2533 | 36.0 | 747 | 1.4398 | 0.5498 |
| 0.2578 | 36.96 | 767 | 1.4487 | 0.5438 |
| 0.2472 | 37.98 | 788 | 1.4438 | 0.5559 |
| 0.281 | 38.99 | 809 | 1.4916 | 0.5529 |
| 0.2757 | 40.0 | 830 | 1.4758 | 0.5619 |
| 0.2679 | 40.96 | 850 | 1.5104 | 0.5559 |
| 0.2548 | 41.98 | 871 | 1.5024 | 0.5529 |
| 0.2357 | 42.99 | 892 | 1.5286 | 0.5468 |
| 0.2357 | 44.0 | 913 | 1.5150 | 0.5529 |
| 0.2287 | 44.96 | 933 | 1.5234 | 0.5589 |
| 0.2329 | 45.98 | 954 | 1.5334 | 0.5650 |
| 0.2131 | 46.99 | 975 | 1.5296 | 0.5619 |
| 0.2269 | 48.0 | 996 | 1.5221 | 0.5559 |
| 0.2161 | 48.19 | 1000 | 1.5222 | 0.5559 |

Framework versions

  • Transformers 4.37.0
  • Pytorch 2.1.2
  • Datasets 2.16.1
  • Tokenizers 0.15.1