---
license: apache-2.0
base_model: microsoft/swinv2-tiny-patch4-window8-256
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: egujr001-swim2-base-model
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# egujr001-swim2-base-model
This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1969
- Accuracy: 0.9457
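## How to use

The snippet below is a minimal inference sketch, not an official usage example: the repository id is a placeholder for wherever this checkpoint is hosted, and the label set depends on the (undocumented) fine-tuning dataset.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repository id: substitute the actual location of this fine-tuned checkpoint.
repo_id = "<user>/egujr001-swim2-base-model"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

# Preprocess a single RGB image; the processor resizes it to the 256x256 input
# resolution used by the swinv2-tiny-patch4-window8-256 base model.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```

Equivalently, `transformers.pipeline("image-classification", model=repo_id)` wraps the same preprocessing and forward pass in a one-liner.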
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 56
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
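
As an illustration only, the values above map onto roughly the following `TrainingArguments`; the output directory is a placeholder, the datasets are not documented in this card, and anything not listed above is assumed to keep its default.

```python
from transformers import TrainingArguments

# Illustrative mapping of the hyperparameters listed above onto TrainingArguments.
# Fields not reported in this card (output_dir, logging/eval cadence, datasets)
# are placeholders or left at their defaults.
training_args = TrainingArguments(
    output_dir="egujr001-swim2-base-model",  # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=56,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
)

# These arguments would then be passed to transformers.Trainer together with the
# (undocumented) image dataset, the image processor, and a suitable data collator.
```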
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6254 | 0.04 | 100 | 0.5840 | 0.7050 |
| 0.3998 | 0.08 | 200 | 0.3525 | 0.8507 |
| 0.2796 | 0.12 | 300 | 0.2710 | 0.8975 |
| 0.23 | 0.15 | 400 | 0.2660 | 0.9012 |
| 0.2372 | 0.19 | 500 | 0.1678 | 0.9401 |
| 0.1944 | 0.23 | 600 | 0.1437 | 0.9437 |
| 0.1635 | 0.27 | 700 | 0.1231 | 0.9503 |
| 0.1463 | 0.31 | 800 | 0.1353 | 0.9551 |
| 0.1287 | 0.35 | 900 | 0.1216 | 0.9523 |
| 0.1208 | 0.39 | 1000 | 0.1695 | 0.9351 |
| 0.1204 | 0.42 | 1100 | 0.1221 | 0.9557 |
| 0.1064 | 0.46 | 1200 | 0.1605 | 0.9432 |
| 0.1114 | 0.5 | 1300 | 0.0998 | 0.9613 |
| 0.1324 | 0.54 | 1400 | 0.0888 | 0.9650 |
| 0.0997 | 0.58 | 1500 | 0.0810 | 0.9686 |
| 0.0904 | 0.62 | 1600 | 0.0945 | 0.9655 |
| 0.0975 | 0.66 | 1700 | 0.0978 | 0.9635 |
| 0.0859 | 0.69 | 1800 | 0.0858 | 0.9696 |
| 0.0785 | 0.73 | 1900 | 0.0749 | 0.9722 |
| 0.0743 | 0.77 | 2000 | 0.0763 | 0.9727 |
| 0.0815 | 0.81 | 2100 | 0.0765 | 0.9728 |
| 0.0674 | 0.85 | 2200 | 0.0881 | 0.9703 |
| 0.0726 | 0.89 | 2300 | 0.0875 | 0.9716 |
| 0.0633 | 0.93 | 2400 | 0.0912 | 0.9721 |
| 0.0501 | 0.96 | 2500 | 0.0743 | 0.9750 |
| 0.0927 | 1.0 | 2600 | 0.0695 | 0.9759 |
| 0.0766 | 1.04 | 2700 | 0.0788 | 0.9733 |
| 0.0934 | 1.08 | 2800 | 0.0699 | 0.9753 |
| 0.0714 | 1.12 | 2900 | 0.0756 | 0.9762 |
| 0.069 | 1.16 | 3000 | 0.0859 | 0.9706 |
| 0.0702 | 1.2 | 3100 | 0.1001 | 0.9658 |
| 0.0633 | 1.23 | 3200 | 0.0724 | 0.9756 |
| 0.0756 | 1.27 | 3300 | 0.0734 | 0.9745 |
| 0.0617 | 1.31 | 3400 | 0.0704 | 0.9747 |
| 0.0498 | 1.35 | 3500 | 0.0651 | 0.9788 |
| 0.0668 | 1.39 | 3600 | 0.0625 | 0.9791 |
| 0.0441 | 1.43 | 3700 | 0.0714 | 0.9774 |
| 0.0789 | 1.46 | 3800 | 0.0880 | 0.9722 |
| 0.0464 | 1.5 | 3900 | 0.0720 | 0.9749 |
| 0.0532 | 1.54 | 4000 | 0.0681 | 0.9782 |
| 0.0677 | 1.58 | 4100 | 0.0733 | 0.9736 |
| 0.0654 | 1.62 | 4200 | 0.0610 | 0.9802 |
| 0.0554 | 1.66 | 4300 | 0.0825 | 0.9740 |
| 0.0836 | 1.7 | 4400 | 0.0694 | 0.9780 |
| 0.0688 | 1.73 | 4500 | 0.0599 | 0.9813 |
| 0.052 | 1.77 | 4600 | 0.0932 | 0.9673 |
| 0.0515 | 1.81 | 4700 | 0.0785 | 0.9759 |
| 0.0586 | 1.85 | 4800 | 0.0660 | 0.9787 |
| 0.056 | 1.89 | 4900 | 0.0612 | 0.9783 |
| 0.037 | 1.93 | 5000 | 0.0645 | 0.9795 |
| 0.0541 | 1.97 | 5100 | 0.0600 | 0.9809 |
| 0.0521 | 2.0 | 5200 | 0.0876 | 0.9737 |
| 0.0352 | 2.04 | 5300 | 0.0709 | 0.9780 |
| 0.0498 | 2.08 | 5400 | 0.0610 | 0.9809 |
| 0.0424 | 2.12 | 5500 | 0.0569 | 0.9830 |
| 0.0532 | 2.16 | 5600 | 0.0625 | 0.9820 |
| 0.046 | 2.2 | 5700 | 0.0512 | 0.9842 |
| 0.0453 | 2.24 | 5800 | 0.0608 | 0.9813 |
| 0.0577 | 2.27 | 5900 | 0.0697 | 0.9811 |
| 0.0397 | 2.31 | 6000 | 0.0688 | 0.9816 |
| 0.0494 | 2.35 | 6100 | 0.0534 | 0.9834 |
| 0.0158 | 2.39 | 6200 | 0.0860 | 0.9774 |
| 0.0297 | 2.43 | 6300 | 0.0593 | 0.9836 |
| 0.055 | 2.47 | 6400 | 0.0579 | 0.9821 |
| 0.0368 | 2.51 | 6500 | 0.0729 | 0.9796 |
| 0.0754 | 2.54 | 6600 | 0.0601 | 0.9827 |
| 0.0523 | 2.58 | 6700 | 0.0597 | 0.9824 |
| 0.0433 | 2.62 | 6800 | 0.0547 | 0.9841 |
| 0.0164 | 2.66 | 6900 | 0.0620 | 0.9827 |
| 0.015 | 2.7 | 7000 | 0.0639 | 0.9822 |
| 0.0415 | 2.74 | 7100 | 0.0576 | 0.9837 |
| 0.0257 | 2.78 | 7200 | 0.0620 | 0.9820 |
| 0.0268 | 2.81 | 7300 | 0.0568 | 0.9837 |
| 0.043 | 2.85 | 7400 | 0.0558 | 0.9836 |
| 0.0339 | 2.89 | 7500 | 0.0554 | 0.9839 |
| 0.0263 | 2.93 | 7600 | 0.0552 | 0.9837 |
| 0.0428 | 2.97 | 7700 | 0.0535 | 0.9842 |
### Framework versions

The following framework versions were used (a quick environment check is sketched after the list):
- Transformers 4.33.0
- Pytorch 2.0.0+cu117
- Datasets 2.19.1
- Tokenizers 0.13.3
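
As an optional sanity check, the snippet below compares a local environment against the versions reported above; newer releases will usually load the checkpoint as well.

```python
import datasets
import tokenizers
import torch
import transformers

# Versions reported in this card; exact reproduction of the run would use these pins.
expected = {
    "transformers": "4.33.0",
    "torch": "2.0.0+cu117",
    "datasets": "2.19.1",
    "tokenizers": "0.13.3",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}

for name, pinned in expected.items():
    status = "matches" if installed[name] == pinned else "differs"
    print(f"{name}: installed {installed[name]}, card reports {pinned} ({status})")
```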