# swinv2-large-panorama-IQA
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on the isiqa-2019-hf dataset. It achieves the following results on the evaluation set:
- Loss: 0.0352
- SROCC (Spearman rank-order correlation coefficient): 0.0683
- LCC (Pearson linear correlation coefficient): 0.1820
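
The card does not yet document usage, so here is a minimal inference sketch. It assumes the checkpoint loads through `AutoModelForImageClassification` with a single-logit regression head (the card does not confirm the head layout); the repo id and image path are placeholders.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id and image path; substitute your own.
model_id = "swinv2-large-panorama-IQA"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("panorama.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assuming a single-logit regression head, the raw logit is the quality score.
print(f"Predicted quality score: {logits.squeeze().item():.4f}")
```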
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 32
- seed: 10
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50.0
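
The Adam betas and epsilon above are the Transformers defaults. As a sketch, the same configuration can be expressed with the standard `TrainingArguments` API; `output_dir` is a placeholder, not from the card.

```python
from transformers import TrainingArguments

# Sketch of the configuration above using the standard Trainer API.
# Adam betas=(0.9, 0.999) and epsilon=1e-8 are the library defaults.
training_args = TrainingArguments(
    output_dir="swinv2-large-panorama-IQA",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=32,
    seed=10,
    gradient_accumulation_steps=16,  # 4 x 16 = 64 total train batch size
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=50.0,
)
```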
### Training results
| Training Loss | Epoch   | Step | Validation Loss | Srocc   | Lcc     |
|:-------------:|:-------:|:----:|:---------------:|:-------:|:-------:|
| No log        | 0.8889  | 3    | 0.4401          | -0.1586 | -0.1147 |
| No log        | 1.7778  | 6    | 0.2206          | -0.2675 | -0.1545 |
| 0.3084        | 2.9630  | 10   | 0.1910          | -0.2754 | -0.1813 |
| 0.3084        | 3.8519  | 13   | 0.2334          | -0.2170 | -0.1511 |
| 0.3084        | 4.7407  | 16   | 0.1484          | -0.2000 | -0.1310 |
| 0.0852        | 5.9259  | 20   | 0.1259          | -0.1021 | -0.0852 |
| 0.0852        | 6.8148  | 23   | 0.1552          | -0.0709 | -0.0595 |
| 0.0852        | 8.0     | 27   | 0.0942          | -0.0948 | -0.0584 |
| 0.0406        | 8.8889  | 30   | 0.0841          | -0.0480 | -0.0550 |
| 0.0406        | 9.7778  | 33   | 0.0886          | -0.0576 | -0.0448 |
| 0.0406        | 10.9630 | 37   | 0.0721          | -0.0773 | -0.0474 |
| 0.0230        | 11.8519 | 40   | 0.0697          | -0.0446 | -0.0364 |
| 0.0230        | 12.7407 | 43   | 0.0577          | -0.0217 | -0.0091 |
| 0.0230        | 13.9259 | 47   | 0.0666          | -0.0314 | 0.0112  |
| 0.0136        | 14.8148 | 50   | 0.0525          | -0.0501 | 0.0060  |
| 0.0136        | 16.0    | 54   | 0.0626          | -0.0178 | 0.0504  |
| 0.0136        | 16.8889 | 57   | 0.0438          | 0.0159  | 0.0827  |
| 0.0113        | 17.7778 | 60   | 0.0503          | 0.0741  | 0.1074  |
| 0.0113        | 18.9630 | 64   | 0.0429          | 0.0818  | 0.1129  |
| 0.0113        | 19.8519 | 67   | 0.0455          | 0.0874  | 0.1188  |
| 0.0097        | 20.7407 | 70   | 0.0597          | 0.0926  | 0.1316  |
| 0.0097        | 21.9259 | 74   | 0.0397          | 0.0614  | 0.1446  |
| 0.0097        | 22.8148 | 77   | 0.0529          | 0.0778  | 0.1637  |
| 0.0084        | 24.0    | 81   | 0.0366          | 0.0716  | 0.1761  |
| 0.0084        | 24.8889 | 84   | 0.0352          | 0.0683  | 0.1820  |
| 0.0084        | 25.7778 | 87   | 0.0491          | 0.0970  | 0.1848  |
| 0.0078        | 26.9630 | 91   | 0.0396          | 0.0984  | 0.1831  |
| 0.0078        | 27.8519 | 94   | 0.0395          | 0.1012  | 0.1856  |
| 0.0078        | 28.7407 | 97   | 0.0426          | 0.1097  | 0.1956  |
| 0.0063        | 29.9259 | 101  | 0.0370          | 0.1002  | 0.1984  |
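
Srocc and Lcc in the table are the usual IQA correlations between predicted and ground-truth quality scores. The metric code is not part of the card; a typical `compute_metrics` hook for the `Trainer`, sketched here assuming the predictions and labels squeeze to 1-D score arrays, would be:

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def compute_metrics(eval_pred):
    """Correlations between predicted and ground-truth quality scores."""
    predictions, labels = eval_pred
    predictions = np.squeeze(predictions)
    labels = np.squeeze(labels)
    srocc, _ = spearmanr(predictions, labels)  # rank-order correlation
    lcc, _ = pearsonr(predictions, labels)     # linear (Pearson) correlation
    return {"srocc": srocc, "lcc": lcc}
```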
### Framework versions
- Transformers 4.42.3
- PyTorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1