---
library_name: transformers
license: apache-2.0
base_model: microsoft/swin-large-patch4-window12-384
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: swin-transformer2
    results: []
---

# swin-transformer2

This model is a fine-tuned version of [microsoft/swin-large-patch4-window12-384](https://huggingface.co/microsoft/swin-large-patch4-window12-384) on an unspecified dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the list):

- Loss: 2.2129
- Accuracy: 0.6386
- F1: 0.6328
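
The card does not yet include usage instructions, so here is a minimal inference sketch. The repo id `masafresh/swin-transformer2` is inferred from the card's name and owner, and the image path is a placeholder; neither appears in the original card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id inferred from the card's name and owner; replace with the actual checkpoint.
ckpt = "masafresh/swin-transformer2"
processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForImageClassification.from_pretrained(ckpt)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")  # resizes/normalizes to 384x384 for this Swin variant
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```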

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
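
For reference, the list above maps onto the following `transformers.TrainingArguments` configuration. This is a sketch, not the original training script; `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch reproducing the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="swin-transformer2",
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 4 * 4 = 16
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```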

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|
| 1.6336        | 0.9840  | 46   | 1.6510          | 0.2530   | 0.1876 |
| 1.2894        | 1.9893  | 93   | 1.2218          | 0.4458   | 0.3780 |
| 1.0959        | 2.9947  | 140  | 1.1383          | 0.5060   | 0.3518 |
| 1.0467        | 4.0     | 187  | 0.9372          | 0.5542   | 0.4352 |
| 0.9879        | 4.9840  | 233  | 1.0139          | 0.5301   | 0.4718 |
| 0.9086        | 5.9893  | 280  | 0.8822          | 0.6627   | 0.6359 |
| 0.9776        | 6.9947  | 327  | 1.0269          | 0.5542   | 0.5139 |
| 0.9715        | 8.0     | 374  | 0.7964          | 0.5663   | 0.5588 |
| 0.9049        | 8.9840  | 420  | 0.7839          | 0.5904   | 0.5346 |
| 0.8697        | 9.9893  | 467  | 1.0379          | 0.5663   | 0.4921 |
| 0.882         | 10.9947 | 514  | 0.9132          | 0.5663   | 0.5379 |
| 0.832         | 12.0    | 561  | 0.8513          | 0.5783   | 0.5008 |
| 0.7475        | 12.9840 | 607  | 0.7612          | 0.6627   | 0.6427 |
| 0.9056        | 13.9893 | 654  | 0.8431          | 0.6145   | 0.5725 |
| 0.9978        | 14.9947 | 701  | 0.7221          | 0.7108   | 0.6983 |
| 0.6956        | 16.0    | 748  | 0.7545          | 0.6145   | 0.5888 |
| 0.7185        | 16.9840 | 794  | 0.6561          | 0.6627   | 0.6499 |
| 0.8139        | 17.9893 | 841  | 0.7512          | 0.6506   | 0.6386 |
| 0.6837        | 18.9947 | 888  | 0.6491          | 0.6988   | 0.6849 |
| 0.5191        | 20.0    | 935  | 0.7290          | 0.6386   | 0.6336 |
| 0.6538        | 20.9840 | 981  | 0.8000          | 0.6988   | 0.6621 |
| 0.7912        | 21.9893 | 1028 | 1.0183          | 0.6145   | 0.5824 |
| 0.6093        | 22.9947 | 1075 | 0.9124          | 0.6506   | 0.6396 |
| 0.5312        | 24.0    | 1122 | 0.9098          | 0.6024   | 0.5581 |
| 0.6654        | 24.9840 | 1168 | 1.0432          | 0.5422   | 0.5028 |
| 0.5798        | 25.9893 | 1215 | 0.7369          | 0.6627   | 0.6553 |
| 0.506         | 26.9947 | 1262 | 0.9057          | 0.6265   | 0.6236 |
| 0.4638        | 28.0    | 1309 | 0.7950          | 0.6867   | 0.6644 |
| 0.371         | 28.9840 | 1355 | 1.0368          | 0.6627   | 0.6473 |
| 0.4721        | 29.9893 | 1402 | 0.8129          | 0.6747   | 0.6673 |
| 0.54          | 30.9947 | 1449 | 1.0379          | 0.6627   | 0.6491 |
| 0.3978        | 32.0    | 1496 | 1.3857          | 0.5904   | 0.5481 |
| 0.3503        | 32.9840 | 1542 | 1.0920          | 0.6024   | 0.5847 |
| 0.4407        | 33.9893 | 1589 | 1.1912          | 0.5904   | 0.5505 |
| 0.3786        | 34.9947 | 1636 | 1.5071          | 0.6024   | 0.5915 |
| 0.3482        | 36.0    | 1683 | 1.1161          | 0.6386   | 0.6240 |
| 0.2695        | 36.9840 | 1729 | 1.2040          | 0.5904   | 0.5704 |
| 0.2296        | 37.9893 | 1776 | 1.5781          | 0.5181   | 0.4691 |
| 0.2922        | 38.9947 | 1823 | 1.3713          | 0.6024   | 0.5879 |
| 0.1511        | 40.0    | 1870 | 1.1638          | 0.6506   | 0.6553 |
| 0.2814        | 40.9840 | 1916 | 1.3384          | 0.6988   | 0.6939 |
| 0.2196        | 41.9893 | 1963 | 1.2872          | 0.6506   | 0.6330 |
| 0.2477        | 42.9947 | 2010 | 1.5322          | 0.6627   | 0.6375 |
| 0.3296        | 44.0    | 2057 | 1.3479          | 0.6506   | 0.6353 |
| 0.2015        | 44.9840 | 2103 | 1.2521          | 0.6145   | 0.6044 |
| 0.3476        | 45.9893 | 2150 | 1.2464          | 0.6747   | 0.6641 |
| 0.189         | 46.9947 | 2197 | 1.4480          | 0.6506   | 0.6235 |
| 0.1852        | 48.0    | 2244 | 1.3611          | 0.6747   | 0.6594 |
| 0.2798        | 48.9840 | 2290 | 1.4427          | 0.6988   | 0.6957 |
| 0.1523        | 49.9893 | 2337 | 1.3352          | 0.6506   | 0.6450 |
| 0.1224        | 50.9947 | 2384 | 1.8088          | 0.6386   | 0.6201 |
| 0.0926        | 52.0    | 2431 | 1.4695          | 0.6506   | 0.6296 |
| 0.2071        | 52.9840 | 2477 | 1.4673          | 0.6867   | 0.6806 |
| 0.1063        | 53.9893 | 2524 | 1.4862          | 0.7108   | 0.6975 |
| 0.1831        | 54.9947 | 2571 | 1.4666          | 0.6506   | 0.6161 |
| 0.158         | 56.0    | 2618 | 1.8832          | 0.6988   | 0.6673 |
| 0.26          | 56.9840 | 2664 | 1.5855          | 0.6386   | 0.5986 |
| 0.1697        | 57.9893 | 2711 | 1.2184          | 0.7470   | 0.7434 |
| 0.2024        | 58.9947 | 2758 | 1.3524          | 0.6867   | 0.6682 |
| 0.2495        | 60.0    | 2805 | 1.7523          | 0.6627   | 0.6427 |
| 0.1247        | 60.9840 | 2851 | 1.7007          | 0.6506   | 0.6372 |
| 0.1436        | 61.9893 | 2898 | 1.9171          | 0.6386   | 0.6120 |
| 0.1438        | 62.9947 | 2945 | 1.8998          | 0.6265   | 0.5897 |
| 0.1137        | 64.0    | 2992 | 2.4028          | 0.5904   | 0.5498 |
| 0.1619        | 64.9840 | 3038 | 1.7087          | 0.7470   | 0.7473 |
| 0.1105        | 65.9893 | 3085 | 1.6545          | 0.6988   | 0.6975 |
| 0.1597        | 66.9947 | 3132 | 1.8024          | 0.6747   | 0.6758 |
| 0.0338        | 68.0    | 3179 | 1.8962          | 0.6747   | 0.6706 |
| 0.1184        | 68.9840 | 3225 | 2.1642          | 0.7108   | 0.7102 |
| 0.0878        | 69.9893 | 3272 | 2.0974          | 0.6506   | 0.6610 |
| 0.0963        | 70.9947 | 3319 | 1.8719          | 0.7108   | 0.7162 |
| 0.0827        | 72.0    | 3366 | 1.7538          | 0.6988   | 0.7000 |
| 0.0933        | 72.9840 | 3412 | 1.9357          | 0.6988   | 0.6988 |
| 0.0593        | 73.9893 | 3459 | 1.9924          | 0.6506   | 0.6420 |
| 0.0423        | 74.9947 | 3506 | 2.2029          | 0.6627   | 0.6702 |
| 0.0311        | 76.0    | 3553 | 1.9236          | 0.7108   | 0.7155 |
| 0.1881        | 76.9840 | 3599 | 1.9606          | 0.6747   | 0.6787 |
| 0.0566        | 77.9893 | 3646 | 2.1122          | 0.6265   | 0.6206 |
| 0.0266        | 78.9947 | 3693 | 2.1469          | 0.6506   | 0.6536 |
| 0.1015        | 80.0    | 3740 | 2.0335          | 0.6506   | 0.6587 |
| 0.1083        | 80.9840 | 3786 | 2.2123          | 0.6506   | 0.6509 |
| 0.0161        | 81.9893 | 3833 | 2.3094          | 0.6988   | 0.7064 |
| 0.0194        | 82.9947 | 3880 | 2.3315          | 0.6145   | 0.6101 |
| 0.113         | 84.0    | 3927 | 2.5276          | 0.6867   | 0.6908 |
| 0.0653        | 84.9840 | 3973 | 2.0321          | 0.6265   | 0.6263 |
| 0.0684        | 85.9893 | 4020 | 2.0302          | 0.6627   | 0.6706 |
| 0.1724        | 86.9947 | 4067 | 2.5865          | 0.5904   | 0.5860 |
| 0.028         | 88.0    | 4114 | 2.3814          | 0.5904   | 0.5804 |
| 0.0528        | 88.9840 | 4160 | 2.2804          | 0.6386   | 0.6410 |
| 0.0341        | 89.9893 | 4207 | 2.0635          | 0.5783   | 0.5736 |
| 0.0074        | 90.9947 | 4254 | 2.3491          | 0.6024   | 0.5993 |
| 0.0165        | 92.0    | 4301 | 2.2152          | 0.6145   | 0.6036 |
| 0.0157        | 92.9840 | 4347 | 2.3380          | 0.6145   | 0.6036 |
| 0.0544        | 93.9893 | 4394 | 2.3319          | 0.6265   | 0.6221 |
| 0.0577        | 94.9947 | 4441 | 2.2671          | 0.6265   | 0.6221 |
| 0.1516        | 96.0    | 4488 | 2.2034          | 0.6265   | 0.6204 |
| 0.0318        | 96.9840 | 4534 | 2.1932          | 0.6265   | 0.6204 |
| 0.043         | 97.9893 | 4581 | 2.2178          | 0.6265   | 0.6204 |
| 0.0099        | 98.3957 | 4600 | 2.2129          | 0.6386   | 0.6328 |
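
The Accuracy and F1 columns above come from the Trainer's per-epoch evaluation. A typical `compute_metrics` function for this setup, using the `evaluate` library, is sketched below; the original training script is not shown, and the `"weighted"` F1 averaging is an assumption.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    """Return accuracy and F1 from a Trainer EvalPrediction (logits, labels)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        # "weighted" averaging is an assumption; the card does not say which average was used.
        "f1": f1.compute(predictions=preds, references=labels, average="weighted")["f1"],
    }
```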

### Framework versions

- Transformers 4.45.2
- Pytorch 2.4.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.1