---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swin-tiny-patch4-window7-224-finetuned-ADC-4cls-0922
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7
---

# swin-tiny-patch4-window7-224-finetuned-ADC-4cls-0922

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.8947
- Accuracy: 0.7
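
The snippet below is a minimal inference sketch. The repo id `Niraya666/swin-tiny-patch4-window7-224-finetuned-ADC-4cls-0922` and the input image path are assumptions, not confirmed by this card:

```python
from transformers import pipeline

# Assumed repo id; replace with wherever this checkpoint is actually hosted.
classifier = pipeline(
    "image-classification",
    model="Niraya666/swin-tiny-patch4-window7-224-finetuned-ADC-4cls-0922",
)

# Accepts a local path, URL, or PIL.Image; "sample.png" is a placeholder.
predictions = classifier("sample.png")
print(predictions)  # [{'label': ..., 'score': ...}, ...] over the 4 classes
```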

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
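
The card only records that training used the generic `imagefolder` loader. As a hedged sketch, data laid out as one directory per class can be loaded like this (the directory path and split layout are assumptions):

```python
from datasets import load_dataset

# Assumed layout: path/to/data/{train,test}/<class_name>/*.png
dataset = load_dataset("imagefolder", data_dir="path/to/data")

print(dataset)                                   # DatasetDict of discovered splits
print(dataset["train"].features["label"].names)  # class names inferred from folders
```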

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 200
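
These values map directly onto `transformers.TrainingArguments`. A minimal sketch, assuming a single device (so 64 per-device examples x 4 accumulation steps give the listed total train batch size of 256) and an assumed `output_dir`:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-ADC-4cls-0922",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,  # 64 * 4 = 256 effective batch on one device
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.2,
    num_train_epochs=200,
)
```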

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 0.9655 | 0.6714 |
| No log | 2.0 | 4 | 0.9654 | 0.6571 |
| No log | 3.0 | 6 | 0.9651 | 0.6571 |
| No log | 4.0 | 8 | 0.9647 | 0.6571 |
| 1.0064 | 5.0 | 10 | 0.9641 | 0.6571 |
| 1.0064 | 6.0 | 12 | 0.9635 | 0.6571 |
| 1.0064 | 7.0 | 14 | 0.9629 | 0.6571 |
| 1.0064 | 8.0 | 16 | 0.9623 | 0.6571 |
| 1.0064 | 9.0 | 18 | 0.9617 | 0.6571 |
| 0.9821 | 10.0 | 20 | 0.9611 | 0.6571 |
| 0.9821 | 11.0 | 22 | 0.9607 | 0.6571 |
| 0.9821 | 12.0 | 24 | 0.9604 | 0.6714 |
| 0.9821 | 13.0 | 26 | 0.9601 | 0.6714 |
| 0.9821 | 14.0 | 28 | 0.9597 | 0.6714 |
| 1.0278 | 15.0 | 30 | 0.9592 | 0.6714 |
| 1.0278 | 16.0 | 32 | 0.9581 | 0.6714 |
| 1.0278 | 17.0 | 34 | 0.9567 | 0.6714 |
| 1.0278 | 18.0 | 36 | 0.9551 | 0.6714 |
| 1.0278 | 19.0 | 38 | 0.9534 | 0.6714 |
| 0.9986 | 20.0 | 40 | 0.9514 | 0.6571 |
| 0.9986 | 21.0 | 42 | 0.9493 | 0.6571 |
| 0.9986 | 22.0 | 44 | 0.9472 | 0.6429 |
| 0.9986 | 23.0 | 46 | 0.9452 | 0.6429 |
| 0.9986 | 24.0 | 48 | 0.9434 | 0.6429 |
| 0.9973 | 25.0 | 50 | 0.9420 | 0.6429 |
| 0.9973 | 26.0 | 52 | 0.9405 | 0.6429 |
| 0.9973 | 27.0 | 54 | 0.9387 | 0.6286 |
| 0.9973 | 28.0 | 56 | 0.9376 | 0.6286 |
| 0.9973 | 29.0 | 58 | 0.9368 | 0.6429 |
| 0.9936 | 30.0 | 60 | 0.9362 | 0.6429 |
| 0.9936 | 31.0 | 62 | 0.9361 | 0.6571 |
| 0.9936 | 32.0 | 64 | 0.9364 | 0.6714 |
| 0.9936 | 33.0 | 66 | 0.9371 | 0.6714 |
| 0.9936 | 34.0 | 68 | 0.9380 | 0.6429 |
| 0.9746 | 35.0 | 70 | 0.9380 | 0.6571 |
| 0.9746 | 36.0 | 72 | 0.9375 | 0.6714 |
| 0.9746 | 37.0 | 74 | 0.9380 | 0.6714 |
| 0.9746 | 38.0 | 76 | 0.9375 | 0.6714 |
| 0.9746 | 39.0 | 78 | 0.9370 | 0.6714 |
| 1.0113 | 40.0 | 80 | 0.9362 | 0.6714 |
| 1.0113 | 41.0 | 82 | 0.9341 | 0.6714 |
| 1.0113 | 42.0 | 84 | 0.9301 | 0.6857 |
| 1.0113 | 43.0 | 86 | 0.9260 | 0.6714 |
| 1.0113 | 44.0 | 88 | 0.9224 | 0.6571 |
| 0.9756 | 45.0 | 90 | 0.9190 | 0.6714 |
| 0.9756 | 46.0 | 92 | 0.9154 | 0.6714 |
| 0.9756 | 47.0 | 94 | 0.9123 | 0.6714 |
| 0.9756 | 48.0 | 96 | 0.9091 | 0.6571 |
| 0.9756 | 49.0 | 98 | 0.9071 | 0.6571 |
| 0.9721 | 50.0 | 100 | 0.9056 | 0.6571 |
| 0.9721 | 51.0 | 102 | 0.9047 | 0.6571 |
| 0.9721 | 52.0 | 104 | 0.9039 | 0.6571 |
| 0.9721 | 53.0 | 106 | 0.9031 | 0.6714 |
| 0.9721 | 54.0 | 108 | 0.9025 | 0.6714 |
| 0.9698 | 55.0 | 110 | 0.9023 | 0.6714 |
| 0.9698 | 56.0 | 112 | 0.9012 | 0.6714 |
| 0.9698 | 57.0 | 114 | 0.8997 | 0.6714 |
| 0.9698 | 58.0 | 116 | 0.8982 | 0.6714 |
| 0.9698 | 59.0 | 118 | 0.8970 | 0.6714 |
| 0.9341 | 60.0 | 120 | 0.8957 | 0.6857 |
| 0.9341 | 61.0 | 122 | 0.8947 | 0.7 |
| 0.9341 | 62.0 | 124 | 0.8940 | 0.7 |
| 0.9341 | 63.0 | 126 | 0.8941 | 0.6714 |
| 0.9341 | 64.0 | 128 | 0.8934 | 0.6714 |
| 0.9717 | 65.0 | 130 | 0.8917 | 0.6714 |
| 0.9717 | 66.0 | 132 | 0.8898 | 0.6857 |
| 0.9717 | 67.0 | 134 | 0.8884 | 0.6857 |
| 0.9717 | 68.0 | 136 | 0.8870 | 0.6857 |
| 0.9717 | 69.0 | 138 | 0.8854 | 0.6857 |
| 0.9655 | 70.0 | 140 | 0.8840 | 0.6857 |
| 0.9655 | 71.0 | 142 | 0.8827 | 0.6857 |
| 0.9655 | 72.0 | 144 | 0.8814 | 0.6857 |
| 0.9655 | 73.0 | 146 | 0.8805 | 0.6857 |
| 0.9655 | 74.0 | 148 | 0.8803 | 0.6857 |
| 0.9458 | 75.0 | 150 | 0.8802 | 0.6857 |
| 0.9458 | 76.0 | 152 | 0.8797 | 0.6714 |
| 0.9458 | 77.0 | 154 | 0.8794 | 0.6714 |
| 0.9458 | 78.0 | 156 | 0.8796 | 0.6714 |
| 0.9458 | 79.0 | 158 | 0.8808 | 0.6714 |
| 0.9094 | 80.0 | 160 | 0.8817 | 0.6714 |
| 0.9094 | 81.0 | 162 | 0.8828 | 0.6714 |
| 0.9094 | 82.0 | 164 | 0.8836 | 0.6714 |
| 0.9094 | 83.0 | 166 | 0.8830 | 0.6714 |
| 0.9094 | 84.0 | 168 | 0.8821 | 0.6571 |
| 0.8719 | 85.0 | 170 | 0.8813 | 0.6571 |
| 0.8719 | 86.0 | 172 | 0.8804 | 0.6714 |
| 0.8719 | 87.0 | 174 | 0.8798 | 0.6571 |
| 0.8719 | 88.0 | 176 | 0.8787 | 0.6571 |
| 0.8719 | 89.0 | 178 | 0.8770 | 0.6571 |
| 0.9288 | 90.0 | 180 | 0.8752 | 0.6857 |
| 0.9288 | 91.0 | 182 | 0.8722 | 0.6857 |
| 0.9288 | 92.0 | 184 | 0.8694 | 0.6714 |
| 0.9288 | 93.0 | 186 | 0.8670 | 0.6714 |
| 0.9288 | 94.0 | 188 | 0.8645 | 0.6857 |
| 0.9039 | 95.0 | 190 | 0.8624 | 0.6857 |
| 0.9039 | 96.0 | 192 | 0.8603 | 0.6714 |
| 0.9039 | 97.0 | 194 | 0.8584 | 0.6857 |
| 0.9039 | 98.0 | 196 | 0.8566 | 0.6857 |
| 0.9039 | 99.0 | 198 | 0.8553 | 0.6857 |
| 0.9081 | 100.0 | 200 | 0.8550 | 0.6857 |
| 0.9081 | 101.0 | 202 | 0.8551 | 0.6857 |
| 0.9081 | 102.0 | 204 | 0.8556 | 0.6857 |
| 0.9081 | 103.0 | 206 | 0.8558 | 0.6857 |
| 0.9081 | 104.0 | 208 | 0.8554 | 0.6857 |
| 0.9142 | 105.0 | 210 | 0.8551 | 0.6857 |
| 0.9142 | 106.0 | 212 | 0.8553 | 0.6857 |
| 0.9142 | 107.0 | 214 | 0.8551 | 0.6857 |
| 0.9142 | 108.0 | 216 | 0.8549 | 0.6857 |
| 0.9142 | 109.0 | 218 | 0.8549 | 0.6857 |
| 0.9347 | 110.0 | 220 | 0.8551 | 0.6714 |
| 0.9347 | 111.0 | 222 | 0.8554 | 0.6714 |
| 0.9347 | 112.0 | 224 | 0.8548 | 0.6714 |
| 0.9347 | 113.0 | 226 | 0.8538 | 0.6714 |
| 0.9347 | 114.0 | 228 | 0.8525 | 0.6714 |
| 0.8922 | 115.0 | 230 | 0.8512 | 0.6857 |
| 0.8922 | 116.0 | 232 | 0.8505 | 0.6857 |
| 0.8922 | 117.0 | 234 | 0.8495 | 0.6857 |
| 0.8922 | 118.0 | 236 | 0.8484 | 0.6857 |
| 0.8922 | 119.0 | 238 | 0.8472 | 0.6857 |
| 0.8897 | 120.0 | 240 | 0.8456 | 0.6857 |
| 0.8897 | 121.0 | 242 | 0.8440 | 0.6857 |
| 0.8897 | 122.0 | 244 | 0.8426 | 0.6714 |
| 0.8897 | 123.0 | 246 | 0.8412 | 0.6857 |
| 0.8897 | 124.0 | 248 | 0.8396 | 0.6857 |
| 0.8829 | 125.0 | 250 | 0.8384 | 0.6857 |
| 0.8829 | 126.0 | 252 | 0.8373 | 0.6857 |
| 0.8829 | 127.0 | 254 | 0.8365 | 0.6857 |
| 0.8829 | 128.0 | 256 | 0.8360 | 0.6857 |
| 0.8829 | 129.0 | 258 | 0.8353 | 0.6857 |
| 0.8744 | 130.0 | 260 | 0.8344 | 0.6857 |
| 0.8744 | 131.0 | 262 | 0.8337 | 0.6714 |
| 0.8744 | 132.0 | 264 | 0.8329 | 0.6857 |
| 0.8744 | 133.0 | 266 | 0.8325 | 0.6857 |
| 0.8744 | 134.0 | 268 | 0.8318 | 0.6857 |
| 0.8657 | 135.0 | 270 | 0.8312 | 0.6857 |
| 0.8657 | 136.0 | 272 | 0.8306 | 0.6714 |
| 0.8657 | 137.0 | 274 | 0.8300 | 0.6714 |
| 0.8657 | 138.0 | 276 | 0.8296 | 0.6714 |
| 0.8657 | 139.0 | 278 | 0.8294 | 0.6714 |
| 0.9421 | 140.0 | 280 | 0.8292 | 0.6714 |
| 0.9421 | 141.0 | 282 | 0.8291 | 0.6714 |
| 0.9421 | 142.0 | 284 | 0.8290 | 0.6714 |
| 0.9421 | 143.0 | 286 | 0.8290 | 0.6857 |
| 0.9421 | 144.0 | 288 | 0.8289 | 0.6857 |
| 0.9066 | 145.0 | 290 | 0.8287 | 0.6857 |
| 0.9066 | 146.0 | 292 | 0.8290 | 0.6857 |
| 0.9066 | 147.0 | 294 | 0.8293 | 0.6857 |
| 0.9066 | 148.0 | 296 | 0.8294 | 0.6857 |
| 0.9066 | 149.0 | 298 | 0.8295 | 0.6857 |
| 0.9068 | 150.0 | 300 | 0.8295 | 0.6857 |
| 0.9068 | 151.0 | 302 | 0.8294 | 0.6857 |
| 0.9068 | 152.0 | 304 | 0.8293 | 0.6857 |
| 0.9068 | 153.0 | 306 | 0.8293 | 0.6857 |
| 0.9068 | 154.0 | 308 | 0.8290 | 0.6857 |
| 0.8715 | 155.0 | 310 | 0.8287 | 0.6857 |
| 0.8715 | 156.0 | 312 | 0.8283 | 0.6857 |
| 0.8715 | 157.0 | 314 | 0.8277 | 0.6857 |
| 0.8715 | 158.0 | 316 | 0.8274 | 0.6857 |
| 0.8715 | 159.0 | 318 | 0.8269 | 0.6857 |
| 0.8921 | 160.0 | 320 | 0.8266 | 0.6857 |
| 0.8921 | 161.0 | 322 | 0.8264 | 0.6857 |
| 0.8921 | 162.0 | 324 | 0.8261 | 0.6857 |
| 0.8921 | 163.0 | 326 | 0.8260 | 0.6857 |
| 0.8921 | 164.0 | 328 | 0.8258 | 0.6857 |
| 0.8768 | 165.0 | 330 | 0.8252 | 0.6857 |
| 0.8768 | 166.0 | 332 | 0.8248 | 0.6857 |
| 0.8768 | 167.0 | 334 | 0.8243 | 0.6857 |
| 0.8768 | 168.0 | 336 | 0.8237 | 0.6857 |
| 0.8768 | 169.0 | 338 | 0.8231 | 0.6857 |
| 0.8519 | 170.0 | 340 | 0.8227 | 0.6857 |
| 0.8519 | 171.0 | 342 | 0.8223 | 0.6857 |
| 0.8519 | 172.0 | 344 | 0.8221 | 0.6857 |
| 0.8519 | 173.0 | 346 | 0.8220 | 0.6857 |
| 0.8519 | 174.0 | 348 | 0.8218 | 0.6857 |
| 0.92 | 175.0 | 350 | 0.8215 | 0.6857 |
| 0.92 | 176.0 | 352 | 0.8211 | 0.7 |
| 0.92 | 177.0 | 354 | 0.8207 | 0.7 |
| 0.92 | 178.0 | 356 | 0.8204 | 0.7 |
| 0.92 | 179.0 | 358 | 0.8200 | 0.7 |
| 0.879 | 180.0 | 360 | 0.8197 | 0.7 |
| 0.879 | 181.0 | 362 | 0.8194 | 0.7 |
| 0.879 | 182.0 | 364 | 0.8191 | 0.6857 |
| 0.879 | 183.0 | 366 | 0.8187 | 0.6857 |
| 0.879 | 184.0 | 368 | 0.8185 | 0.7 |
| 0.8893 | 185.0 | 370 | 0.8182 | 0.7 |
| 0.8893 | 186.0 | 372 | 0.8180 | 0.7 |
| 0.8893 | 187.0 | 374 | 0.8177 | 0.7 |
| 0.8893 | 188.0 | 376 | 0.8176 | 0.7 |
| 0.8893 | 189.0 | 378 | 0.8175 | 0.7 |
| 0.8501 | 190.0 | 380 | 0.8173 | 0.7 |
| 0.8501 | 191.0 | 382 | 0.8171 | 0.7 |
| 0.8501 | 192.0 | 384 | 0.8170 | 0.7 |
| 0.8501 | 193.0 | 386 | 0.8169 | 0.7 |
| 0.8501 | 194.0 | 388 | 0.8169 | 0.7 |
| 0.8611 | 195.0 | 390 | 0.8168 | 0.7 |
| 0.8611 | 196.0 | 392 | 0.8168 | 0.7 |
| 0.8611 | 197.0 | 394 | 0.8168 | 0.7 |
| 0.8611 | 198.0 | 396 | 0.8168 | 0.7 |
| 0.8611 | 199.0 | 398 | 0.8168 | 0.7 |
| 0.8881 | 200.0 | 400 | 0.8168 | 0.7 |

### Framework versions

- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3