---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - f1
  - accuracy
model-index:
  - name: SL-CvT
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: F1
            type: f1
            value: 0.9297928229609359
          - name: Accuracy
            type: accuracy
            value: 0.9316640584246219
---

# SL-CvT

This model is a fine-tuned version of [microsoft/cvt-13](https://huggingface.co/microsoft/cvt-13) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.3430
- F1: 0.9298
- Roc Auc: 0.9777
- Accuracy: 0.9317
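The card ships no usage code, so here is a minimal inference sketch. The number of target classes is an assumption for illustration (the card does not state the label set), and the model is built with random weights so the sketch runs offline; in practice you would load this checkpoint with `from_pretrained`.

```python
import torch
from transformers import CvtConfig, CvtForImageClassification

# Hypothetical label count -- this card does not state the dataset's classes.
NUM_LABELS = 7

# In practice, load the fine-tuned weights instead, e.g.:
#   model = CvtForImageClassification.from_pretrained("<this-checkpoint>")
# Here we instantiate a randomly initialised CvT so the sketch is self-contained.
config = CvtConfig(num_labels=NUM_LABELS)
model = CvtForImageClassification(config)
model.eval()

# CvT expects 3-channel images; 224x224 matches the default preprocessing.
pixel_values = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(pixel_values).logits  # one raw score per class

print(logits.shape)  # (batch, NUM_LABELS)
```

For a real image, run it through the checkpoint's `AutoImageProcessor` first to get correctly normalised `pixel_values`.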

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
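The derived quantities above follow from the listed values; a quick arithmetic check (using the 60 optimizer steps per epoch visible in the results table):

```python
# Hyperparameters as listed in this card.
train_batch_size = 32
gradient_accumulation_steps = 4
num_epochs = 100
warmup_ratio = 0.1

# Optimizer steps per epoch, read off the training log (epoch 1.0 -> step 60).
steps_per_epoch = 60

# Effective batch size = per-device batch * accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps

# Total steps and linear-warmup length implied by the schedule.
total_steps = steps_per_epoch * num_epochs
warmup_steps = int(warmup_ratio * total_steps)

print(total_train_batch_size)  # 128, matching total_train_batch_size above
print(total_steps)             # 6000, the final step in the results table
print(warmup_steps)            # 600 steps of linear warmup
```

These numbers also suggest roughly 60 × 128 ≈ 7,680 training examples per epoch, though the card does not state the dataset size directly.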

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     | Roc Auc | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
| 1.2379        | 1.0   | 60   | 1.0716          | 0.6422 | 0.7323  | 0.7246   |
| 1.0186        | 2.0   | 120  | 0.8477          | 0.6425 | 0.7879  | 0.7293   |
| 0.9433        | 3.0   | 180  | 0.7473          | 0.7060 | 0.8454  | 0.7538   |
| 0.8644        | 4.0   | 240  | 0.6831          | 0.7188 | 0.8696  | 0.7663   |
| 0.7985        | 5.0   | 300  | 0.6420          | 0.7409 | 0.8943  | 0.7799   |
| 0.7322        | 6.0   | 360  | 0.5713          | 0.7886 | 0.9196  | 0.8101   |
| 0.725         | 7.0   | 420  | 0.5311          | 0.7989 | 0.9324  | 0.8190   |
| 0.6529        | 8.0   | 480  | 0.5246          | 0.7852 | 0.9404  | 0.8117   |
| 0.6224        | 9.0   | 540  | 0.4598          | 0.8282 | 0.9517  | 0.8440   |
| 0.6315        | 10.0  | 600  | 0.4363          | 0.8457 | 0.9585  | 0.8529   |
| 0.5651        | 11.0  | 660  | 0.4437          | 0.8323 | 0.9564  | 0.8503   |
| 0.574         | 12.0  | 720  | 0.4003          | 0.8531 | 0.9617  | 0.8638   |
| 0.5269        | 13.0  | 780  | 0.3901          | 0.8676 | 0.9671  | 0.8722   |
| 0.5138        | 14.0  | 840  | 0.3984          | 0.8607 | 0.9685  | 0.8732   |
| 0.4839        | 15.0  | 900  | 0.3763          | 0.8683 | 0.9701  | 0.8769   |
| 0.463         | 16.0  | 960  | 0.3398          | 0.8837 | 0.9718  | 0.8894   |
| 0.4767        | 17.0  | 1020 | 0.3293          | 0.8846 | 0.9738  | 0.8915   |
| 0.4985        | 18.0  | 1080 | 0.3350          | 0.8852 | 0.9763  | 0.8863   |
| 0.4657        | 19.0  | 1140 | 0.3369          | 0.8872 | 0.9746  | 0.8951   |
| 0.4514        | 20.0  | 1200 | 0.3213          | 0.8880 | 0.9750  | 0.8925   |
| 0.4207        | 21.0  | 1260 | 0.3175          | 0.8943 | 0.9771  | 0.8978   |
| 0.4522        | 22.0  | 1320 | 0.3229          | 0.8970 | 0.9767  | 0.8983   |
| 0.4328        | 23.0  | 1380 | 0.3121          | 0.8948 | 0.9791  | 0.8978   |
| 0.3942        | 24.0  | 1440 | 0.3111          | 0.8993 | 0.9765  | 0.9030   |
| 0.4414        | 25.0  | 1500 | 0.3062          | 0.9032 | 0.9763  | 0.9061   |
| 0.3608        | 26.0  | 1560 | 0.3099          | 0.8997 | 0.9787  | 0.9014   |
| 0.3729        | 27.0  | 1620 | 0.3050          | 0.9029 | 0.9783  | 0.9082   |
| 0.393         | 28.0  | 1680 | 0.2970          | 0.9090 | 0.9797  | 0.9108   |
| 0.402         | 29.0  | 1740 | 0.2986          | 0.9087 | 0.9793  | 0.9113   |
| 0.3697        | 30.0  | 1800 | 0.3384          | 0.8968 | 0.9769  | 0.9025   |
| 0.3502        | 31.0  | 1860 | 0.3035          | 0.9058 | 0.9789  | 0.9103   |
| 0.3653        | 32.0  | 1920 | 0.3127          | 0.9024 | 0.9788  | 0.9025   |
| 0.3898        | 33.0  | 1980 | 0.3222          | 0.9050 | 0.9778  | 0.9061   |
| 0.317         | 34.0  | 2040 | 0.3013          | 0.9124 | 0.9798  | 0.9139   |
| 0.3166        | 35.0  | 2100 | 0.3185          | 0.9095 | 0.9775  | 0.9134   |
| 0.3771        | 36.0  | 2160 | 0.3067          | 0.9049 | 0.9782  | 0.9066   |
| 0.3487        | 37.0  | 2220 | 0.2948          | 0.9118 | 0.9801  | 0.9134   |
| 0.3202        | 38.0  | 2280 | 0.2916          | 0.9168 | 0.9788  | 0.9186   |
| 0.3163        | 39.0  | 2340 | 0.3149          | 0.9141 | 0.9777  | 0.9155   |
| 0.3605        | 40.0  | 2400 | 0.2964          | 0.9192 | 0.9797  | 0.9207   |
| 0.3636        | 41.0  | 2460 | 0.3142          | 0.9111 | 0.9810  | 0.9134   |
| 0.3454        | 42.0  | 2520 | 0.3133          | 0.9111 | 0.9792  | 0.9113   |
| 0.3561        | 43.0  | 2580 | 0.3090          | 0.9073 | 0.9804  | 0.9077   |
| 0.3136        | 44.0  | 2640 | 0.3236          | 0.9144 | 0.9782  | 0.9176   |
| 0.3529        | 45.0  | 2700 | 0.3054          | 0.9175 | 0.9800  | 0.9202   |
| 0.2987        | 46.0  | 2760 | 0.2944          | 0.9222 | 0.9802  | 0.9233   |
| 0.2966        | 47.0  | 2820 | 0.3215          | 0.9201 | 0.9786  | 0.9233   |
| 0.3203        | 48.0  | 2880 | 0.3150          | 0.9219 | 0.9797  | 0.9244   |
| 0.2821        | 49.0  | 2940 | 0.3072          | 0.9273 | 0.9800  | 0.9291   |
| 0.2852        | 50.0  | 3000 | 0.3265          | 0.9155 | 0.9792  | 0.9176   |
| 0.3544        | 51.0  | 3060 | 0.3175          | 0.9150 | 0.9802  | 0.9150   |
| 0.3327        | 52.0  | 3120 | 0.3134          | 0.9222 | 0.9802  | 0.9244   |
| 0.2877        | 53.0  | 3180 | 0.3222          | 0.9154 | 0.9805  | 0.9165   |
| 0.3089        | 54.0  | 3240 | 0.3045          | 0.9248 | 0.9811  | 0.9259   |
| 0.2904        | 55.0  | 3300 | 0.3301          | 0.9175 | 0.9787  | 0.9186   |
| 0.2821        | 56.0  | 3360 | 0.3069          | 0.9206 | 0.9810  | 0.9218   |
| 0.321         | 57.0  | 3420 | 0.3209          | 0.9254 | 0.9800  | 0.9270   |
| 0.2995        | 58.0  | 3480 | 0.3281          | 0.9202 | 0.9802  | 0.9233   |
| 0.2683        | 59.0  | 3540 | 0.3263          | 0.9174 | 0.9802  | 0.9202   |
| 0.3021        | 60.0  | 3600 | 0.3484          | 0.9170 | 0.9788  | 0.9186   |
| 0.3262        | 61.0  | 3660 | 0.3270          | 0.9151 | 0.9807  | 0.9165   |
| 0.2329        | 62.0  | 3720 | 0.3280          | 0.9211 | 0.9807  | 0.9233   |
| 0.2935        | 63.0  | 3780 | 0.3296          | 0.9244 | 0.9807  | 0.9264   |
| 0.2856        | 64.0  | 3840 | 0.3323          | 0.9209 | 0.9811  | 0.9218   |
| 0.2829        | 65.0  | 3900 | 0.3390          | 0.9200 | 0.9802  | 0.9218   |
| 0.3044        | 66.0  | 3960 | 0.3324          | 0.9215 | 0.9799  | 0.9228   |
| 0.2767        | 67.0  | 4020 | 0.3496          | 0.9150 | 0.9778  | 0.9160   |
| 0.2936        | 68.0  | 4080 | 0.3378          | 0.9257 | 0.9790  | 0.9275   |
| 0.2884        | 69.0  | 4140 | 0.3493          | 0.9227 | 0.9790  | 0.9249   |
| 0.2906        | 70.0  | 4200 | 0.3408          | 0.9259 | 0.9794  | 0.9275   |
| 0.2542        | 71.0  | 4260 | 0.3559          | 0.9233 | 0.9769  | 0.9249   |
| 0.2557        | 72.0  | 4320 | 0.3481          | 0.9237 | 0.9779  | 0.9254   |
| 0.2266        | 73.0  | 4380 | 0.3518          | 0.9208 | 0.9781  | 0.9223   |
| 0.2771        | 74.0  | 4440 | 0.3544          | 0.9231 | 0.9776  | 0.9254   |
| 0.2747        | 75.0  | 4500 | 0.3469          | 0.9270 | 0.9780  | 0.9285   |
| 0.2443        | 76.0  | 4560 | 0.3513          | 0.9216 | 0.9767  | 0.9233   |
| 0.2859        | 77.0  | 4620 | 0.3456          | 0.9234 | 0.9771  | 0.9254   |
| 0.2677        | 78.0  | 4680 | 0.3474          | 0.9239 | 0.9780  | 0.9254   |
| 0.2492        | 79.0  | 4740 | 0.3513          | 0.9235 | 0.9778  | 0.9254   |
| 0.2532        | 80.0  | 4800 | 0.3524          | 0.9210 | 0.9773  | 0.9233   |
| 0.2646        | 81.0  | 4860 | 0.3529          | 0.9240 | 0.9784  | 0.9238   |
| 0.2842        | 82.0  | 4920 | 0.3433          | 0.9260 | 0.9777  | 0.9280   |
| 0.2872        | 83.0  | 4980 | 0.3584          | 0.9272 | 0.9771  | 0.9285   |
| 0.2678        | 84.0  | 5040 | 0.3430          | 0.9298 | 0.9777  | 0.9317   |
| 0.2705        | 85.0  | 5100 | 0.3534          | 0.9268 | 0.9777  | 0.9291   |
| 0.2605        | 86.0  | 5160 | 0.3574          | 0.9272 | 0.9777  | 0.9296   |
| 0.2572        | 87.0  | 5220 | 0.3426          | 0.9273 | 0.9781  | 0.9291   |
| 0.2646        | 88.0  | 5280 | 0.3472          | 0.9234 | 0.9789  | 0.9244   |
| 0.2831        | 89.0  | 5340 | 0.3433          | 0.9272 | 0.9779  | 0.9291   |
| 0.277         | 90.0  | 5400 | 0.3441          | 0.9263 | 0.9789  | 0.9280   |
| 0.2584        | 91.0  | 5460 | 0.3432          | 0.9236 | 0.9788  | 0.9249   |
| 0.2703        | 92.0  | 5520 | 0.3409          | 0.9248 | 0.9789  | 0.9259   |
| 0.2811        | 93.0  | 5580 | 0.3449          | 0.9215 | 0.9795  | 0.9228   |
| 0.2786        | 94.0  | 5640 | 0.3465          | 0.9260 | 0.9789  | 0.9280   |
| 0.267         | 95.0  | 5700 | 0.3472          | 0.9260 | 0.9791  | 0.9275   |
| 0.2695        | 96.0  | 5760 | 0.3500          | 0.9268 | 0.9786  | 0.9285   |
| 0.279         | 97.0  | 5820 | 0.3582          | 0.9249 | 0.9782  | 0.9270   |
| 0.2774        | 98.0  | 5880 | 0.3486          | 0.9251 | 0.9790  | 0.9270   |
| 0.2512        | 99.0  | 5940 | 0.3514          | 0.9287 | 0.9786  | 0.9306   |
| 0.2218        | 100.0 | 6000 | 0.3482          | 0.9269 | 0.9789  | 0.9285   |

### Framework versions

- Transformers 4.29.2
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3