# swin-tiny-patch4-window7-224-finetuned-tekno24-highdata-90-3rd

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 0.9108
- Accuracy: 0.6221
- F1: 0.6116
- Precision: 0.6170
- Recall: 0.6221
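
Since the card does not yet document usage, below is a minimal sketch of loading this checkpoint for image classification with the `transformers` pipeline API. The repo id matches this card; the label set and expected image domain depend on the undocumented fine-tuning dataset, and `example.jpg` is a placeholder.

```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned Swin checkpoint; the pipeline also pulls the
# matching image processor from the same repo.
classifier = pipeline(
    "image-classification",
    model="BTX24/swin-tiny-patch4-window7-224-finetuned-tekno24-highdata-90-3rd",
)

# "example.jpg" is a placeholder; substitute an image from the target domain.
image = Image.open("example.jpg")
print(classifier(image))
```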

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128 (train_batch_size × gradient_accumulation_steps = 16 × 8)
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP
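
For reference, these settings map onto `transformers.TrainingArguments` roughly as sketched below. This is reconstructed from the list above, not the author's actual training script; `output_dir` and the surrounding Trainer/dataset wiring are assumptions.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters. output_dir is an
# assumption; model and dataset setup are omitted.
training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-tekno24-highdata-90-3rd",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=8,   # effective batch size: 16 * 8 = 128
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    num_train_epochs=30,
    fp16=True,                       # Native AMP mixed precision
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```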

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.3622        | 0.9786  | 40   | 1.3459          | 0.3687   | 0.2453 | 0.2574    | 0.3687 |
| 1.335         | 1.9817  | 81   | 1.3295          | 0.3779   | 0.3635 | 0.3592    | 0.3779 |
| 1.2404        | 2.9847  | 122  | 1.1256          | 0.5023   | 0.4893 | 0.5048    | 0.5023 |
| 1.2113        | 3.9878  | 163  | 1.1081          | 0.5346   | 0.4918 | 0.5409    | 0.5346 |
| 1.1617        | 4.9908  | 204  | 1.0667          | 0.5300   | 0.4938 | 0.5204    | 0.5300 |
| 1.1758        | 5.9939  | 245  | 1.1505          | 0.4747   | 0.4713 | 0.5074    | 0.4747 |
| 1.1618        | 6.9969  | 286  | 1.1316          | 0.4931   | 0.4779 | 0.4950    | 0.4931 |
| 1.1748        | 8.0     | 327  | 1.0681          | 0.5161   | 0.4827 | 0.5256    | 0.5161 |
| 1.1421        | 8.9786  | 367  | 0.9743          | 0.5714   | 0.5445 | 0.5488    | 0.5714 |
| 1.1565        | 9.9817  | 408  | 0.9705          | 0.5622   | 0.5142 | 0.5429    | 0.5622 |
| 1.1297        | 10.9847 | 449  | 0.9879          | 0.5530   | 0.5365 | 0.5343    | 0.5530 |
| 1.1249        | 11.9878 | 490  | 0.9852          | 0.5760   | 0.5401 | 0.6055    | 0.5760 |
| 1.1289        | 12.9908 | 531  | 0.9555          | 0.5714   | 0.5363 | 0.5409    | 0.5714 |
| 1.1102        | 13.9939 | 572  | 0.9438          | 0.5991   | 0.5795 | 0.6033    | 0.5991 |
| 1.1011        | 14.9969 | 613  | 0.9492          | 0.5991   | 0.5840 | 0.6016    | 0.5991 |
| 1.1293        | 16.0    | 654  | 0.9826          | 0.5714   | 0.5548 | 0.6000    | 0.5714 |
| 1.0706        | 16.9786 | 694  | 0.9465          | 0.5945   | 0.5739 | 0.6001    | 0.5945 |
| 1.0825        | 17.9817 | 735  | 0.9268          | 0.6083   | 0.5861 | 0.6083    | 0.6083 |
| 1.0989        | 18.9847 | 776  | 0.9349          | 0.6083   | 0.5900 | 0.6168    | 0.6083 |
| 1.0541        | 19.9878 | 817  | 0.9408          | 0.6175   | 0.6103 | 0.6332    | 0.6175 |
| 1.0883        | 20.9908 | 858  | 0.9108          | 0.6221   | 0.6116 | 0.6170    | 0.6221 |
| 1.0828        | 21.9939 | 899  | 0.9412          | 0.5991   | 0.5913 | 0.6205    | 0.5991 |
| 1.0492        | 22.9969 | 940  | 0.9263          | 0.6129   | 0.6003 | 0.6442    | 0.6129 |
| 1.0486        | 24.0    | 981  | 0.9254          | 0.6129   | 0.6069 | 0.6137    | 0.6129 |
| 1.0648        | 24.9786 | 1021 | 0.9165          | 0.5991   | 0.5880 | 0.6002    | 0.5991 |
| 1.079         | 25.9817 | 1062 | 0.9294          | 0.5899   | 0.5795 | 0.6006    | 0.5899 |
| 1.0459        | 26.9847 | 1103 | 0.9237          | 0.6083   | 0.5953 | 0.6168    | 0.6083 |
| 1.057         | 27.9878 | 1144 | 0.9233          | 0.6083   | 0.5953 | 0.6168    | 0.6083 |
| 1.0496        | 28.9908 | 1185 | 0.9237          | 0.5991   | 0.5874 | 0.6077    | 0.5991 |
| 1.0509        | 29.3578 | 1200 | 0.9238          | 0.5991   | 0.5874 | 0.6077    | 0.5991 |
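
In every row above, Recall equals Accuracy, which is characteristic of weighted-average recall. A metric function consistent with these columns could look like the sketch below, assuming scikit-learn; the actual metric code is not part of this card, and weighted averaging is an inference from the table.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Trainer-compatible metric sketch; weighted averaging is an
    assumption inferred from Recall == Accuracy in the results table."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```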

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
