
vit-base-patch16-224-dmae-va-da-40C

This model is a fine-tuned version of google/vit-base-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3208
  • Accuracy: 0.9302

Model description

More information needed

Intended uses & limitations

More information needed
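
Pending that information, the checkpoint can still be loaded as a standard ViT image classifier. Below is a minimal inference sketch, assuming the Hub repo id Augusto777/vit-base-patch16-224-dmae-va-da-40C and a placeholder image path; the class labels depend on the (undocumented) training dataset.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint as an image-classification pipeline.
# "your_image.jpg" is a placeholder path, not part of the original card.
classifier = pipeline(
    "image-classification",
    model="Augusto777/vit-base-patch16-224-dmae-va-da-40C",
)

predictions = classifier("your_image.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts, highest score first
```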

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 40
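
These values map directly onto transformers.TrainingArguments. The sketch below is an assumed reconstruction of that configuration (the actual training script is not included in this card); output_dir is a placeholder.

```python
from transformers import TrainingArguments

# Sketch only: the listed hyperparameters expressed as TrainingArguments.
# Effective train batch size: 32 (per device) x 4 (gradient accumulation) = 128.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-dmae-va-da-40C",  # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=40,
)
```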

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.92 | 3 | 1.3666 | 0.2093 |
| No log | 1.85 | 6 | 1.3164 | 0.2558 |
| 1.3006 | 2.77 | 9 | 1.2166 | 0.4186 |
| 1.3006 | 4.0 | 13 | 0.9618 | 0.5581 |
| 0.9554 | 4.92 | 16 | 0.8278 | 0.6279 |
| 0.9554 | 5.85 | 19 | 0.7054 | 0.7442 |
| 0.9554 | 6.77 | 22 | 0.6724 | 0.7209 |
| 0.6343 | 8.0 | 26 | 0.6016 | 0.7442 |
| 0.6343 | 8.92 | 29 | 0.5518 | 0.7674 |
| 0.4376 | 9.85 | 32 | 0.4945 | 0.8140 |
| 0.4376 | 10.77 | 35 | 0.5047 | 0.8140 |
| 0.4376 | 12.0 | 39 | 0.4657 | 0.8372 |
| 0.2915 | 12.92 | 42 | 0.4190 | 0.8372 |
| 0.2915 | 13.85 | 45 | 0.4187 | 0.8837 |
| 0.2197 | 14.77 | 48 | 0.3822 | 0.8837 |
| 0.2197 | 16.0 | 52 | 0.3720 | 0.8605 |
| 0.2197 | 16.92 | 55 | 0.3161 | 0.8605 |
| 0.2065 | 17.85 | 58 | 0.3437 | 0.8605 |
| 0.2065 | 18.77 | 61 | 0.3175 | 0.8605 |
| 0.1273 | 20.0 | 65 | 0.3571 | 0.8837 |
| 0.1273 | 20.92 | 68 | 0.3465 | 0.8837 |
| 0.1273 | 21.85 | 71 | 0.3042 | 0.8837 |
| 0.1164 | 22.77 | 74 | 0.3009 | 0.8837 |
| 0.1164 | 24.0 | 78 | 0.3373 | 0.9070 |
| 0.1154 | 24.92 | 81 | 0.2979 | 0.9070 |
| 0.1154 | 25.85 | 84 | 0.2799 | 0.9070 |
| 0.1154 | 26.77 | 87 | 0.2848 | 0.9070 |
| 0.1331 | 28.0 | 91 | 0.3093 | 0.9070 |
| 0.1331 | 28.92 | 94 | 0.3208 | 0.9302 |
| 0.0881 | 29.85 | 97 | 0.2996 | 0.9302 |
| 0.0881 | 30.77 | 100 | 0.2708 | 0.9302 |
| 0.0862 | 32.0 | 104 | 0.2588 | 0.9302 |
| 0.0862 | 32.92 | 107 | 0.2619 | 0.9302 |
| 0.0862 | 33.85 | 110 | 0.2761 | 0.9070 |
| 0.0578 | 34.77 | 113 | 0.2898 | 0.9070 |
| 0.0578 | 36.0 | 117 | 0.2994 | 0.9070 |
| 0.1087 | 36.92 | 120 | 0.3013 | 0.9070 |

Framework versions

  • Transformers 4.34.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1
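
To check that a local environment matches these versions, a small sanity-check sketch (not part of the original card):

```python
import transformers
import torch
import datasets
import tokenizers

# Print installed versions to compare against the ones listed above.
print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)
```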
