---
library_name: transformers
base_model: openai/clip-vit-large-patch14
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: clip-vit-large-patch14-finetuned-clip-vit-large-patch14-mnist_linear_probe
    results: []
---

# clip-vit-large-patch14-finetuned-clip-vit-large-patch14-mnist_linear_probe

This model is a fine-tuned version of [openai/clip-vit-large-patch14](https://huggingface.co/openai/clip-vit-large-patch14), trained as a linear probe on MNIST (as the model name suggests; the dataset is not otherwise documented in this card). It achieves the following results on the evaluation set, with a usage sketch after the results:

- Loss: 0.9687
- Accuracy: 0.9137
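
A minimal inference sketch, assuming the checkpoint carries the standard Transformers image-classification head produced by the `Trainer`; the repository id and image path below are placeholders, not taken from this card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repository id; replace with the actual checkpoint location.
model_id = "tangg555/clip-vit-large-patch14-finetuned-clip-vit-large-patch14-mnist_linear_probe"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder path: any MNIST-style digit image, converted to RGB for the CLIP processor.
image = Image.open("digit.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```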

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
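
A hedged sketch of the corresponding `TrainingArguments`; the output directory and the script wiring around it are assumptions, not taken from this card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. The Adam settings reported in the
# card (betas=(0.9, 0.999), epsilon=1e-08) match the Trainer's default
# optimizer, so they need no extra flags here.
training_args = TrainingArguments(
    output_dir="clip-vit-large-patch14-mnist_linear_probe",  # assumed output path
    learning_rate=5e-05,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=4,  # 128 * 4 = 512 effective train batch size on one device
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
)
```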

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 2.4414        | 0.9953  | 105  | 2.3838          | 0.102    |
| 2.2431        | 2.0     | 211  | 2.2100          | 0.2698   |
| 2.1251        | 2.9953  | 316  | 2.0662          | 0.4987   |
| 2.0119        | 4.0     | 422  | 1.9232          | 0.6905   |
| 1.9134        | 4.9953  | 527  | 1.8080          | 0.7645   |
| 1.8314        | 6.0     | 633  | 1.7059          | 0.8053   |
| 1.7816        | 6.9953  | 738  | 1.6175          | 0.8215   |
| 1.7076        | 8.0     | 844  | 1.5373          | 0.845    |
| 1.6632        | 8.9953  | 949  | 1.4675          | 0.8592   |
| 1.6188        | 10.0    | 1055 | 1.4062          | 0.863    |
| 1.5606        | 10.9953 | 1160 | 1.3510          | 0.8718   |
| 1.5185        | 12.0    | 1266 | 1.3031          | 0.8718   |
| 1.5007        | 12.9953 | 1371 | 1.2591          | 0.881    |
| 1.4573        | 14.0    | 1477 | 1.2201          | 0.8833   |
| 1.4474        | 14.9953 | 1582 | 1.1841          | 0.8875   |
| 1.4308        | 16.0    | 1688 | 1.1524          | 0.8925   |
| 1.4091        | 16.9953 | 1793 | 1.1246          | 0.8943   |
| 1.3683        | 18.0    | 1899 | 1.0986          | 0.8985   |
| 1.365         | 18.9953 | 2004 | 1.0752          | 0.9042   |
| 1.3635        | 20.0    | 2110 | 1.0563          | 0.9033   |
| 1.3422        | 20.9953 | 2215 | 1.0389          | 0.9043   |
| 1.3248        | 22.0    | 2321 | 1.0231          | 0.9083   |
| 1.2961        | 22.9953 | 2426 | 1.0100          | 0.9093   |
| 1.3136        | 24.0    | 2532 | 0.9986          | 0.9107   |
| 1.3067        | 24.9953 | 2637 | 0.9897          | 0.911    |
| 1.2984        | 26.0    | 2743 | 0.9818          | 0.9115   |
| 1.3045        | 26.9953 | 2848 | 0.9759          | 0.9132   |
| 1.291         | 28.0    | 2954 | 0.9717          | 0.9132   |
| 1.2731        | 28.9953 | 3059 | 0.9692          | 0.9142   |
| 1.3034        | 29.8578 | 3150 | 0.9687          | 0.9137   |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1