
Dinotron

This model is a fine-tuned version of facebook/dinov2-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0265
  • Accuracy: 0.9932
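
The card ships without usage code, so here is a minimal inference sketch. It assumes the checkpoint exposes the standard transformers image-classification interface (DINOv2 backbones are loadable through `AutoModelForImageClassification`); `image.jpg` is a placeholder path, not a file from the original dataset.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("frncscp/dinotron")
model = AutoModelForImageClassification.from_pretrained("frncscp/dinotron")
model.eval()

image = Image.open("image.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the top logit back to its label name via the checkpoint's config.
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```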

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
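
A hedged reconstruction of these settings as transformers `TrainingArguments`; `output_dir` and the surrounding `Trainer` setup are placeholders, not taken from the original run.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dinotron",          # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = 128 total train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```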

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 7    | 0.1146          | 0.9638   |
| 0.3773        | 2.0   | 14   | 0.0336          | 0.9932   |
| 0.0541        | 3.0   | 21   | 0.0402          | 0.9887   |
| 0.0541        | 4.0   | 28   | 0.0463          | 0.9887   |
| 0.0476        | 5.0   | 35   | 0.0594          | 0.9819   |
| 0.1408        | 6.0   | 42   | 0.1296          | 0.9570   |
| 0.1408        | 7.0   | 49   | 0.0872          | 0.9729   |
| 0.0898        | 8.0   | 56   | 0.2245          | 0.9344   |
| 0.216         | 9.0   | 63   | 0.1444          | 0.9570   |
| 0.076         | 10.0  | 70   | 0.0316          | 0.9887   |
| 0.076         | 11.0  | 77   | 0.0411          | 0.9864   |
| 0.0369        | 12.0  | 84   | 0.0275          | 0.9887   |
| 0.0505        | 13.0  | 91   | 0.1610          | 0.9638   |
| 0.0505        | 14.0  | 98   | 0.0513          | 0.9910   |
| 0.0274        | 15.0  | 105  | 0.2366          | 0.9615   |
| 0.0735        | 16.0  | 112  | 0.0738          | 0.9796   |
| 0.0735        | 17.0  | 119  | 0.0529          | 0.9819   |
| 0.0334        | 18.0  | 126  | 0.1024          | 0.9661   |
| 0.0347        | 19.0  | 133  | 0.0919          | 0.9819   |
| 0.0206        | 20.0  | 140  | 0.0851          | 0.9864   |
| 0.0206        | 21.0  | 147  | 0.1004          | 0.9796   |
| 0.0516        | 22.0  | 154  | 0.1706          | 0.9638   |
| 0.0418        | 23.0  | 161  | 0.0505          | 0.9910   |
| 0.0418        | 24.0  | 168  | 0.0939          | 0.9774   |
| 0.0173        | 25.0  | 175  | 0.0553          | 0.9842   |
| 0.0239        | 26.0  | 182  | 0.1255          | 0.9796   |
| 0.0239        | 27.0  | 189  | 0.2256          | 0.9661   |
| 0.0286        | 28.0  | 196  | 0.0943          | 0.9751   |
| 0.0502        | 29.0  | 203  | 0.0937          | 0.9751   |
| 0.0102        | 30.0  | 210  | 0.0910          | 0.9842   |
| 0.0102        | 31.0  | 217  | 0.0336          | 0.9887   |
| 0.0182        | 32.0  | 224  | 0.0870          | 0.9796   |
| 0.0126        | 33.0  | 231  | 0.0565          | 0.9842   |
| 0.0126        | 34.0  | 238  | 0.0541          | 0.9842   |
| 0.0157        | 35.0  | 245  | 0.0591          | 0.9932   |
| 0.0059        | 36.0  | 252  | 0.0985          | 0.9819   |
| 0.0059        | 37.0  | 259  | 0.0813          | 0.9819   |
| 0.0092        | 38.0  | 266  | 0.0239          | 0.9955   |
| 0.0225        | 39.0  | 273  | 0.0982          | 0.9706   |
| 0.0105        | 40.0  | 280  | 0.0113          | 0.9955   |
| 0.0105        | 41.0  | 287  | 0.0127          | 0.9977   |
| 0.007         | 42.0  | 294  | 0.0760          | 0.9887   |
| 0.0032        | 43.0  | 301  | 0.0196          | 0.9932   |
| 0.0032        | 44.0  | 308  | 0.0171          | 0.9932   |
| 0.0206        | 45.0  | 315  | 0.0501          | 0.9910   |
| 0.0001        | 46.0  | 322  | 0.0925          | 0.9842   |
| 0.0001        | 47.0  | 329  | 0.0318          | 0.9910   |
| 0.0017        | 48.0  | 336  | 0.0612          | 0.9864   |
| 0.0023        | 49.0  | 343  | 0.0685          | 0.9864   |
| 0.0013        | 50.0  | 350  | 0.0265          | 0.9932   |

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3
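
To reproduce the environment, the versions above can be pinned directly; the cu118 wheel index below matches the listed `2.0.1+cu118` build and assumes a CUDA 11.8 machine, so adjust it for your platform.

```bash
pip install transformers==4.33.3 datasets==2.14.5 tokenizers==0.13.3
pip install torch==2.0.1 --index-url https://download.pytorch.org/whl/cu118
```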