
MAE-CT-M1N0-M12_v8_split4

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4312
  • Accuracy: 0.8267

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6400
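
With a warmup ratio of 0.1 over 6400 training steps, the linear schedule implies 640 warmup steps. As a minimal sketch (mirroring the behavior of a standard linear warmup-then-decay schedule, not the exact trainer code), the learning rate at a given step would be:

```python
def linear_warmup_decay_lr(step, base_lr=1e-5, total_steps=6400, warmup_ratio=0.1):
    """Linear ramp from 0 to base_lr over the warmup steps, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)  # 640 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # warmup phase
    # decay phase: reaches 0 exactly at total_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)
```

Under these settings the rate peaks at 1e-05 at step 640 and decays to zero by step 6400.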

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.6773        | 0.0102  | 65   | 0.7109          | 0.4348   |
| 0.7393        | 1.0102  | 130  | 0.7720          | 0.4348   |
| 0.6483        | 2.0102  | 195  | 0.8131          | 0.4348   |
| 0.5872        | 3.0102  | 260  | 0.7178          | 0.4348   |
| 0.5612        | 4.0102  | 325  | 0.6203          | 0.6957   |
| 0.2855        | 5.0102  | 390  | 0.7647          | 0.3913   |
| 0.3332        | 6.0102  | 455  | 0.9563          | 0.3913   |
| 0.5376        | 7.0102  | 520  | 1.0380          | 0.4348   |
| 0.3236        | 8.0102  | 585  | 0.6013          | 0.7826   |
| 0.2583        | 9.0102  | 650  | 0.6642          | 0.6957   |
| 0.519         | 10.0102 | 715  | 0.8797          | 0.6522   |
| 0.2594        | 11.0102 | 780  | 0.8123          | 0.7391   |
| 0.2015        | 12.0102 | 845  | 1.2630          | 0.6522   |
| 0.3333        | 13.0102 | 910  | 1.4962          | 0.6087   |
| 0.1593        | 14.0102 | 975  | 1.1972          | 0.6957   |
| 0.1296        | 15.0102 | 1040 | 1.1893          | 0.7826   |
| 0.3097        | 16.0102 | 1105 | 1.5245          | 0.7391   |
| 0.1145        | 17.0102 | 1170 | 1.2979          | 0.7826   |
| 0.2288        | 18.0102 | 1235 | 1.7658          | 0.6957   |
| 0.0217        | 19.0102 | 1300 | 2.6377          | 0.6087   |
| 0.1368        | 20.0102 | 1365 | 1.6947          | 0.6957   |
| 0.1717        | 21.0102 | 1430 | 1.8905          | 0.6522   |
| 0.0014        | 22.0102 | 1495 | 2.1503          | 0.6522   |
| 0.012         | 23.0102 | 1560 | 2.0506          | 0.6522   |
| 0.0007        | 24.0102 | 1625 | 2.3373          | 0.6522   |
| 0.0001        | 25.0102 | 1690 | 1.6162          | 0.7391   |
| 0.0002        | 26.0102 | 1755 | 2.7662          | 0.6087   |
| 0.104         | 27.0102 | 1820 | 1.5637          | 0.7826   |
| 0.1848        | 28.0102 | 1885 | 3.6887          | 0.5217   |
| 0.0015        | 29.0102 | 1950 | 1.7133          | 0.6957   |
| 0.0001        | 30.0102 | 2015 | 2.1864          | 0.7391   |
| 0.0008        | 31.0102 | 2080 | 1.9452          | 0.7391   |
| 0.0002        | 32.0102 | 2145 | 1.7982          | 0.7391   |
| 0.0001        | 33.0102 | 2210 | 2.3272          | 0.6957   |
| 0.0072        | 34.0102 | 2275 | 2.5865          | 0.6957   |
| 0.275         | 35.0102 | 2340 | 4.0065          | 0.5652   |
| 0.0004        | 36.0102 | 2405 | 1.4350          | 0.7826   |
| 0.0001        | 37.0102 | 2470 | 1.8396          | 0.7826   |
| 0.1562        | 38.0102 | 2535 | 2.6788          | 0.6522   |
| 0.0001        | 39.0102 | 2600 | 2.0010          | 0.6957   |
| 0.0001        | 40.0102 | 2665 | 2.4220          | 0.6522   |
| 0.1117        | 41.0102 | 2730 | 2.3290          | 0.6957   |
| 0.0001        | 42.0102 | 2795 | 3.1235          | 0.5652   |
| 0.0001        | 43.0102 | 2860 | 2.9064          | 0.6087   |
| 0.0003        | 44.0102 | 2925 | 3.1359          | 0.6087   |
| 0.0007        | 45.0102 | 2990 | 3.1225          | 0.6087   |
| 0.0031        | 46.0102 | 3055 | 2.9252          | 0.6087   |
| 0.0           | 47.0102 | 3120 | 3.3919          | 0.5652   |
| 0.0003        | 48.0102 | 3185 | 2.8240          | 0.6957   |
| 0.0014        | 49.0102 | 3250 | 2.4431          | 0.5652   |
| 0.0001        | 50.0102 | 3315 | 2.2488          | 0.6957   |
| 0.0           | 51.0102 | 3380 | 2.6169          | 0.6087   |
| 0.0           | 52.0102 | 3445 | 2.4118          | 0.7391   |
| 0.0002        | 53.0102 | 3510 | 2.4928          | 0.5652   |
| 0.0001        | 54.0102 | 3575 | 3.6149          | 0.5652   |
| 0.0           | 55.0102 | 3640 | 3.2978          | 0.5652   |
| 0.0           | 56.0102 | 3705 | 2.9060          | 0.5217   |
| 0.1108        | 57.0102 | 3770 | 3.0361          | 0.6087   |
| 0.0           | 58.0102 | 3835 | 3.3929          | 0.6087   |
| 0.0           | 59.0102 | 3900 | 3.5174          | 0.5652   |
| 0.0007        | 60.0102 | 3965 | 2.1117          | 0.7391   |
| 0.0           | 61.0102 | 4030 | 3.5274          | 0.6087   |
| 0.0           | 62.0102 | 4095 | 3.5149          | 0.6087   |
| 0.0           | 63.0102 | 4160 | 3.4865          | 0.6087   |
| 0.0           | 64.0102 | 4225 | 3.2318          | 0.6087   |
| 0.0           | 65.0102 | 4290 | 3.1844          | 0.6087   |
| 0.0           | 66.0102 | 4355 | 3.2181          | 0.6087   |
| 0.0           | 67.0102 | 4420 | 3.2936          | 0.6087   |
| 0.0           | 68.0102 | 4485 | 3.3043          | 0.6087   |
| 0.0           | 69.0102 | 4550 | 3.1360          | 0.6522   |
| 0.0186        | 70.0102 | 4615 | 2.3659          | 0.7391   |
| 0.0           | 71.0102 | 4680 | 2.5226          | 0.7391   |
| 0.0           | 72.0102 | 4745 | 2.7737          | 0.6522   |
| 0.0           | 73.0102 | 4810 | 2.6730          | 0.6957   |
| 0.0           | 74.0102 | 4875 | 2.7865          | 0.6957   |
| 0.0           | 75.0102 | 4940 | 2.7922          | 0.6957   |
| 0.0           | 76.0102 | 5005 | 3.0552          | 0.6087   |
| 0.0           | 77.0102 | 5070 | 2.4933          | 0.7391   |
| 0.0044        | 78.0102 | 5135 | 2.1811          | 0.7391   |
| 0.0           | 79.0102 | 5200 | 1.9051          | 0.7826   |
| 0.0           | 80.0102 | 5265 | 1.8407          | 0.8261   |
| 0.0           | 81.0102 | 5330 | 2.1967          | 0.7826   |
| 0.0           | 82.0102 | 5395 | 2.3231          | 0.6957   |
| 0.0           | 83.0102 | 5460 | 2.3425          | 0.6957   |
| 0.0           | 84.0102 | 5525 | 2.8403          | 0.5652   |
| 0.0           | 85.0102 | 5590 | 2.3424          | 0.6957   |
| 0.0           | 86.0102 | 5655 | 2.4246          | 0.6957   |
| 0.0           | 87.0102 | 5720 | 2.4289          | 0.6957   |
| 0.0           | 88.0102 | 5785 | 2.4310          | 0.6957   |
| 0.0           | 89.0102 | 5850 | 2.4361          | 0.6957   |
| 0.0           | 90.0102 | 5915 | 2.3667          | 0.6957   |
| 0.0           | 91.0102 | 5980 | 2.3627          | 0.6957   |
| 0.0           | 92.0102 | 6045 | 2.3715          | 0.6957   |
| 0.0           | 93.0102 | 6110 | 2.3773          | 0.7391   |
| 0.0           | 94.0102 | 6175 | 2.4264          | 0.7391   |
| 0.0           | 95.0102 | 6240 | 2.4393          | 0.7391   |
| 0.0           | 96.0102 | 6305 | 2.4449          | 0.7391   |
| 0.0           | 97.0102 | 6370 | 2.4451          | 0.7391   |
| 0.0           | 98.0047 | 6400 | 2.4451          | 0.7391   |
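
Note that the headline evaluation numbers at the top of this card (loss 1.4312, accuracy 0.8267) do not match the final training step above (loss 2.4451, accuracy 0.7391), which suggests they come from a selected checkpoint or a separate evaluation run. As an illustrative sketch only (the card does not state the selection criterion), a log like the one above could be scanned for the best checkpoint, preferring higher accuracy and breaking ties on lower loss:

```python
def best_checkpoint(rows):
    """rows: iterable of (step, val_loss, accuracy) tuples.
    Picks the highest accuracy; lower validation loss breaks ties."""
    return max(rows, key=lambda r: (r[2], -r[1]))

# A few rows taken from the log above:
log = [
    (325, 0.6203, 0.6957),
    (585, 0.6013, 0.7826),
    (5265, 1.8407, 0.8261),
    (6400, 2.4451, 0.7391),
]
```

On these rows, `best_checkpoint(log)` selects step 5265, the peak-accuracy checkpoint in the table.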

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0
Checkpoint

  • Format: Safetensors
  • Model size: 304M params
  • Tensor type: F32
