
MAE-CT-M1N0-M12_v8_split5_v3

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set (these match the step-4760 / epoch-67 row of the training results table below, the best validation accuracy of the run):

  • Loss: 1.1517
  • Accuracy: 0.8701
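
A minimal inference sketch, assuming the checkpoint is published under the repo id beingbatman/MAE-CT-M1N0-M12_v8_split5_v3 and keeps the base model's 16-frame input format; the random clip below is a stand-in for real video frames, and the image processor is assumed to be saved alongside the weights (otherwise load it from the base model MCG-NJU/videomae-large-finetuned-kinetics).

```python
import numpy as np
import torch
from transformers import AutoImageProcessor, VideoMAEForVideoClassification

# Assumed repo id for this card; adjust if the checkpoint lives elsewhere.
repo_id = "beingbatman/MAE-CT-M1N0-M12_v8_split5_v3"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = VideoMAEForVideoClassification.from_pretrained(repo_id)
model.eval()

# A clip of 16 RGB frames, each (channels, height, width); random stand-in data.
video = list(np.random.randint(0, 256, (16, 3, 224, 224)))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```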

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 10350
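
A minimal sketch of how these settings map onto transformers.TrainingArguments; the output_dir name, the per-epoch evaluation cadence, and the rest of the Trainer wiring (model, datasets, metrics) are assumptions not spelled out in this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="MAE-CT-M1N0-M12_v8_split5_v3",  # assumed output directory name
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,        # 10% of 10350 steps, i.e. roughly 1035 warmup steps
    max_steps=10350,
    eval_strategy="epoch",   # assumption: the table below logs one evaluation per ~70-step epoch
)
```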

Training results

Training Loss Epoch Step Validation Loss Accuracy
0.685 0.0068 70 0.6757 0.7792
0.5601 1.0068 140 0.6218 0.6234
0.6632 2.0068 210 0.6157 0.6234
0.5153 3.0068 280 0.5660 0.6364
0.5008 4.0068 350 0.5238 0.7662
0.4879 5.0068 420 0.5012 0.7792
0.3636 6.0068 490 0.5640 0.7013
0.7238 7.0068 560 0.5756 0.7013
0.3339 8.0068 630 0.9895 0.6883
0.4152 9.0068 700 0.5031 0.8182
0.3126 10.0068 770 0.5350 0.7273
0.4479 11.0068 840 0.4278 0.8312
0.5548 12.0068 910 0.6865 0.7013
0.1509 13.0068 980 0.8144 0.7143
0.4038 14.0068 1050 0.6039 0.7922
0.2748 15.0068 1120 1.1834 0.7662
0.4552 16.0068 1190 0.7594 0.7532
0.5584 17.0068 1260 0.9481 0.7922
0.0919 18.0068 1330 1.0080 0.7662
0.2309 19.0068 1400 0.8453 0.8182
0.191 20.0068 1470 1.0695 0.7662
0.2013 21.0068 1540 1.4657 0.7403
0.6645 22.0068 1610 1.0602 0.8052
0.1083 23.0068 1680 1.2148 0.7532
0.0885 24.0068 1750 1.2008 0.7792
0.0015 25.0068 1820 1.2987 0.7532
0.2372 26.0068 1890 1.6225 0.7532
0.001 27.0068 1960 1.1689 0.7662
0.0006 28.0068 2030 1.3817 0.7532
0.0002 29.0068 2100 1.7143 0.7273
0.0012 30.0068 2170 1.8865 0.7273
0.153 31.0068 2240 2.4574 0.6623
0.1308 32.0068 2310 1.1800 0.8052
0.0002 33.0068 2380 1.2817 0.7792
0.0001 34.0068 2450 1.2770 0.7792
0.0001 35.0068 2520 1.2779 0.7922
0.0001 36.0068 2590 1.3971 0.7792
0.0001 37.0068 2660 1.1263 0.8182
0.0001 38.0068 2730 1.1233 0.8182
0.0675 39.0068 2800 1.4885 0.7662
0.0002 40.0068 2870 1.8406 0.7013
0.0001 41.0068 2940 1.9085 0.7532
0.0005 42.0068 3010 1.9380 0.7143
0.1589 43.0068 3080 0.9674 0.8312
0.0001 44.0068 3150 1.5574 0.7403
0.0353 45.0068 3220 1.1688 0.8312
0.0001 46.0068 3290 1.7684 0.7143
0.0002 47.0068 3360 1.3363 0.7792
0.1237 48.0068 3430 1.2230 0.7922
0.0001 49.0068 3500 1.4665 0.7792
0.0 50.0068 3570 1.5472 0.7662
0.1479 51.0068 3640 2.3369 0.7273
0.0001 52.0068 3710 2.2529 0.6753
0.1081 53.0068 3780 1.4745 0.7273
0.0002 54.0068 3850 1.5813 0.7403
0.0119 55.0068 3920 1.6007 0.7662
0.1478 56.0068 3990 2.3310 0.7143
0.0001 57.0068 4060 1.4788 0.8052
0.0001 58.0068 4130 1.1851 0.8442
0.0001 59.0068 4200 1.1920 0.8571
0.0904 60.0068 4270 1.1858 0.8312
0.0001 61.0068 4340 1.4534 0.7662
0.0017 62.0068 4410 1.6716 0.7792
0.0001 63.0068 4480 2.2017 0.6883
0.3407 64.0068 4550 1.2424 0.8052
0.0001 65.0068 4620 1.5786 0.7792
0.0002 66.0068 4690 1.3379 0.8182
0.0005 67.0068 4760 1.1517 0.8701
0.0 68.0068 4830 1.5294 0.7792
0.0 69.0068 4900 2.4381 0.6883
0.0032 70.0068 4970 1.7952 0.7532
0.0 71.0068 5040 3.0253 0.6753
0.214 72.0068 5110 1.9327 0.7143
0.0 73.0068 5180 2.0236 0.7532
0.0 74.0068 5250 1.9076 0.7662
0.0 75.0068 5320 1.7070 0.8052
0.0003 76.0068 5390 1.8621 0.7532
0.0 77.0068 5460 1.8847 0.7662
0.0 78.0068 5530 1.8880 0.7662
0.0001 79.0068 5600 1.8182 0.7792
0.0 80.0068 5670 1.7965 0.8052
0.0001 81.0068 5740 3.0536 0.6753
0.0005 82.0068 5810 1.5427 0.7922
0.0006 83.0068 5880 1.8892 0.7403
0.0001 84.0068 5950 1.9648 0.7403
0.0 85.0068 6020 1.7625 0.7532
0.1655 86.0068 6090 1.6751 0.7662
0.0 87.0068 6160 1.8559 0.7403
0.0 88.0068 6230 1.8886 0.7532
0.0 89.0068 6300 1.8957 0.7532
0.0 90.0068 6370 1.8181 0.7662
0.0 91.0068 6440 1.8299 0.7532
0.0 92.0068 6510 1.5186 0.8182
0.0393 93.0068 6580 1.9234 0.7792
0.0 94.0068 6650 2.1199 0.7273
0.0 95.0068 6720 2.1309 0.7403
0.0009 96.0068 6790 1.9311 0.7532
0.0001 97.0068 6860 1.7858 0.7792
0.0894 98.0068 6930 1.5577 0.8052
0.0 99.0068 7000 1.8138 0.7792
0.0 100.0068 7070 2.0068 0.7532
0.0163 101.0068 7140 1.8340 0.7922
0.0 102.0068 7210 1.3226 0.8312
0.0 103.0068 7280 2.4607 0.7532
0.0683 104.0068 7350 1.7550 0.7922
0.0 105.0068 7420 1.4900 0.8312
0.0 106.0068 7490 1.5684 0.7662
0.0 107.0068 7560 1.7333 0.8052
0.0 108.0068 7630 1.4233 0.7922
0.0001 109.0068 7700 1.7542 0.7792
0.0 110.0068 7770 1.4554 0.8052
0.0 111.0068 7840 1.3538 0.8571
0.0 112.0068 7910 1.4165 0.8571
0.0 113.0068 7980 1.4229 0.8571
0.0 114.0068 8050 1.4191 0.8571
0.0 115.0068 8120 1.4364 0.8571
0.0 116.0068 8190 1.4575 0.8312
0.0 117.0068 8260 1.4640 0.8312
0.0 118.0068 8330 1.4807 0.8312
0.0 119.0068 8400 1.5030 0.8312
0.0 120.0068 8470 1.5188 0.8312
0.0 121.0068 8540 1.5642 0.8182
0.0 122.0068 8610 1.5663 0.8182
0.0 123.0068 8680 1.5686 0.8182
0.0 124.0068 8750 1.4284 0.8571
0.0 125.0068 8820 1.4352 0.8571
0.0 126.0068 8890 1.4392 0.8571
0.0 127.0068 8960 1.5200 0.8442
0.0 128.0068 9030 1.5244 0.8442
0.0 129.0068 9100 1.5282 0.8442
0.0 130.0068 9170 1.5338 0.8442
0.0 131.0068 9240 1.5489 0.8442
0.0 132.0068 9310 1.5530 0.8442
0.0 133.0068 9380 1.5586 0.8442
0.0 134.0068 9450 1.5642 0.8442
0.0 135.0068 9520 1.5596 0.8442
0.0 136.0068 9590 1.5681 0.8442
0.0 137.0068 9660 1.4498 0.8182
0.0 138.0068 9730 1.6159 0.8312
0.0 139.0068 9800 1.6950 0.8182
0.0 140.0068 9870 1.6978 0.8182
0.0 141.0068 9940 1.6985 0.8182
0.0 142.0068 10010 1.6995 0.8182
0.0 143.0068 10080 1.7037 0.8052
0.0 144.0068 10150 1.7056 0.8052
0.0 145.0068 10220 1.7054 0.8052
0.0 146.0068 10290 1.7054 0.8052
0.0 147.0058 10350 1.7041 0.8052
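
The accuracy column above is presumably produced by a compute_metrics callback passed to the Trainer; a minimal sketch under that assumption, using the evaluate library's accuracy metric.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair produced by Trainer.evaluate().
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)

# Would be wired in as Trainer(..., compute_metrics=compute_metrics).
```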

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0