MAE-CT-M1N0-M12_v8_split5

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7761
  • Accuracy: 0.7403

Model description

More information needed

Intended uses & limitations

More information needed
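
Although usage details are not documented here, a minimal inference sketch can illustrate the expected input format. This is an assumption-laden example, not a documented recipe: it assumes the checkpoint id beingbatman/MAE-CT-M1N0-M12_v8_split5 and the standard VideoMAE input convention of 16 RGB frames at 224×224, as used by the MCG-NJU/videomae-large-finetuned-kinetics base model.

```python
import numpy as np

# Hypothetical inference sketch. Assumptions (not stated in this card):
# the checkpoint id "beingbatman/MAE-CT-M1N0-M12_v8_split5" and the
# standard VideoMAE input format of 16 RGB frames at 224x224.
NUM_FRAMES, HEIGHT, WIDTH = 16, 224, 224

# A video clip as a list of HxWx3 uint8 frames, the layout the
# VideoMAE image processor accepts.
video = [
    np.random.randint(0, 256, (HEIGHT, WIDTH, 3), dtype=np.uint8)
    for _ in range((NUM_FRAMES))
]

# Loading and running the fine-tuned model would look like this
# (requires the transformers and torch packages plus a weight download,
# so it is left commented out here):
# from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification
# processor = VideoMAEImageProcessor.from_pretrained(
#     "beingbatman/MAE-CT-M1N0-M12_v8_split5")
# model = VideoMAEForVideoClassification.from_pretrained(
#     "beingbatman/MAE-CT-M1N0-M12_v8_split5")
# inputs = processor(video, return_tensors="pt")
# predicted_class = model(**inputs).logits.argmax(-1).item()
```
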

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6200
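
The hyperparameters above correspond roughly to the following transformers TrainingArguments. This is a sketch: output_dir is a placeholder, and any evaluation, logging, or checkpointing options the original run used are not documented in this card and therefore omitted.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters as TrainingArguments.
# output_dir is a placeholder, not the original run's directory.
args = TrainingArguments(
    output_dir="MAE-CT-M1N0-M12_v8_split5",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=6200,
)
```
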

Training results

Training Loss Epoch Step Validation Loss Accuracy
0.6772 0.0102 63 0.6890 0.5185
0.7281 1.0102 126 0.7085 0.5185
0.7105 2.0102 189 0.7149 0.5185
0.5807 3.0102 252 0.7501 0.5185
0.4887 4.0102 315 0.9063 0.5185
0.7261 5.0102 378 0.7094 0.5185
0.6636 6.0102 441 0.6775 0.5185
0.6699 7.0102 504 0.6032 0.6296
0.3001 8.0102 567 1.2154 0.5556
0.2793 9.0102 630 1.1778 0.5556
0.5364 10.0102 693 1.2644 0.6296
0.2033 11.0102 756 0.6939 0.6667
0.4315 12.0102 819 1.1141 0.7037
0.4352 13.0102 882 1.3884 0.5556
0.6593 14.0102 945 1.6278 0.6296
0.6536 15.0102 1008 1.4868 0.6296
0.0819 16.0102 1071 2.1687 0.6667
0.0296 17.0102 1134 2.1161 0.5926
0.347 18.0102 1197 1.6357 0.6667
0.1476 19.0102 1260 2.8088 0.5556
0.0163 20.0102 1323 1.9632 0.5556
0.5644 21.0102 1386 2.1662 0.5185
0.1428 22.0102 1449 3.2763 0.5185
0.0068 23.0102 1512 3.2954 0.5185
0.0002 24.0102 1575 2.7854 0.5556
0.0048 25.0102 1638 3.1844 0.5185
0.0002 26.0102 1701 3.5731 0.4815
0.149 27.0102 1764 3.5418 0.5926
0.0007 28.0102 1827 3.6389 0.5185
0.0005 29.0102 1890 3.1077 0.5926
0.001 30.0102 1953 3.5467 0.5556
0.0003 31.0102 2016 3.7925 0.5556
0.0001 32.0102 2079 3.5266 0.5185
0.0009 33.0102 2142 3.4863 0.5926
0.0001 34.0102 2205 3.7751 0.5185
0.0001 35.0102 2268 3.5210 0.5185
0.2023 36.0102 2331 3.3674 0.5185
0.0001 37.0102 2394 3.9810 0.5185
0.0001 38.0102 2457 3.9022 0.5556
0.0001 39.0102 2520 3.9362 0.5556
0.0001 40.0102 2583 4.3247 0.5556
0.0001 41.0102 2646 4.3731 0.5185
0.0001 42.0102 2709 4.4895 0.5185
0.0002 43.0102 2772 3.7851 0.5556
0.0001 44.0102 2835 3.2867 0.5185
0.1972 45.0102 2898 4.1406 0.5556
0.0001 46.0102 2961 4.5674 0.4815
0.0001 47.0102 3024 3.8698 0.5556
0.0001 48.0102 3087 4.0334 0.4815
0.0001 49.0102 3150 4.1438 0.5185
0.1451 50.0102 3213 3.8312 0.5556
0.0001 51.0102 3276 2.6480 0.6667
0.0001 52.0102 3339 3.6914 0.4815
0.0012 53.0102 3402 3.7909 0.6296
0.0002 54.0102 3465 3.7010 0.5185
0.0001 55.0102 3528 3.7795 0.4815
0.0001 56.0102 3591 3.8197 0.5556
0.055 57.0102 3654 3.8996 0.5556
0.0001 58.0102 3717 4.1382 0.4815
0.0 59.0102 3780 4.3767 0.4444
0.0 60.0102 3843 4.4118 0.4444
0.0 61.0102 3906 4.5173 0.4444
0.0 62.0102 3969 4.5574 0.4444
0.0 63.0102 4032 4.5809 0.4444
0.0 64.0102 4095 4.6070 0.4444
0.0119 65.0102 4158 4.2245 0.5185
0.0001 66.0102 4221 4.0994 0.5185
0.0 67.0102 4284 4.0198 0.4815
0.0 68.0102 4347 4.0467 0.5185
0.0 69.0102 4410 4.0546 0.5185
0.0001 70.0102 4473 3.7226 0.5926
0.0001 71.0102 4536 3.6231 0.4815
0.0 72.0102 4599 3.5370 0.5926
0.0 73.0102 4662 3.9063 0.5185
0.0308 74.0102 4725 3.2158 0.5926
0.0001 75.0102 4788 3.7864 0.5556
0.0 76.0102 4851 3.7783 0.5556
0.0002 77.0102 4914 3.5353 0.6667
0.0 78.0102 4977 3.5016 0.6296
0.0 79.0102 5040 3.8000 0.5185
0.0 80.0102 5103 3.9970 0.5926
0.0 81.0102 5166 4.0152 0.5556
0.0 82.0102 5229 3.6760 0.6296
0.0 83.0102 5292 4.0728 0.5556
0.0 84.0102 5355 4.0964 0.5556
0.0 85.0102 5418 3.7875 0.5185
0.0 86.0102 5481 3.9301 0.5185
0.0 87.0102 5544 4.2208 0.5185
0.0 88.0102 5607 4.3575 0.5185
0.0 89.0102 5670 4.4015 0.5185
0.0 90.0102 5733 4.4241 0.5185
0.0 91.0102 5796 4.4360 0.5185
0.0 92.0102 5859 4.4440 0.5185
0.0 93.0102 5922 4.4219 0.5185
0.0 94.0102 5985 4.4249 0.5185
0.0 95.0102 6048 4.4252 0.5185
0.0 96.0102 6111 4.3802 0.5556
0.0 97.0102 6174 4.3659 0.5556
0.0 98.0042 6200 4.3659 0.5556

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0

Model size: 304M parameters (Safetensors, F32)
