---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-large-finetuned-kinetics
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: MAE-CT-M1N0-M12_v8_split1_v3
    results: []
---

# MAE-CT-M1N0-M12_v8_split1_v3

This model is a fine-tuned version of [MCG-NJU/videomae-large-finetuned-kinetics](https://huggingface.co/MCG-NJU/videomae-large-finetuned-kinetics) on an unknown dataset. It achieves the following results on the evaluation set (a loading sketch follows the list):

- Loss: 0.4942
- Accuracy: 0.7838
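Since the card provides no usage details, here is a minimal loading-and-inference sketch. The Hub id `beingbatman/MAE-CT-M1N0-M12_v8_split1_v3` is assumed from the repository name, and the random 16-frame clip is a stand-in for real decoded video:

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Hub id assumed from the repository name; adjust if the checkpoint lives elsewhere.
ckpt = "beingbatman/MAE-CT-M1N0-M12_v8_split1_v3"

processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)
model.eval()

# Dummy clip: 16 RGB frames of 224x224, VideoMAE's default input shape.
# Replace with real frames sampled from a video in practice.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```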

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 10500
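As a rough reproduction aid, the settings above map onto `TrainingArguments` as sketched below; `output_dir` and `report_to` are illustrative placeholders, not taken from the original run:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="MAE-CT-M1N0-M12_v8_split1_v3",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",         # AdamW with betas=(0.9, 0.999), eps=1e-8 (the defaults)
    lr_scheduler_type="linear",
    warmup_ratio=0.1,            # 10% of 10500 steps -> 1050 warmup steps
    max_steps=10500,
    report_to="none",
)
```

Passed to a `Trainer` together with the base model and the (unspecified) datasets, these arguments reproduce the schedule behind the results table below.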

### Training results

| Training Loss | Epoch    | Step  | Validation Loss | Accuracy |
|:-------------:|:--------:|:-----:|:---------------:|:--------:|
| 0.6862        | 0.0068   | 71    | 0.6571          | 0.6622   |
| 0.665         | 1.0068   | 142   | 0.6370          | 0.6622   |
| 0.7033        | 2.0068   | 213   | 0.6254          | 0.6622   |
| 0.6524        | 3.0068   | 284   | 0.6091          | 0.6757   |
| 0.5611        | 4.0068   | 355   | 0.5565          | 0.6622   |
| 0.4274        | 5.0068   | 426   | 0.5154          | 0.7162   |
| 0.4797        | 6.0068   | 497   | 0.5644          | 0.6757   |
| 0.3758        | 7.0068   | 568   | 0.4942          | 0.7838   |
| 0.4243        | 8.0068   | 639   | 0.5252          | 0.7568   |
| 0.5133        | 9.0068   | 710   | 0.6873          | 0.6757   |
| 0.3709        | 10.0068  | 781   | 0.6555          | 0.7568   |
| 0.2793        | 11.0068  | 852   | 0.7140          | 0.7568   |
| 0.6153        | 12.0068  | 923   | 1.3006          | 0.6892   |
| 0.7185        | 13.0068  | 994   | 1.6663          | 0.6892   |
| 0.4609        | 14.0068  | 1065  | 1.3522          | 0.7162   |
| 0.236         | 15.0068  | 1136  | 1.2228          | 0.7297   |
| 0.0519        | 16.0068  | 1207  | 1.0973          | 0.7568   |
| 0.0026        | 17.0068  | 1278  | 1.4476          | 0.7432   |
| 0.357         | 18.0068  | 1349  | 1.4487          | 0.7432   |
| 0.4262        | 19.0068  | 1420  | 1.1604          | 0.7838   |
| 0.0021        | 20.0068  | 1491  | 1.7720          | 0.7027   |
| 0.0132        | 21.0068  | 1562  | 1.7388          | 0.7297   |
| 0.1451        | 22.0068  | 1633  | 1.7954          | 0.6892   |
| 0.0099        | 23.0068  | 1704  | 2.1619          | 0.7162   |
| 0.0001        | 24.0068  | 1775  | 1.6524          | 0.7297   |
| 0.0005        | 25.0068  | 1846  | 1.8499          | 0.7162   |
| 0.0388        | 26.0068  | 1917  | 1.8792          | 0.7027   |
| 0.1798        | 27.0068  | 1988  | 1.2951          | 0.7568   |
| 0.2354        | 28.0068  | 2059  | 1.5408          | 0.7297   |
| 0.0024        | 29.0068  | 2130  | 1.9224          | 0.7162   |
| 0.0018        | 30.0068  | 2201  | 2.5244          | 0.6486   |
| 0.1072        | 31.0068  | 2272  | 2.8444          | 0.6486   |
| 0.0664        | 32.0068  | 2343  | 1.8277          | 0.7297   |
| 0.0122        | 33.0068  | 2414  | 2.1148          | 0.7297   |
| 0.1118        | 34.0068  | 2485  | 1.5536          | 0.7703   |
| 0.1987        | 35.0068  | 2556  | 2.2923          | 0.7027   |
| 0.0012        | 36.0068  | 2627  | 2.6785          | 0.6622   |
| 0.0027        | 37.0068  | 2698  | 2.2400          | 0.6757   |
| 0.0002        | 38.0068  | 2769  | 2.2459          | 0.7162   |
| 0.0099        | 39.0068  | 2840  | 2.3601          | 0.6622   |
| 0.0071        | 40.0068  | 2911  | 2.0561          | 0.7297   |
| 0.0086        | 41.0068  | 2982  | 2.1898          | 0.7027   |
| 0.0131        | 42.0068  | 3053  | 2.6086          | 0.6757   |
| 0.0002        | 43.0068  | 3124  | 2.1400          | 0.6892   |
| 0.0001        | 44.0068  | 3195  | 2.2608          | 0.7027   |
| 0.0549        | 45.0068  | 3266  | 2.0129          | 0.7432   |
| 0.0           | 46.0068  | 3337  | 2.0018          | 0.7297   |
| 0.0001        | 47.0068  | 3408  | 1.7209          | 0.7838   |
| 0.31          | 48.0068  | 3479  | 2.1962          | 0.7027   |
| 0.0001        | 49.0068  | 3550  | 1.6650          | 0.7568   |
| 0.0           | 50.0068  | 3621  | 1.8843          | 0.7568   |
| 0.0           | 51.0068  | 3692  | 1.9398          | 0.7703   |
| 0.0           | 52.0068  | 3763  | 1.7851          | 0.7568   |
| 0.0001        | 53.0068  | 3834  | 1.9574          | 0.7162   |
| 0.0002        | 54.0068  | 3905  | 2.6200          | 0.6351   |
| 0.0           | 55.0068  | 3976  | 2.2333          | 0.7027   |
| 0.0001        | 56.0068  | 4047  | 2.7799          | 0.6757   |
| 0.0001        | 57.0068  | 4118  | 2.1935          | 0.7027   |
| 0.0188        | 58.0068  | 4189  | 2.2272          | 0.7162   |
| 0.1013        | 59.0068  | 4260  | 2.3607          | 0.7027   |
| 0.0001        | 60.0068  | 4331  | 2.1223          | 0.7432   |
| 0.0026        | 61.0068  | 4402  | 1.9220          | 0.7568   |
| 0.193         | 62.0068  | 4473  | 2.2254          | 0.7027   |
| 0.0002        | 63.0068  | 4544  | 2.2682          | 0.6622   |
| 0.0           | 64.0068  | 4615  | 2.6857          | 0.6892   |
| 0.0           | 65.0068  | 4686  | 2.3791          | 0.7297   |
| 0.0076        | 66.0068  | 4757  | 2.8393          | 0.6757   |
| 0.0043        | 67.0068  | 4828  | 1.9305          | 0.7162   |
| 0.0003        | 68.0068  | 4899  | 1.9944          | 0.7297   |
| 0.0           | 69.0068  | 4970  | 2.5842          | 0.7162   |
| 0.0001        | 70.0068  | 5041  | 2.6503          | 0.6622   |
| 0.0           | 71.0068  | 5112  | 2.7254          | 0.6757   |
| 0.0002        | 72.0068  | 5183  | 3.0429          | 0.6622   |
| 0.0           | 73.0068  | 5254  | 2.5716          | 0.7027   |
| 0.0671        | 74.0068  | 5325  | 2.5144          | 0.7027   |
| 0.0           | 75.0068  | 5396  | 2.8938          | 0.6622   |
| 0.0           | 76.0068  | 5467  | 2.8503          | 0.6622   |
| 0.0           | 77.0068  | 5538  | 2.8861          | 0.6622   |
| 0.0           | 78.0068  | 5609  | 2.8524          | 0.6892   |
| 0.0           | 79.0068  | 5680  | 2.7962          | 0.6757   |
| 0.0           | 80.0068  | 5751  | 2.8640          | 0.6622   |
| 0.0           | 81.0068  | 5822  | 2.8446          | 0.6757   |
| 0.0           | 82.0068  | 5893  | 2.6401          | 0.6892   |
| 0.0007        | 83.0068  | 5964  | 2.3987          | 0.7432   |
| 0.0           | 84.0068  | 6035  | 2.3642          | 0.7162   |
| 0.0           | 85.0068  | 6106  | 2.4710          | 0.6757   |
| 0.0004        | 86.0068  | 6177  | 3.0323          | 0.6486   |
| 0.0           | 87.0068  | 6248  | 3.0862          | 0.6351   |
| 0.2299        | 88.0068  | 6319  | 2.0283          | 0.7703   |
| 0.0           | 89.0068  | 6390  | 2.3752          | 0.6892   |
| 0.1842        | 90.0068  | 6461  | 2.2107          | 0.7568   |
| 0.0002        | 91.0068  | 6532  | 3.1361          | 0.6622   |
| 0.0           | 92.0068  | 6603  | 2.7366          | 0.7027   |
| 0.0           | 93.0068  | 6674  | 2.6850          | 0.6892   |
| 0.0           | 94.0068  | 6745  | 2.6965          | 0.6892   |
| 0.1894        | 95.0068  | 6816  | 2.5070          | 0.7162   |
| 0.0           | 96.0068  | 6887  | 2.5327          | 0.7162   |
| 0.0           | 97.0068  | 6958  | 2.8845          | 0.6892   |
| 0.0           | 98.0068  | 7029  | 2.0030          | 0.7838   |
| 0.1439        | 99.0068  | 7100  | 2.7892          | 0.6892   |
| 0.0           | 100.0068 | 7171  | 2.4622          | 0.7162   |
| 0.0016        | 101.0068 | 7242  | 2.4540          | 0.7297   |
| 0.0006        | 102.0068 | 7313  | 2.4853          | 0.7162   |
| 0.0           | 103.0068 | 7384  | 2.5101          | 0.7297   |
| 0.0           | 104.0068 | 7455  | 2.5136          | 0.7162   |
| 0.0           | 105.0068 | 7526  | 2.5028          | 0.7297   |
| 0.0039        | 106.0068 | 7597  | 2.6882          | 0.7162   |
| 0.0           | 107.0068 | 7668  | 2.8377          | 0.7162   |
| 0.0           | 108.0068 | 7739  | 2.8495          | 0.7162   |
| 0.0073        | 109.0068 | 7810  | 2.6625          | 0.7297   |
| 0.0           | 110.0068 | 7881  | 2.7063          | 0.7162   |
| 0.0           | 111.0068 | 7952  | 2.3949          | 0.7568   |
| 0.0           | 112.0068 | 8023  | 2.5956          | 0.7432   |
| 0.0001        | 113.0068 | 8094  | 2.9212          | 0.7027   |
| 0.0           | 114.0068 | 8165  | 2.8216          | 0.6892   |
| 0.0           | 115.0068 | 8236  | 2.8409          | 0.6892   |
| 0.0           | 116.0068 | 8307  | 2.8546          | 0.6892   |
| 0.0           | 117.0068 | 8378  | 2.8172          | 0.6892   |
| 0.0           | 118.0068 | 8449  | 2.4546          | 0.7432   |
| 0.0           | 119.0068 | 8520  | 2.3815          | 0.7568   |
| 0.0           | 120.0068 | 8591  | 2.4006          | 0.7432   |
| 0.0           | 121.0068 | 8662  | 2.4198          | 0.7432   |
| 0.0           | 122.0068 | 8733  | 2.4389          | 0.7432   |
| 0.0           | 123.0068 | 8804  | 2.4763          | 0.7432   |
| 0.0           | 124.0068 | 8875  | 2.4947          | 0.7432   |
| 0.0           | 125.0068 | 8946  | 2.5126          | 0.7297   |
| 0.0           | 126.0068 | 9017  | 2.5314          | 0.7297   |
| 0.0           | 127.0068 | 9088  | 2.5429          | 0.7297   |
| 0.0           | 128.0068 | 9159  | 2.5660          | 0.7297   |
| 0.0           | 129.0068 | 9230  | 2.5828          | 0.7162   |
| 0.0           | 130.0068 | 9301  | 2.5996          | 0.7162   |
| 0.0           | 131.0068 | 9372  | 2.6081          | 0.7162   |
| 0.0           | 132.0068 | 9443  | 2.6265          | 0.7162   |
| 0.0           | 133.0068 | 9514  | 2.6524          | 0.7162   |
| 0.0           | 134.0068 | 9585  | 2.6634          | 0.7162   |
| 0.0           | 135.0068 | 9656  | 2.6925          | 0.7162   |
| 0.0           | 136.0068 | 9727  | 2.7701          | 0.7162   |
| 0.0           | 137.0068 | 9798  | 2.7774          | 0.7027   |
| 0.0           | 138.0068 | 9869  | 2.7756          | 0.7162   |
| 0.0           | 139.0068 | 9940  | 2.7789          | 0.7162   |
| 0.0           | 140.0068 | 10011 | 2.7818          | 0.7162   |
| 0.0           | 141.0068 | 10082 | 2.7164          | 0.7162   |
| 0.0           | 142.0068 | 10153 | 2.9571          | 0.7027   |
| 0.0           | 143.0068 | 10224 | 2.9562          | 0.7027   |
| 0.0           | 144.0068 | 10295 | 2.9538          | 0.7027   |
| 0.0           | 145.0068 | 10366 | 2.9514          | 0.7027   |
| 0.0           | 146.0068 | 10437 | 2.9517          | 0.7027   |
| 0.0           | 147.006  | 10500 | 2.9518          | 0.7027   |

### Framework versions

- Transformers 4.46.2
- Pytorch 2.0.1+cu117
- Datasets 3.0.1
- Tokenizers 0.20.0
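A small sanity check (not part of the original card) to compare a local environment against these pins; mismatches do not necessarily break loading, but exact reproduction of the run assumes them:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions this card reports for the training environment.
expected = {
    "transformers": "4.46.2",
    "torch": "2.0.1+cu117",
    "datasets": "3.0.1",
    "tokenizers": "0.20.0",
}
found = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name in expected:
    print(f"{name}: found {found[name]}, card reports {expected[name]}")
```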