
MAE-CT-M1N0-v11

This model is a fine-tuned version of beingbatman/MAE-CT-M1N0-v11 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.9107
  • Accuracy: 0.7407

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6600
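
With a warmup ratio of 0.1 over 6600 training steps, the learning rate ramps linearly to 1e-05 over the first 660 steps and then decays linearly to zero. A minimal sketch of that schedule in plain Python, assuming the shape used by Transformers' `get_linear_schedule_with_warmup` (the exact training script is not shown here):

```python
def linear_lr_with_warmup(step, base_lr=1e-05, total_steps=6600, warmup_ratio=0.1):
    """Linear warmup from 0 to base_lr, then linear decay to 0 at total_steps."""
    warmup_steps = int(total_steps * warmup_ratio)  # 660 steps with these settings
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Peak learning rate is reached at the end of warmup (step 660),
# and the schedule hits zero at the final step (6600).
print(linear_lr_with_warmup(660))   # 1e-05
print(linear_lr_with_warmup(6600))  # 0.0
```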

Training results

Training Loss Epoch Step Validation Loss Accuracy
0.0957 0.0102 67 1.0947 0.6176
0.2452 1.0102 134 1.1772 0.6765
0.5144 2.0102 201 1.5596 0.6471
0.1315 3.0102 268 1.4470 0.5588
0.0398 4.0102 335 1.5700 0.5882
0.1834 5.0102 402 1.2757 0.7059
0.1818 6.0102 469 1.5364 0.7059
0.0496 7.0102 536 1.6521 0.7353
0.0009 8.0102 603 1.8430 0.6765
0.141 9.0102 670 2.2428 0.6176
0.0338 10.0102 737 2.3844 0.6176
0.0353 11.0102 804 2.0932 0.7059
0.3218 12.0102 871 2.0158 0.6471
0.1485 13.0102 938 1.8430 0.6176
0.004 14.0102 1005 2.1836 0.5882
0.1743 15.0102 1072 2.1859 0.6471
0.0932 16.0102 1139 1.6879 0.7353
0.0039 17.0102 1206 2.0492 0.6471
0.2276 18.0102 1273 2.2761 0.5882
0.5771 19.0102 1340 2.5674 0.6471
0.1686 20.0102 1407 2.2761 0.6471
0.3135 21.0102 1474 2.2901 0.6176
0.0004 22.0102 1541 2.1465 0.7353
0.1221 23.0102 1608 2.2524 0.6471
0.0717 24.0102 1675 2.5812 0.5882
0.0763 25.0102 1742 2.9429 0.6176
0.0001 26.0102 1809 2.3691 0.7059
0.3076 27.0102 1876 2.7487 0.6176
0.2701 28.0102 1943 2.1523 0.6471
0.0018 29.0102 2010 2.9072 0.6176
0.096 30.0102 2077 2.9151 0.5882
0.2654 31.0102 2144 2.8686 0.6471
0.0002 32.0102 2211 2.6399 0.6471
0.0002 33.0102 2278 2.7440 0.6471
0.0113 34.0102 2345 2.5598 0.6471
0.0001 35.0102 2412 2.9623 0.5882
0.1484 36.0102 2479 2.5132 0.6765
0.0001 37.0102 2546 2.7195 0.6176
0.0004 38.0102 2613 1.9358 0.7647
0.0 39.0102 2680 2.6720 0.6471
0.0 40.0102 2747 2.9154 0.6176
0.0 41.0102 2814 2.9176 0.6176
0.0 42.0102 2881 2.9299 0.6176
0.0 43.0102 2948 2.8822 0.5882
0.0007 44.0102 3015 3.0881 0.6176
0.0001 45.0102 3082 3.3474 0.5882
0.0 46.0102 3149 3.2999 0.5882
0.0005 47.0102 3216 3.3930 0.5882
0.0001 48.0102 3283 3.4111 0.5882
0.0245 49.0102 3350 3.1914 0.6176
0.0003 50.0102 3417 3.0377 0.6765
0.0024 51.0102 3484 3.2830 0.5882
0.0001 52.0102 3551 2.5779 0.7059
0.0001 53.0102 3618 3.5160 0.5588
0.0 54.0102 3685 3.4892 0.5882
0.0 55.0102 3752 3.3148 0.5882
0.0 56.0102 3819 3.3751 0.5882
0.2024 57.0102 3886 3.4587 0.6176
0.0009 58.0102 3953 3.2979 0.5882
0.0 59.0102 4020 3.5841 0.5882
0.0001 60.0102 4087 3.4411 0.5882
0.0006 61.0102 4154 3.0952 0.6471
0.0 62.0102 4221 3.2242 0.5882
0.0947 63.0102 4288 3.1971 0.6176
0.0 64.0102 4355 3.3249 0.5882
0.0 65.0102 4422 3.5612 0.5882
0.0 66.0102 4489 3.6243 0.5882
0.0 67.0102 4556 3.3772 0.6176
0.0 68.0102 4623 3.4673 0.6176
0.0 69.0102 4690 3.4033 0.6176
0.0 70.0102 4757 3.2678 0.5882
0.0 71.0102 4824 3.2038 0.6471
0.135 72.0102 4891 2.8182 0.6176
0.0 73.0102 4958 3.1443 0.6765
0.0 74.0102 5025 3.7653 0.5882
0.0 75.0102 5092 3.7835 0.5882
0.0 76.0102 5159 3.7721 0.5882
0.0 77.0102 5226 3.7777 0.5882
0.0 78.0102 5293 3.7851 0.5882
0.0 79.0102 5360 3.7914 0.5882
0.0 80.0102 5427 3.7988 0.5882
0.0 81.0102 5494 3.8051 0.5882
0.0 82.0102 5561 3.8111 0.5882
0.0 83.0102 5628 3.8290 0.5882
0.0 84.0102 5695 3.8341 0.5882
0.0 85.0102 5762 3.8388 0.5882
0.0 86.0102 5829 3.8458 0.5882
0.0 87.0102 5896 3.8484 0.5882
0.0 88.0102 5963 3.8243 0.5882
0.0 89.0102 6030 3.8309 0.5882
0.0 90.0102 6097 3.8333 0.5882
0.0 91.0102 6164 3.8336 0.5882
0.0 92.0102 6231 3.8408 0.5882
0.0 93.0102 6298 3.8428 0.5882
0.0 94.0102 6365 3.8271 0.5882
0.0 95.0102 6432 3.8282 0.5882
0.0 96.0102 6499 3.8579 0.5882
0.0028 97.0102 6566 3.8781 0.5882
0.0 98.0052 6600 3.8801 0.5882
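
Validation accuracy peaks well before training ends (0.7647 at step 2613) while the final checkpoint sits at 0.5882 as the validation loss climbs, a typical overfitting pattern. A minimal sketch of selecting the best checkpoint from such a log, using a few rows sampled from the table above:

```python
# (step, validation_loss, accuracy) rows sampled from the training results above
log = [
    (536, 1.6521, 0.7353),
    (2613, 1.9358, 0.7647),
    (3417, 3.0377, 0.6765),
    (6600, 3.8801, 0.5882),
]

# Pick the checkpoint with the highest accuracy; break ties by lower loss.
best_step, best_loss, best_acc = max(log, key=lambda row: (row[2], -row[1]))
print(best_step, best_acc)  # 2613 0.7647
```

The Transformers `Trainer` can automate this via `load_best_model_at_end=True` and `metric_for_best_model` in `TrainingArguments`; whether that was enabled for this run is not stated.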

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0
Model size

  • 304M params (Safetensors, F32)

Model tree for beingbatman/MAE-CT-M1N0-v11

Unable to build the model tree: the base model loops back to the model itself.