# mask2former-swin-base-coco-instance-cvppp-a1-ft
This model is a fine-tuned version of facebook/mask2former-swin-base-coco-instance on the fengchen025/cvppp-a1 dataset. It achieves the following results on the evaluation set:
- Loss: 8.1314
- mAP: 0.4892
- mAP@50: 0.6847
- mAP@75: 0.5666
- mAP (small): 0.4391
- mAP (medium): 0.5228
- mAP (large): -1.0
- mAR@1: 0.0532
- mAR@10: 0.4909
- mAR@100: 0.7390
- mAR (small): 0.5736
- mAR (medium): 0.8257
- mAR (large): -1.0
- mAP per class: 0.4892
- mAR@100 per class: 0.7390

A value of -1.0 for the "large" size range indicates that the evaluation set contains no ground-truth instances in that range, following COCO-evaluation conventions.
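For instance-segmentation inference, the checkpoint can be loaded with the standard 🤗 Transformers Mask2Former classes. The sketch below is illustrative; the function name `segment_leaves` and the example image path are not part of this repository:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

CHECKPOINT = "fengchen025/mask2former-swin-base-coco-instance-cvppp-a1-ft"

def segment_leaves(image: Image.Image, checkpoint: str = CHECKPOINT) -> dict:
    """Run instance segmentation on one plant image and return the
    post-processed result with a `segmentation` map and `segments_info`."""
    processor = AutoImageProcessor.from_pretrained(checkpoint)
    model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint)
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # target_sizes expects (height, width); PIL's .size is (width, height).
    return processor.post_process_instance_segmentation(
        outputs, target_sizes=[image.size[::-1]]
    )[0]

# result = segment_leaves(Image.open("plant.png"))      # downloads the checkpoint
# result["segmentation"]   # (H, W) tensor of per-pixel instance ids
# result["segments_info"]  # one dict per instance: "id", "label_id", "score"
```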
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 50.0
- mixed_precision_training: Native AMP
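The hyperparameters above map directly onto a standard PyTorch setup. A minimal sketch, with a dummy linear layer standing in for the Mask2Former model:

```python
import torch

# Placeholder parameters standing in for the Mask2Former model.
model = torch.nn.Linear(4, 2)

# Adam with the hyperparameters listed above.
optimizer = torch.optim.Adam(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), eps=1e-8
)

# A "constant" schedule keeps the learning rate fixed for all 50 epochs;
# factor=1.0 leaves the base rate unchanged.
scheduler = torch.optim.lr_scheduler.ConstantLR(optimizer, factor=1.0, total_iters=0)

# Native AMP mixed precision (a no-op on CPU-only machines).
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())
```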
### Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP per class | mAR@100 per class |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
39.3962 | 1.0 | 15 | 21.0739 | 0.2211 | 0.4731 | 0.1821 | 0.1131 | 0.2731 | -1.0 | 0.0286 | 0.2740 | 0.5117 | 0.2849 | 0.6307 | -1.0 | 0.2211 | 0.5117 |
19.3166 | 2.0 | 30 | 16.2822 | 0.3542 | 0.5935 | 0.3793 | 0.2599 | 0.4156 | -1.0 | 0.0422 | 0.3883 | 0.5760 | 0.3792 | 0.6792 | -1.0 | 0.3542 | 0.5760 |
16.0327 | 3.0 | 45 | 14.8551 | 0.3826 | 0.6235 | 0.3937 | 0.2844 | 0.4328 | -1.0 | 0.0448 | 0.4208 | 0.6058 | 0.4226 | 0.7020 | -1.0 | 0.3826 | 0.6058 |
14.9212 | 4.0 | 60 | 14.1509 | 0.3870 | 0.6213 | 0.4268 | 0.3053 | 0.4422 | -1.0 | 0.0442 | 0.4136 | 0.5981 | 0.4094 | 0.6970 | -1.0 | 0.3870 | 0.5981 |
14.1777 | 5.0 | 75 | 13.3047 | 0.3968 | 0.6533 | 0.4284 | 0.3201 | 0.4410 | -1.0 | 0.0435 | 0.4156 | 0.6312 | 0.4453 | 0.7287 | -1.0 | 0.3968 | 0.6312 |
13.3037 | 6.0 | 90 | 12.1355 | 0.4169 | 0.6756 | 0.4283 | 0.3320 | 0.4631 | -1.0 | 0.0455 | 0.4422 | 0.6448 | 0.4736 | 0.7347 | -1.0 | 0.4169 | 0.6448 |
12.7922 | 7.0 | 105 | 11.9884 | 0.4204 | 0.6434 | 0.4713 | 0.3444 | 0.4678 | -1.0 | 0.0448 | 0.4506 | 0.6669 | 0.5038 | 0.7525 | -1.0 | 0.4204 | 0.6669 |
11.8428 | 8.0 | 120 | 11.1682 | 0.4221 | 0.6415 | 0.4704 | 0.3662 | 0.4629 | -1.0 | 0.0455 | 0.4532 | 0.6701 | 0.5189 | 0.7495 | -1.0 | 0.4221 | 0.6701 |
11.4887 | 9.0 | 135 | 11.3821 | 0.4237 | 0.6425 | 0.4809 | 0.3503 | 0.4697 | -1.0 | 0.0455 | 0.4571 | 0.6675 | 0.5019 | 0.7545 | -1.0 | 0.4237 | 0.6675 |
11.4137 | 10.0 | 150 | 10.7176 | 0.4526 | 0.6702 | 0.5098 | 0.3592 | 0.5070 | -1.0 | 0.0565 | 0.4610 | 0.6870 | 0.5245 | 0.7723 | -1.0 | 0.4526 | 0.6870 |
11.0166 | 11.0 | 165 | 10.6001 | 0.4586 | 0.6851 | 0.5253 | 0.3706 | 0.5087 | -1.0 | 0.0500 | 0.4571 | 0.6838 | 0.5472 | 0.7554 | -1.0 | 0.4586 | 0.6838 |
10.7357 | 12.0 | 180 | 10.4470 | 0.4478 | 0.6602 | 0.5013 | 0.3750 | 0.4918 | -1.0 | 0.0519 | 0.4688 | 0.6955 | 0.5264 | 0.7842 | -1.0 | 0.4478 | 0.6955 |
10.6097 | 13.0 | 195 | 10.4092 | 0.4738 | 0.6906 | 0.5290 | 0.3608 | 0.5398 | -1.0 | 0.0565 | 0.4695 | 0.6870 | 0.5245 | 0.7723 | -1.0 | 0.4738 | 0.6870 |
10.1015 | 14.0 | 210 | 9.9768 | 0.4372 | 0.6690 | 0.4708 | 0.3689 | 0.4776 | -1.0 | 0.0519 | 0.4597 | 0.6727 | 0.5113 | 0.7574 | -1.0 | 0.4372 | 0.6727 |
10.0756 | 15.0 | 225 | 10.3155 | 0.4536 | 0.6756 | 0.4957 | 0.3664 | 0.5020 | -1.0 | 0.0455 | 0.4649 | 0.7052 | 0.5453 | 0.7891 | -1.0 | 0.4536 | 0.7052 |
9.773 | 16.0 | 240 | 10.4804 | 0.4510 | 0.6772 | 0.5103 | 0.3897 | 0.4925 | -1.0 | 0.0448 | 0.4617 | 0.7104 | 0.5547 | 0.7921 | -1.0 | 0.4510 | 0.7104 |
9.9397 | 17.0 | 255 | 9.5911 | 0.4657 | 0.6811 | 0.5238 | 0.4039 | 0.5043 | -1.0 | 0.0526 | 0.4766 | 0.7058 | 0.5453 | 0.7901 | -1.0 | 0.4657 | 0.7058 |
9.4745 | 18.0 | 270 | 9.4730 | 0.4550 | 0.6912 | 0.4941 | 0.3794 | 0.5014 | -1.0 | 0.0519 | 0.4695 | 0.7058 | 0.5358 | 0.7950 | -1.0 | 0.4550 | 0.7058 |
9.4424 | 19.0 | 285 | 9.5308 | 0.4389 | 0.6686 | 0.4737 | 0.3505 | 0.4841 | -1.0 | 0.0519 | 0.4701 | 0.6825 | 0.4981 | 0.7792 | -1.0 | 0.4389 | 0.6825 |
9.2619 | 20.0 | 300 | 10.2104 | 0.4552 | 0.6874 | 0.5007 | 0.3839 | 0.5102 | -1.0 | 0.0429 | 0.4669 | 0.6838 | 0.5283 | 0.7653 | -1.0 | 0.4552 | 0.6838 |
9.2045 | 21.0 | 315 | 9.6575 | 0.4618 | 0.6738 | 0.5172 | 0.3904 | 0.5113 | -1.0 | 0.0591 | 0.4630 | 0.7006 | 0.5321 | 0.7891 | -1.0 | 0.4618 | 0.7006 |
9.2804 | 22.0 | 330 | 9.4810 | 0.4593 | 0.6918 | 0.5111 | 0.3870 | 0.5076 | -1.0 | 0.0578 | 0.4649 | 0.7110 | 0.5472 | 0.7970 | -1.0 | 0.4593 | 0.7110 |
9.0051 | 23.0 | 345 | 9.6407 | 0.4498 | 0.6726 | 0.4965 | 0.3774 | 0.5037 | -1.0 | 0.0532 | 0.4669 | 0.6994 | 0.5170 | 0.7950 | -1.0 | 0.4498 | 0.6994 |
9.0604 | 24.0 | 360 | 9.4909 | 0.4700 | 0.6807 | 0.5087 | 0.4251 | 0.5149 | -1.0 | 0.0578 | 0.4727 | 0.7039 | 0.5302 | 0.7950 | -1.0 | 0.4700 | 0.7039 |
8.8417 | 25.0 | 375 | 9.5084 | 0.4638 | 0.6895 | 0.5165 | 0.4111 | 0.5066 | -1.0 | 0.0461 | 0.4662 | 0.7065 | 0.5396 | 0.7941 | -1.0 | 0.4638 | 0.7065 |
8.6481 | 26.0 | 390 | 9.2261 | 0.4792 | 0.6964 | 0.5357 | 0.3961 | 0.5348 | -1.0 | 0.0584 | 0.4812 | 0.7195 | 0.5698 | 0.7980 | -1.0 | 0.4792 | 0.7195 |
8.8093 | 27.0 | 405 | 9.0720 | 0.4771 | 0.6920 | 0.5341 | 0.4281 | 0.5160 | -1.0 | 0.0513 | 0.4682 | 0.7175 | 0.5698 | 0.7950 | -1.0 | 0.4771 | 0.7175 |
8.4759 | 28.0 | 420 | 8.7442 | 0.4779 | 0.7007 | 0.5180 | 0.4187 | 0.5212 | -1.0 | 0.0571 | 0.4688 | 0.7214 | 0.5528 | 0.8099 | -1.0 | 0.4779 | 0.7214 |
8.1716 | 29.0 | 435 | 8.7102 | 0.4749 | 0.6801 | 0.5537 | 0.4114 | 0.5306 | -1.0 | 0.0532 | 0.4727 | 0.7149 | 0.5377 | 0.8079 | -1.0 | 0.4749 | 0.7149 |
7.9733 | 30.0 | 450 | 8.6251 | 0.4845 | 0.6957 | 0.5308 | 0.4179 | 0.5290 | -1.0 | 0.0591 | 0.4877 | 0.7195 | 0.5528 | 0.8069 | -1.0 | 0.4845 | 0.7195 |
7.9114 | 31.0 | 465 | 9.0874 | 0.4729 | 0.6975 | 0.5221 | 0.4137 | 0.5087 | -1.0 | 0.0487 | 0.4786 | 0.7156 | 0.5774 | 0.7881 | -1.0 | 0.4729 | 0.7156 |
7.9404 | 32.0 | 480 | 8.4683 | 0.4768 | 0.6950 | 0.5331 | 0.4201 | 0.5192 | -1.0 | 0.0591 | 0.4799 | 0.7234 | 0.5698 | 0.8040 | -1.0 | 0.4768 | 0.7234 |
8.0908 | 33.0 | 495 | 9.1853 | 0.4768 | 0.6974 | 0.5357 | 0.3872 | 0.5308 | -1.0 | 0.0532 | 0.4857 | 0.7143 | 0.5396 | 0.8059 | -1.0 | 0.4768 | 0.7143 |
7.9822 | 34.0 | 510 | 8.5842 | 0.4712 | 0.6871 | 0.5407 | 0.4397 | 0.5086 | -1.0 | 0.0591 | 0.4714 | 0.7247 | 0.5585 | 0.8119 | -1.0 | 0.4712 | 0.7247 |
7.9252 | 35.0 | 525 | 8.9616 | 0.4670 | 0.6817 | 0.5517 | 0.4207 | 0.5048 | -1.0 | 0.0519 | 0.4734 | 0.7208 | 0.5642 | 0.8030 | -1.0 | 0.4670 | 0.7208 |
7.7363 | 36.0 | 540 | 8.7401 | 0.4718 | 0.6898 | 0.5241 | 0.4015 | 0.5241 | -1.0 | 0.0526 | 0.4877 | 0.7221 | 0.5415 | 0.8168 | -1.0 | 0.4718 | 0.7221 |
7.6896 | 37.0 | 555 | 8.8058 | 0.4806 | 0.6912 | 0.5278 | 0.4108 | 0.5280 | -1.0 | 0.0591 | 0.4799 | 0.7305 | 0.5547 | 0.8228 | -1.0 | 0.4806 | 0.7305 |
7.6104 | 38.0 | 570 | 8.1799 | 0.4748 | 0.6876 | 0.5268 | 0.4338 | 0.5144 | -1.0 | 0.0532 | 0.4825 | 0.7247 | 0.5547 | 0.8139 | -1.0 | 0.4748 | 0.7247 |
7.2744 | 39.0 | 585 | 8.4149 | 0.4737 | 0.6780 | 0.5461 | 0.3985 | 0.5228 | -1.0 | 0.0597 | 0.4779 | 0.7201 | 0.5415 | 0.8139 | -1.0 | 0.4737 | 0.7201 |
7.2528 | 40.0 | 600 | 8.5222 | 0.4797 | 0.6828 | 0.5380 | 0.4185 | 0.5213 | -1.0 | 0.0526 | 0.4831 | 0.7234 | 0.5566 | 0.8109 | -1.0 | 0.4797 | 0.7234 |
7.0455 | 41.0 | 615 | 8.4200 | 0.4782 | 0.6931 | 0.5453 | 0.4118 | 0.5284 | -1.0 | 0.0584 | 0.4818 | 0.7247 | 0.5547 | 0.8139 | -1.0 | 0.4782 | 0.7247 |
7.0402 | 42.0 | 630 | 8.1926 | 0.4898 | 0.6884 | 0.5483 | 0.4347 | 0.5366 | -1.0 | 0.0591 | 0.4890 | 0.7344 | 0.5660 | 0.8228 | -1.0 | 0.4898 | 0.7344 |
7.0456 | 43.0 | 645 | 8.5564 | 0.4859 | 0.6750 | 0.5503 | 0.4107 | 0.5366 | -1.0 | 0.0610 | 0.4955 | 0.7383 | 0.5604 | 0.8317 | -1.0 | 0.4859 | 0.7383 |
6.9132 | 44.0 | 660 | 8.7082 | 0.4910 | 0.6897 | 0.5585 | 0.4342 | 0.5394 | -1.0 | 0.0532 | 0.4987 | 0.7357 | 0.5585 | 0.8287 | -1.0 | 0.4910 | 0.7357 |
7.1349 | 45.0 | 675 | 8.1748 | 0.4908 | 0.6962 | 0.5585 | 0.4296 | 0.5403 | -1.0 | 0.0526 | 0.4896 | 0.7364 | 0.5792 | 0.8188 | -1.0 | 0.4908 | 0.7364 |
6.9486 | 46.0 | 690 | 8.2447 | 0.4772 | 0.6904 | 0.5290 | 0.4158 | 0.5266 | -1.0 | 0.0597 | 0.4916 | 0.7338 | 0.5547 | 0.8277 | -1.0 | 0.4772 | 0.7338 |
7.1528 | 47.0 | 705 | 8.2414 | 0.4902 | 0.6889 | 0.5501 | 0.4370 | 0.5336 | -1.0 | 0.0526 | 0.4870 | 0.7344 | 0.5736 | 0.8188 | -1.0 | 0.4902 | 0.7344 |
7.0135 | 48.0 | 720 | 7.6569 | 0.4972 | 0.7006 | 0.5627 | 0.4546 | 0.5327 | -1.0 | 0.0604 | 0.4981 | 0.7468 | 0.5906 | 0.8287 | -1.0 | 0.4972 | 0.7468 |
6.6802 | 49.0 | 735 | 7.9771 | 0.4954 | 0.6861 | 0.5512 | 0.4406 | 0.5370 | -1.0 | 0.0539 | 0.5000 | 0.7409 | 0.5736 | 0.8287 | -1.0 | 0.4954 | 0.7409 |
6.7759 | 50.0 | 750 | 8.1314 | 0.4892 | 0.6847 | 0.5666 | 0.4391 | 0.5228 | -1.0 | 0.0532 | 0.4909 | 0.7390 | 0.5736 | 0.8257 | -1.0 | 0.4892 | 0.7390 |
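Note that validation loss bottoms out at epoch 48 (7.6569) rather than at the final epoch 50 (8.1314), so checkpoint selection matters here. A small sketch picking the best epoch from an excerpt of the log:

```python
# (epoch, validation loss) pairs excerpted from the training-results table.
history = [
    (46, 8.2447),
    (47, 8.2414),
    (48, 7.6569),
    (49, 7.9771),
    (50, 8.1314),
]

# Select the checkpoint with the lowest validation loss.
best_epoch, best_loss = min(history, key=lambda row: row[1])
print(best_epoch, best_loss)  # epoch 48, loss 7.6569
```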
### Framework versions
- Transformers 4.44.0
- Pytorch 2.1.0.dev20230618
- Datasets 2.11.0
- Tokenizers 0.19.1