mit-b0-CMP_semantic_seg_with_mps_v2

This model is a fine-tuned version of nvidia/mit-b0 on the CMP Facade dataset.

It achieves the following results on the evaluation set (a quick consistency check on the Mean IoU follows the list):

  • Loss: 1.0863
  • Mean IoU: 0.4097
  • Mean Accuracy: 0.5538
  • Overall Accuracy: 0.6951
  • Per Category IoU (rounded to the nearest ten-thousandth):
    • Segment 0: 0.5922
    • Segment 1: 0.5796
    • Segment 2: 0.5785
    • Segment 3: 0.2917
    • Segment 4: 0.3793
    • Segment 5: 0.3797
    • Segment 6: 0.4481
    • Segment 7: 0.4354
    • Segment 8: 0.2647
    • Segment 9: 0.4174
    • Segment 10: 0.1817
    • Segment 11: 0.3681
  • Per Category Accuracy (rounded to the nearest ten-thousandth):
    • Segment 0: 0.6884
    • Segment 1: 0.7852
    • Segment 2: 0.7323
    • Segment 3: 0.4523
    • Segment 4: 0.5829
    • Segment 5: 0.5516
    • Segment 6: 0.5904
    • Segment 7: 0.5289
    • Segment 8: 0.4518
    • Segment 9: 0.5719
    • Segment 10: 0.2318
    • Segment 11: 0.4783
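The reported Mean IoU is the unweighted average of the twelve per-category IoU values above; a minimal Python check (values copied from the list) confirms this:

```python
# Per-category IoU values copied from the list above.
per_category_iou = [
    0.5922, 0.5796, 0.5785, 0.2917, 0.3793, 0.3797,
    0.4481, 0.4354, 0.2647, 0.4174, 0.1817, 0.3681,
]

# The unweighted mean reproduces the reported Mean IoU.
mean_iou = sum(per_category_iou) / len(per_category_iou)
print(round(mean_iou, 4))  # 0.4097
```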

Model description

For more information on how this model was created, see the project notebook: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Computer%20Vision/Image%20Segmentation/Trained%2C%20But%20to%20My%20Standard/Center%20for%20Machine%20Perception/Version%202/Center%20for%20Machine%20Perception%20-%20semantic_segmentation_v2.ipynb

Intended uses & limitations

This model is intended to demonstrate my ability to solve a complex problem using technology. You are welcome to use it, but you do so at your own risk.
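If you do want to try it, here is a minimal inference sketch using the standard 🤗 Transformers SegFormer API; `facade.jpg` is a placeholder path, and the post-processing follows the usual SegFormer pattern rather than anything specific to this checkpoint:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "DunnBC22/mit-b0-CMP_semantic_seg_with_mps_v2"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("facade.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution, then take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits,
    size=image.size[::-1],  # PIL gives (W, H); interpolate expects (H, W)
    mode="bilinear",
    align_corners=False,
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of segment ids
```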

Training and evaluation data

Dataset Source: https://huggingface.co/datasets/Xpitfire/cmp_facade
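A short sketch of pulling the dataset with 🤗 Datasets; split and column names are not assumed here, so the snippet simply inspects what the Hub provides:

```python
from datasets import load_dataset

# Load the CMP Facade dataset from the Hub; splits and feature columns vary
# by dataset, so print the DatasetDict to see what this one actually contains.
dataset = load_dataset("Xpitfire/cmp_facade")
print(dataset)
```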

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto `TrainingArguments` follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
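A sketch of how these settings might be expressed with 🤗 `TrainingArguments`; the `output_dir` is a placeholder, and the notebook linked above remains the authoritative source:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; Adam betas/epsilon are the
# library defaults, which match the values reported in this card.
training_args = TrainingArguments(
    output_dir="mit-b0-CMP_semantic_seg_with_mps_v2",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```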

Training results

Overall Dataset Metrics

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy |
|:---|:---|:---|:---|:---|:---|:---|
| 1.6807 | 1.0 | 189 | 1.3310 | 0.2226 | 0.3388 | 0.5893 |
| 1.1837 | 2.0 | 378 | 1.1731 | 0.2602 | 0.3876 | 0.6122 |
| 1.0241 | 3.0 | 567 | 1.0485 | 0.2915 | 0.3954 | 0.6393 |
| 0.9353 | 4.0 | 756 | 0.9943 | 0.3054 | 0.4021 | 0.6570 |
| 0.8717 | 5.0 | 945 | 1.0010 | 0.3299 | 0.4440 | 0.6530 |
| 0.8238 | 6.0 | 1134 | 0.9537 | 0.3546 | 0.4771 | 0.6701 |
| 0.7708 | 7.0 | 1323 | 0.9789 | 0.3550 | 0.4837 | 0.6683 |
| 0.7415 | 8.0 | 1512 | 0.9738 | 0.3554 | 0.4634 | 0.6733 |
| 0.7018 | 9.0 | 1701 | 0.9449 | 0.3667 | 0.4802 | 0.6826 |
| 0.6820 | 10.0 | 1890 | 0.9422 | 0.3762 | 0.5047 | 0.6805 |
| 0.6503 | 11.0 | 2079 | 0.9889 | 0.3785 | 0.5082 | 0.6729 |
| 0.6330 | 12.0 | 2268 | 0.9594 | 0.3901 | 0.5224 | 0.6797 |
| 0.6035 | 13.0 | 2457 | 0.9612 | 0.3939 | 0.5288 | 0.6834 |
| 0.5874 | 14.0 | 2646 | 0.9657 | 0.3939 | 0.5383 | 0.6844 |
| 0.5684 | 15.0 | 2835 | 0.9762 | 0.3950 | 0.5446 | 0.6855 |
| 0.5485 | 16.0 | 3024 | 1.0645 | 0.3794 | 0.5095 | 0.6704 |
| 0.5402 | 17.0 | 3213 | 0.9747 | 0.4044 | 0.5600 | 0.6839 |
| 0.5275 | 18.0 | 3402 | 1.0054 | 0.3944 | 0.5411 | 0.6790 |
| 0.5032 | 19.0 | 3591 | 1.0014 | 0.3973 | 0.5256 | 0.6875 |
| 0.4985 | 20.0 | 3780 | 0.9893 | 0.3990 | 0.5468 | 0.6883 |
| 0.4925 | 21.0 | 3969 | 1.0416 | 0.3955 | 0.5339 | 0.6806 |
| 0.4772 | 22.0 | 4158 | 1.0142 | 0.3969 | 0.5476 | 0.6838 |
| 0.4707 | 23.0 | 4347 | 0.9896 | 0.4077 | 0.5458 | 0.6966 |
| 0.4601 | 24.0 | 4536 | 1.0040 | 0.4104 | 0.5551 | 0.6948 |
| 0.4544 | 25.0 | 4725 | 1.0093 | 0.4093 | 0.5652 | 0.6899 |
| 0.4421 | 26.0 | 4914 | 1.0434 | 0.4064 | 0.5448 | 0.6938 |
| 0.4293 | 27.0 | 5103 | 1.0391 | 0.4076 | 0.5571 | 0.6908 |
| 0.4312 | 28.0 | 5292 | 1.0037 | 0.4100 | 0.5534 | 0.6958 |
| 0.4309 | 29.0 | 5481 | 1.0288 | 0.4101 | 0.5493 | 0.6968 |
| 0.4146 | 30.0 | 5670 | 1.0602 | 0.4062 | 0.5445 | 0.6928 |
| 0.4106 | 31.0 | 5859 | 1.0573 | 0.4113 | 0.5520 | 0.6937 |
| 0.4102 | 32.0 | 6048 | 1.0616 | 0.4043 | 0.5444 | 0.6904 |
| 0.3940 | 33.0 | 6237 | 1.0244 | 0.4104 | 0.5587 | 0.6957 |
| 0.3865 | 34.0 | 6426 | 1.0618 | 0.4086 | 0.5468 | 0.6922 |
| 0.3816 | 35.0 | 6615 | 1.0515 | 0.4109 | 0.5587 | 0.6937 |
| 0.3803 | 36.0 | 6804 | 1.0709 | 0.4118 | 0.5507 | 0.6982 |
| 0.3841 | 37.0 | 6993 | 1.0646 | 0.4102 | 0.5423 | 0.7000 |
| 0.3830 | 38.0 | 7182 | 1.0769 | 0.4076 | 0.5463 | 0.6981 |
| 0.3831 | 39.0 | 7371 | 1.0821 | 0.4081 | 0.5438 | 0.6949 |
| 0.3701 | 40.0 | 7560 | 1.0971 | 0.4094 | 0.5503 | 0.6939 |
| 0.3728 | 41.0 | 7749 | 1.0850 | 0.4073 | 0.5426 | 0.6955 |
| 0.3693 | 42.0 | 7938 | 1.0969 | 0.4065 | 0.5503 | 0.6922 |
| 0.3627 | 43.0 | 8127 | 1.0932 | 0.4087 | 0.5497 | 0.6948 |
| 0.3707 | 44.0 | 8316 | 1.1095 | 0.4071 | 0.5449 | 0.6950 |
| 0.3715 | 45.0 | 8505 | 1.0884 | 0.4110 | 0.5481 | 0.6962 |
| 0.3637 | 46.0 | 8694 | 1.0893 | 0.4116 | 0.5565 | 0.6948 |
| 0.3581 | 47.0 | 8883 | 1.1164 | 0.4080 | 0.5443 | 0.6938 |
| 0.3595 | 48.0 | 9072 | 1.1264 | 0.4056 | 0.5374 | 0.6942 |
| 0.3604 | 49.0 | 9261 | 1.0948 | 0.4104 | 0.5508 | 0.6953 |
| 0.3541 | 50.0 | 9450 | 1.0863 | 0.4097 | 0.5538 | 0.6951 |

Per Category IoU For Each Segment

| Epoch | Segment 0 | Segment 1 | Segment 2 | Segment 3 | Segment 4 | Segment 5 | Segment 6 | Segment 7 | Segment 8 | Segment 9 | Segment 10 | Segment 11 |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 1.0 | 0.4635 | 0.4905 | 0.4698 | 0.0 | 0.2307 | 0.1515 | 0.2789 | 0.0002 | 0.0250 | 0.3527 | 0.0 | 0.2087 |
| 2.0 | 0.4240 | 0.5249 | 0.5152 | 0.0057 | 0.2636 | 0.2756 | 0.3312 | 0.0575 | 0.0539 | 0.3860 | 0.0 | 0.2854 |
| 3.0 | 0.5442 | 0.5037 | 0.5329 | 0.0412 | 0.3062 | 0.2714 | 0.3820 | 0.1430 | 0.0796 | 0.4007 | 0.0002 | 0.2929 |
| 4.0 | 0.5776 | 0.5289 | 0.5391 | 0.1171 | 0.3137 | 0.2600 | 0.3664 | 0.1527 | 0.1074 | 0.3935 | 0.0002 | 0.3078 |
| 5.0 | 0.4790 | 0.5506 | 0.5472 | 0.1547 | 0.3372 | 0.3297 | 0.4151 | 0.2339 | 0.1709 | 0.4081 | 0.0008 | 0.3314 |
| 6.0 | 0.5572 | 0.5525 | 0.5611 | 0.2076 | 0.3434 | 0.3163 | 0.4103 | 0.3279 | 0.2107 | 0.4191 | 0.0067 | 0.3418 |
| 7.0 | 0.5310 | 0.5634 | 0.5594 | 0.2299 | 0.3424 | 0.3375 | 0.4050 | 0.2883 | 0.2197 | 0.4142 | 0.0316 | 0.3373 |
| 8.0 | 0.5366 | 0.5659 | 0.5550 | 0.2331 | 0.3497 | 0.3334 | 0.4301 | 0.3401 | 0.1989 | 0.4181 | 0.0358 | 0.2680 |
| 9.0 | 0.5798 | 0.5657 | 0.5624 | 0.2368 | 0.3648 | 0.3271 | 0.4250 | 0.3207 | 0.2096 | 0.4236 | 0.0504 | 0.3346 |
| 10.0 | 0.5802 | 0.5622 | 0.5585 | 0.2340 | 0.3793 | 0.3407 | 0.4277 | 0.3801 | 0.2301 | 0.4216 | 0.0640 | 0.3367 |
| 11.0 | 0.5193 | 0.5649 | 0.5605 | 0.2698 | 0.3772 | 0.3526 | 0.4342 | 0.3433 | 0.2415 | 0.4336 | 0.0889 | 0.3562 |
| 12.0 | 0.5539 | 0.5641 | 0.5679 | 0.2658 | 0.3757 | 0.3510 | 0.4257 | 0.3993 | 0.2354 | 0.4338 | 0.1800 | 0.3287 |
| 13.0 | 0.5663 | 0.5666 | 0.5679 | 0.2631 | 0.3726 | 0.3609 | 0.4351 | 0.3759 | 0.2511 | 0.4256 | 0.1737 | 0.3681 |
| 14.0 | 0.5807 | 0.5670 | 0.5679 | 0.2670 | 0.3594 | 0.3605 | 0.4393 | 0.3863 | 0.2406 | 0.4228 | 0.1705 | 0.3652 |
| 15.0 | 0.5800 | 0.5711 | 0.5671 | 0.2825 | 0.3664 | 0.3587 | 0.4408 | 0.4021 | 0.2540 | 0.4246 | 0.1376 | 0.3548 |
| 16.0 | 0.4855 | 0.5683 | 0.5685 | 0.2612 | 0.3832 | 0.3628 | 0.4378 | 0.4056 | 0.2525 | 0.4206 | 0.1242 | 0.2825 |
| 17.0 | 0.5697 | 0.5674 | 0.5687 | 0.2971 | 0.3767 | 0.3741 | 0.4486 | 0.4126 | 0.2489 | 0.4260 | 0.1874 | 0.3757 |
| 18.0 | 0.5341 | 0.5728 | 0.5616 | 0.2827 | 0.3823 | 0.3782 | 0.4298 | 0.4070 | 0.2578 | 0.4195 | 0.1448 | 0.3632 |
| 19.0 | 0.5696 | 0.5739 | 0.5699 | 0.2918 | 0.3717 | 0.3635 | 0.4444 | 0.4122 | 0.2531 | 0.4142 | 0.1659 | 0.3369 |
| 20.0 | 0.5937 | 0.5702 | 0.5630 | 0.2892 | 0.3790 | 0.3757 | 0.4383 | 0.4110 | 0.2592 | 0.4147 | 0.1291 | 0.3653 |
| 21.0 | 0.5336 | 0.5723 | 0.5732 | 0.2843 | 0.3748 | 0.3738 | 0.4383 | 0.3876 | 0.2598 | 0.4170 | 0.1693 | 0.3624 |
| 22.0 | 0.5634 | 0.5752 | 0.5595 | 0.2783 | 0.3833 | 0.3540 | 0.4448 | 0.4054 | 0.2586 | 0.4145 | 0.1597 | 0.3660 |
| 23.0 | 0.6013 | 0.5801 | 0.5794 | 0.2988 | 0.3816 | 0.3736 | 0.4464 | 0.4241 | 0.2633 | 0.4162 | 0.1747 | 0.3530 |
| 24.0 | 0.6061 | 0.5756 | 0.5721 | 0.3086 | 0.3771 | 0.3707 | 0.4459 | 0.4242 | 0.2665 | 0.4104 | 0.1942 | 0.3732 |
| 25.0 | 0.5826 | 0.5745 | 0.5742 | 0.3109 | 0.3765 | 0.3784 | 0.4441 | 0.4184 | 0.2609 | 0.4219 | 0.1930 | 0.3765 |
| 26.0 | 0.5783 | 0.5821 | 0.5770 | 0.2985 | 0.3885 | 0.3582 | 0.4458 | 0.4220 | 0.2717 | 0.4260 | 0.1690 | 0.3600 |
| 27.0 | 0.5764 | 0.5777 | 0.5749 | 0.2868 | 0.3824 | 0.3857 | 0.4450 | 0.4170 | 0.2644 | 0.4295 | 0.1922 | - |
| 28.0 | 0.6023 | 0.5776 | 0.5769 | 0.2964 | 0.3759 | 0.3758 | 0.4464 | 0.4245 | 0.2712 | 0.4083 | 0.1967 | 0.3680 |
| 29.0 | 0.6043 | 0.5814 | 0.5728 | 0.2882 | 0.3867 | 0.3841 | 0.4369 | 0.4254 | 0.2659 | 0.4252 | 0.2106 | 0.3391 |
| 30.0 | 0.5840 | 0.5792 | 0.5750 | 0.2859 | 0.3839 | 0.3786 | 0.4479 | 0.4259 | 0.2664 | 0.3947 | 0.1753 | 0.3780 |
| 31.0 | 0.5819 | 0.5787 | 0.5775 | 0.2882 | 0.3861 | 0.3888 | 0.4522 | 0.4207 | 0.2722 | 0.4277 | 0.2050 | 0.3566 |
| 32.0 | 0.5769 | 0.5774 | 0.5737 | 0.2844 | 0.3762 | 0.3768 | 0.4424 | 0.4331 | 0.2649 | 0.3959 | 0.1748 | 0.3744 |
| 33.0 | 0.6076 | 0.5755 | 0.5774 | 0.2887 | 0.3833 | 0.3803 | 0.4483 | 0.4329 | 0.2687 | 0.4194 | 0.1884 | 0.3547 |
| 34.0 | 0.5729 | 0.5787 | 0.5789 | 0.2853 | 0.3854 | 0.3735 | 0.4469 | 0.4279 | 0.2694 | 0.4240 | 0.1986 | 0.3613 |
| 35.0 | 0.5942 | 0.5769 | 0.5777 | 0.2873 | 0.3867 | 0.3811 | 0.4448 | 0.4281 | 0.2669 | 0.4147 | 0.1956 | 0.3774 |
| 36.0 | 0.6024 | 0.5819 | 0.5782 | 0.2870 | 0.3850 | 0.3781 | 0.4469 | 0.4259 | 0.2696 | 0.4177 | 0.1885 | 0.3802 |
| 37.0 | 0.6099 | 0.5822 | 0.5787 | 0.2920 | 0.3827 | 0.3739 | 0.4416 | 0.4271 | 0.2646 | 0.4200 | 0.1864 | 0.3637 |
| 38.0 | 0.6028 | 0.5823 | 0.5799 | 0.2887 | 0.3828 | 0.3770 | 0.4470 | 0.4238 | 0.2639 | 0.4197 | 0.1617 | 0.3610 |
| 39.0 | 0.5856 | 0.5809 | 0.5772 | 0.2889 | 0.3772 | 0.3683 | 0.4493 | 0.4296 | 0.2665 | 0.4112 | 0.1902 | 0.3723 |
| 40.0 | 0.5830 | 0.5808 | 0.5785 | 0.2947 | 0.3803 | 0.3832 | 0.4496 | 0.4284 | 0.2675 | 0.4111 | 0.1913 | 0.3644 |
| 41.0 | 0.5853 | 0.5827 | 0.5786 | 0.2921 | 0.3809 | 0.3712 | 0.4464 | 0.4330 | 0.2670 | 0.4180 | 0.1631 | 0.3694 |
| 42.0 | 0.5756 | 0.5804 | 0.5766 | 0.2872 | 0.3775 | 0.3786 | 0.4480 | 0.4396 | 0.2669 | 0.4132 | 0.1619 | 0.3729 |
| 43.0 | 0.5872 | 0.5821 | 0.5762 | 0.2896 | 0.3820 | 0.3742 | 0.4499 | 0.4346 | 0.2685 | 0.4164 | 0.1848 | 0.3597 |
| 44.0 | 0.5894 | 0.5823 | 0.5774 | 0.2917 | 0.3801 | 0.3754 | 0.4476 | 0.4287 | 0.2635 | 0.4096 | 0.1911 | 0.3478 |
| 45.0 | 0.5912 | 0.5809 | 0.5791 | 0.2980 | 0.3817 | 0.3750 | 0.4483 | 0.4349 | 0.2677 | 0.4155 | 0.1909 | 0.3686 |
| 46.0 | 0.5922 | 0.5794 | 0.5788 | 0.2952 | 0.3804 | 0.3754 | 0.4487 | 0.4356 | 0.2641 | 0.4159 | 0.2068 | 0.3666 |
| 47.0 | 0.5748 | 0.5822 | 0.5779 | 0.2909 | 0.3849 | 0.3751 | 0.4487 | 0.4350 | 0.2687 | 0.4150 | 0.1785 | 0.3643 |
| 48.0 | 0.5787 | 0.5823 | 0.5789 | 0.2896 | 0.3819 | 0.3750 | 0.4479 | 0.4224 | 0.2665 | 0.4140 | 0.1723 | 0.3580 |
| 49.0 | 0.5878 | 0.5812 | 0.5782 | 0.2930 | 0.3807 | 0.3796 | 0.4482 | 0.4364 | 0.2659 | 0.4139 | 0.1915 | 0.3678 |
| 50.0 | 0.5922 | 0.5796 | 0.5785 | 0.2917 | 0.3793 | 0.3797 | 0.4481 | 0.4354 | 0.2647 | 0.4174 | 0.1817 | 0.3681 |

Per Category Accuracy For Each Segment

| Epoch | Segment 0 | Segment 1 | Segment 2 | Segment 3 | Segment 4 | Segment 5 | Segment 6 | Segment 7 | Segment 8 | Segment 9 | Segment 10 | Segment 11 |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 1.0 | 0.6133 | 0.6847 | 0.7408 | 0.0 | 0.4973 | 0.1720 | 0.4073 | 0.0002 | 0.0255 | 0.6371 | 0.0 | 0.2874 |
| 2.0 | 0.4782 | 0.7844 | 0.6966 | 0.0057 | 0.5735 | 0.3684 | 0.6226 | 0.0577 | 0.0563 | 0.5907 | 0.0 | 0.4168 |
| 3.0 | 0.8126 | 0.6852 | 0.6683 | 0.0420 | 0.4972 | 0.3418 | 0.5121 | 0.1453 | 0.0849 | 0.5882 | 0.0002 | 0.3672 |
| 4.0 | 0.8079 | 0.7362 | 0.6803 | 0.1231 | 0.5129 | 0.3324 | 0.4212 | 0.1554 | 0.1223 | 0.5587 | 0.0002 | 0.3751 |
| 5.0 | 0.5408 | 0.8111 | 0.7439 | 0.1647 | 0.5336 | 0.4720 | 0.5650 | 0.2459 | 0.2127 | 0.6032 | 0.0008 | 0.4343 |
| 6.0 | 0.6870 | 0.7532 | 0.7389 | 0.2428 | 0.5081 | 0.4173 | 0.5923 | 0.3710 | 0.3117 | 0.6181 | 0.0068 | 0.4785 |
| 7.0 | 0.6050 | 0.7961 | 0.7434 | 0.2876 | 0.5835 | 0.4949 | 0.5608 | 0.3103 | 0.3672 | 0.6185 | 0.0345 | 0.4022 |
| 8.0 | 0.6081 | 0.8461 | 0.6598 | 0.3035 | 0.5720 | 0.4540 | 0.5735 | 0.3849 | 0.2642 | 0.5608 | 0.0379 | 0.2962 |
| 9.0 | 0.7241 | 0.7684 | 0.7677 | 0.2958 | 0.5321 | 0.4212 | 0.5547 | 0.3513 | 0.2813 | 0.5645 | 0.0544 | 0.4465 |
| 10.0 | 0.7124 | 0.7649 | 0.7024 | 0.2879 | 0.5535 | 0.4413 | 0.6310 | 0.4960 | 0.3982 | 0.5592 | 0.0724 | 0.4370 |
| 11.0 | 0.5876 | 0.8060 | 0.7296 | 0.3838 | 0.5267 | 0.4983 | 0.5902 | 0.3838 | 0.4151 | 0.5987 | 0.1030 | 0.4756 |
| 12.0 | 0.6497 | 0.7807 | 0.7448 | 0.4018 | 0.5381 | 0.4615 | 0.5849 | 0.4883 | 0.3248 | 0.6063 | 0.2918 | 0.3958 |
| 13.0 | 0.6650 | 0.7792 | 0.7595 | 0.4049 | 0.5501 | 0.4940 | 0.5831 | 0.4375 | 0.3843 | 0.5591 | 0.2578 | 0.4711 |
| 14.0 | 0.6881 | 0.7715 | 0.7076 | 0.4518 | 0.6011 | 0.4900 | 0.6235 | 0.4466 | 0.3627 | 0.5934 | 0.2537 | 0.4702 |
| 15.0 | 0.6690 | 0.7721 | 0.7253 | 0.4607 | 0.6286 | 0.4900 | 0.5936 | 0.4951 | 0.4337 | 0.6295 | 0.1749 | 0.4630 |
| 16.0 | 0.5250 | 0.8335 | 0.7460 | 0.3742 | 0.6114 | 0.4823 | 0.5880 | 0.5021 | 0.4084 | 0.5757 | 0.1498 | 0.3171 |
| 17.0 | 0.6652 | 0.7673 | 0.7058 | 0.4318 | 0.5995 | 0.5137 | 0.6112 | 0.5596 | 0.4548 | 0.5819 | 0.2821 | 0.5465 |
| 18.0 | 0.6012 | 0.8091 | 0.6765 | 0.4561 | 0.5707 | 0.5393 | 0.6255 | 0.5679 | 0.4347 | 0.5567 | 0.1806 | 0.4751 |
| 19.0 | 0.6634 | 0.8079 | 0.6986 | 0.4389 | 0.5274 | 0.4876 | 0.6232 | 0.5022 | 0.3717 | 0.5244 | 0.2232 | 0.4388 |
| 20.0 | 0.7110 | 0.7679 | 0.6952 | 0.4875 | 0.5261 | 0.5549 | 0.6444 | 0.5301 | 0.4512 | 0.5441 | 0.1603 | 0.4888 |
| 21.0 | 0.5945 | 0.8130 | 0.7299 | 0.4511 | 0.5922 | 0.5324 | 0.5643 | 0.4341 | 0.4067 | 0.5834 | 0.2272 | 0.4781 |
| 22.0 | 0.6478 | 0.7921 | 0.6887 | 0.4826 | 0.5784 | 0.4599 | 0.6029 | 0.5938 | 0.4905 | 0.5605 | 0.2094 | 0.4644 |
| 23.0 | 0.7110 | 0.7878 | 0.7192 | 0.4629 | 0.5670 | 0.5061 | 0.5891 | 0.5354 | 0.4442 | 0.5585 | 0.2280 | 0.4401 |
| 24.0 | 0.7277 | 0.7718 | 0.7095 | 0.4789 | 0.5401 | 0.5080 | 0.6040 | 0.5314 | 0.4573 | 0.5414 | 0.2853 | 0.5062 |
| 25.0 | 0.6781 | 0.7703 | 0.7305 | 0.5102 | 0.5954 | 0.5311 | 0.5960 | 0.5286 | 0.4647 | 0.5861 | 0.2676 | 0.5242 |
| 26.0 | 0.6603 | 0.7989 | 0.7349 | 0.4689 | 0.5677 | 0.4620 | 0.6111 | 0.5258 | 0.4556 | 0.5889 | 0.2110 | 0.4530 |
| 27.0 | - | - | - | - | - | - | - | - | - | - | - | - |
| 28.0 | 0.7218 | 0.7735 | 0.7273 | 0.4297 | 0.6001 | 0.5321 | - | - | - | - | - | - |
| 29.0 | 0.7054 | 0.7948 | 0.7009 | 0.4552 | 0.5413 | 0.5357 | 0.5421 | 0.5250 | 0.4701 | 0.5949 | 0.3048 | 0.4213 |
| 30.0 | 0.6744 | 0.8004 | 0.7289 | 0.4421 | 0.5410 | 0.5409 | 0.5822 | 0.5334 | 0.4790 | 0.5028 | 0.2177 | 0.4910 |
| 31.0 | 0.6622 | 0.7858 | 0.7534 | 0.3855 | 0.5707 | 0.5889 | 0.5902 | 0.4979 | 0.4268 | 0.6260 | 0.2735 | 0.4630 |
| 32.0 | 0.6629 | 0.7960 | 0.7345 | 0.4132 | 0.5703 | 0.5450 | 0.5855 | 0.5469 | 0.4371 | 0.5087 | 0.2178 | 0.5147 |
| 33.0 | 0.7279 | 0.7642 | 0.7250 | 0.4999 | 0.5330 | 0.5418 | 0.6148 | 0.5491 | 0.4678 | 0.5808 | 0.2548 | 0.4455 |
| 34.0 | 0.6571 | 0.8002 | 0.7190 | 0.4516 | 0.5621 | 0.5183 | 0.5822 | 0.5444 | 0.3994 | 0.5931 | 0.2752 | 0.4588 |
| 35.0 | 0.6946 | 0.7771 | 0.7289 | 0.4481 | 0.5478 | 0.5396 | 0.5834 | 0.5407 | 0.4980 | 0.5652 | 0.2696 | 0.5116 |
| 36.0 | 0.7040 | 0.7881 | 0.7314 | 0.4432 | 0.5429 | 0.5308 | 0.5705 | 0.5124 | 0.4619 | 0.5667 | 0.2465 | 0.5101 |
| 37.0 | 0.7277 | 0.7884 | 0.7298 | 0.4325 | 0.5471 | 0.5196 | 0.5523 | 0.5073 | 0.4390 | 0.5614 | 0.2453 | 0.4575 |
| 38.0 | 0.7092 | 0.7907 | 0.7297 | 0.4713 | 0.5626 | 0.5483 | 0.5667 | 0.5067 | 0.4552 | 0.5608 | 0.2002 | 0.4545 |
| 39.0 | 0.6763 | 0.8000 | 0.7345 | 0.4678 | 0.5544 | 0.5005 | 0.5818 | 0.5236 | 0.4071 | 0.5436 | 0.2496 | 0.4865 |
| 40.0 | 0.6681 | 0.8020 | 0.7232 | 0.4519 | 0.5724 | 0.5465 | 0.5828 | 0.5132 | 0.4686 | 0.5479 | 0.2589 | 0.4678 |
| 41.0 | 0.6698 | 0.8022 | 0.7318 | 0.4297 | 0.5493 | 0.5160 | 0.5727 | 0.5289 | 0.4574 | 0.5711 | 0.1978 | 0.4842 |
| 42.0 | 0.6542 | 0.7977 | 0.7309 | 0.4450 | 0.5653 | 0.5389 | 0.5874 | 0.5625 | 0.4662 | 0.5561 | 0.1969 | 0.5024 |
| 43.0 | 0.6732 | 0.7995 | 0.7126 | 0.4343 | 0.5636 | 0.5217 | 0.5952 | 0.5608 | 0.4679 | 0.5672 | 0.2449 | 0.4559 |
| 44.0 | 0.6797 | 0.8035 | 0.7234 | 0.4571 | 0.5651 | 0.5352 | 0.5728 | 0.5156 | 0.4591 | 0.5458 | 0.2506 | 0.4307 |
| 45.0 | 0.6866 | 0.7923 | 0.7332 | 0.4349 | 0.5523 | 0.5312 | 0.5855 | 0.5314 | 0.4323 | 0.5653 | 0.2488 | 0.4833 |
| 46.0 | 0.6868 | 0.7856 | 0.7297 | 0.4426 | 0.5763 | 0.5288 | 0.5846 | 0.5331 | 0.4573 | 0.5724 | 0.2999 | 0.4811 |
| 47.0 | 0.6506 | 0.8100 | 0.7248 | 0.4534 | 0.5506 | 0.5230 | 0.5954 | 0.5515 | 0.4251 | 0.5546 | 0.2245 | 0.4677 |
| 48.0 | 0.6590 | 0.8106 | 0.7334 | 0.4353 | 0.5542 | 0.5254 | 0.5813 | 0.4869 | 0.4373 | 0.5611 | 0.2135 | 0.4503 |
| 49.0 | 0.6790 | 0.7967 | 0.7227 | 0.4477 | 0.5612 | 0.5523 | 0.5861 | 0.5460 | 0.4310 | 0.5518 | 0.2535 | 0.4817 |
| 50.0 | 0.6884 | 0.7852 | 0.7323 | 0.4523 | 0.5829 | 0.5516 | 0.5904 | 0.5289 | 0.4518 | 0.5719 | 0.2318 | 0.4783 |
  • All values in the tables above are rounded to the nearest ten-thousandth.

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.12.1
  • Datasets 2.9.0
  • Tokenizers 0.12.1
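To approximate this environment, something like `pip install transformers==4.26.1 torch==1.12.1 datasets==2.9.0 tokenizers==0.12.1` should work (note that PyTorch's PyPI package is named `torch`).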