---
license: other
base_model: nvidia/mit-b5
tags:
- image-segmentation
- vision
- generated_from_trainer
model-index:
- name: ecc_segformerv2
  results: []
---

# ecc_segformerv2

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the rishitunu/ecc_crackdetector_dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3478
- Mean Iou: 0.0862
- Mean Accuracy: 0.1924
- Overall Accuracy: 0.1924
- Accuracy Background: nan
- Accuracy Crack: 0.1924
- Iou Background: 0.0
- Iou Crack: 0.1723

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 10000
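
No training script is included in this card. The following is a minimal sketch of how the hyperparameters above could be expressed with the 🤗 `Trainer` API; the dataset split names, the `image`/`label` column names, the preprocessing, and the per-epoch evaluation strategy are assumptions, not details taken from the original run.

```python
# Sketch only: column names ("image", "label"), split names, and the
# per-epoch evaluation strategy are assumptions about the original setup.
from datasets import load_dataset
from transformers import (
    SegformerForSemanticSegmentation,
    SegformerImageProcessor,
    Trainer,
    TrainingArguments,
)

id2label = {0: "background", 1: "crack"}
label2id = {name: idx for idx, name in id2label.items()}

# Start from the same backbone named in this card; the decode head is
# randomly initialised and learned during fine-tuning.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b5", id2label=id2label, label2id=label2id
)
processor = SegformerImageProcessor()

dataset = load_dataset("rishitunu/ecc_crackdetector_dataset")  # split names may differ

def preprocess(batch):
    # Assumes RGB images in "image" and integer masks in "label".
    return processor(batch["image"], batch["label"], return_tensors="pt")

train_ds = dataset["train"].with_transform(preprocess)
eval_ds = dataset["validation"].with_transform(preprocess)

training_args = TrainingArguments(
    output_dir="ecc_segformerv2",
    learning_rate=6e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=1337,
    lr_scheduler_type="polynomial",
    max_steps=10_000,                # "training_steps" above
    evaluation_strategy="epoch",     # assumption, matching the per-epoch table below
    remove_unused_columns=False,     # keep the raw columns for the transform above
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
)
trainer.train()
```

`Trainer` uses AdamW by default, whose betas=(0.9, 0.999) and epsilon=1e-08 match the optimizer settings listed above.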
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crack | Iou Background | Iou Crack |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:--------------:|:--------------:|:---------:|
| 0.1019 | 1.0 | 251 | 0.5116 | 0.1490 | 0.3280 | 0.3280 | nan | 0.3280 | 0.0 | 0.2979 |
| 0.0938 | 2.0 | 502 | 0.4725 | 0.1144 | 0.2400 | 0.2400 | nan | 0.2400 | 0.0 | 0.2287 |
| 0.098 | 3.0 | 753 | 0.5117 | 0.1276 | 0.2748 | 0.2748 | nan | 0.2748 | 0.0 | 0.2552 |
| 0.1018 | 4.0 | 1004 | 0.3870 | 0.1053 | 0.2254 | 0.2254 | nan | 0.2254 | 0.0 | 0.2106 |
| 0.0928 | 5.0 | 1255 | 0.2907 | 0.0772 | 0.1630 | 0.1630 | nan | 0.1630 | 0.0 | 0.1544 |
| 0.0936 | 6.0 | 1506 | 0.5220 | 0.1193 | 0.2544 | 0.2544 | nan | 0.2544 | 0.0 | 0.2385 |
| 0.077 | 7.0 | 1757 | 0.1608 | 0.0617 | 0.1308 | 0.1308 | nan | 0.1308 | 0.0 | 0.1235 |
| 0.0963 | 8.0 | 2008 | 0.1756 | 0.0456 | 0.0923 | 0.0923 | nan | 0.0923 | 0.0 | 0.0912 |
| 0.0958 | 9.0 | 2259 | 0.2027 | 0.0862 | 0.1813 | 0.1813 | nan | 0.1813 | 0.0 | 0.1725 |
| 0.0755 | 10.0 | 2510 | 0.2327 | 0.0888 | 0.1832 | 0.1832 | nan | 0.1832 | 0.0 | 0.1776 |
| 0.0632 | 11.0 | 2761 | 0.2169 | 0.0846 | 0.1863 | 0.1863 | nan | 0.1863 | 0.0 | 0.1693 |
| 0.0638 | 12.0 | 3012 | 0.2309 | 0.0852 | 0.1957 | 0.1957 | nan | 0.1957 | 0.0 | 0.1704 |
| 0.0509 | 13.0 | 3263 | 0.3209 | 0.1236 | 0.2910 | 0.2910 | nan | 0.2910 | 0.0 | 0.2472 |
| 0.0497 | 14.0 | 3514 | 0.3274 | 0.1045 | 0.2354 | 0.2354 | nan | 0.2354 | 0.0 | 0.2089 |
| 0.0396 | 15.0 | 3765 | 0.3415 | 0.1005 | 0.2257 | 0.2257 | nan | 0.2257 | 0.0 | 0.2010 |
| 0.0373 | 16.0 | 4016 | 0.3530 | 0.1122 | 0.2486 | 0.2486 | nan | 0.2486 | 0.0 | 0.2244 |
| 0.0388 | 17.0 | 4267 | 0.3312 | 0.0889 | 0.1974 | 0.1974 | nan | 0.1974 | 0.0 | 0.1778 |
| 0.0346 | 18.0 | 4518 | 0.3061 | 0.0903 | 0.2125 | 0.2125 | nan | 0.2125 | 0.0 | 0.1807 |
| 0.0296 | 19.0 | 4769 | 0.3223 | 0.1000 | 0.2315 | 0.2315 | nan | 0.2315 | 0.0 | 0.2000 |
| 0.0311 | 20.0 | 5020 | 0.3458 | 0.0943 | 0.2237 | 0.2237 | nan | 0.2237 | 0.0 | 0.1887 |
| 0.0303 | 21.0 | 5271 | 0.3283 | 0.0975 | 0.2255 | 0.2255 | nan | 0.2255 | 0.0 | 0.1951 |
| 0.0249 | 22.0 | 5522 | 0.3387 | 0.0998 | 0.2327 | 0.2327 | nan | 0.2327 | 0.0 | 0.1996 |
| 0.0298 | 23.0 | 5773 | 0.3332 | 0.0973 | 0.2242 | 0.2242 | nan | 0.2242 | 0.0 | 0.1946 |
| 0.0239 | 24.0 | 6024 | 0.3778 | 0.1146 | 0.2634 | 0.2634 | nan | 0.2634 | 0.0 | 0.2292 |
| 0.0238 | 25.0 | 6275 | 0.3250 | 0.0909 | 0.2081 | 0.2081 | nan | 0.2081 | 0.0 | 0.1818 |
| 0.0242 | 26.0 | 6526 | 0.3826 | 0.1002 | 0.2285 | 0.2285 | nan | 0.2285 | 0.0 | 0.2004 |
| 0.017 | 27.0 | 6777 | 0.3543 | 0.1058 | 0.2367 | 0.2367 | nan | 0.2367 | 0.0 | 0.2115 |
| 0.0241 | 28.0 | 7028 | 0.3491 | 0.0915 | 0.2069 | 0.2069 | nan | 0.2069 | 0.0 | 0.1830 |
| 0.0203 | 29.0 | 7279 | 0.3354 | 0.0899 | 0.2056 | 0.2056 | nan | 0.2056 | 0.0 | 0.1798 |
| 0.0206 | 30.0 | 7530 | 0.3592 | 0.0944 | 0.2165 | 0.2165 | nan | 0.2165 | 0.0 | 0.1888 |
| 0.0211 | 31.0 | 7781 | 0.3200 | 0.0943 | 0.2100 | 0.2100 | nan | 0.2100 | 0.0 | 0.1886 |
| 0.0209 | 32.0 | 8032 | 0.3401 | 0.0850 | 0.1941 | 0.1941 | nan | 0.1941 | 0.0 | 0.1701 |
| 0.0172 | 33.0 | 8283 | 0.3326 | 0.0879 | 0.1986 | 0.1986 | nan | 0.1986 | 0.0 | 0.1759 |
| 0.0187 | 34.0 | 8534 | 0.3343 | 0.0869 | 0.1960 | 0.1960 | nan | 0.1960 | 0.0 | 0.1739 |
| 0.0181 | 35.0 | 8785 | 0.3223 | 0.0824 | 0.1835 | 0.1835 | nan | 0.1835 | 0.0 | 0.1648 |
| 0.0168 | 36.0 | 9036 | 0.3461 | 0.0864 | 0.1933 | 0.1933 | nan | 0.1933 | 0.0 | 0.1727 |
| 0.0169 | 37.0 | 9287 | 0.3438 | 0.0848 | 0.1888 | 0.1888 | nan | 0.1888 | 0.0 | 0.1695 |
| 0.0182 | 38.0 | 9538 | 0.3506 | 0.0865 | 0.1933 | 0.1933 | nan | 0.1933 | 0.0 | 0.1730 |
| 0.0167 | 39.0 | 9789 | 0.3535 | 0.0869 | 0.1946 | 0.1946 | nan | 0.1946 | 0.0 | 0.1739 |
| 0.0174 | 39.84 | 10000 | 0.3478 | 0.0862 | 0.1924 | 0.1924 | nan | 0.1924 | 0.0 | 0.1723 |

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cpu
- Datasets 2.14.4
- Tokenizers 0.13.3
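
For quick qualitative checks, a minimal inference sketch is shown below. The checkpoint identifier `rishitunu/ecc_segformerv2` and the input file `crack.jpg` are placeholders, and it is assumed that the image processor configuration was saved alongside the fine-tuned weights.

```python
# Placeholder identifiers: replace "rishitunu/ecc_segformerv2" with the actual
# location of the fine-tuned weights and "crack.jpg" with your own image.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

checkpoint = "rishitunu/ecc_segformerv2"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("crack.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# SegFormer logits come out at 1/4 of the input resolution, so upsample
# before taking the per-pixel argmax.
logits = torch.nn.functional.interpolate(
    outputs.logits,
    size=image.size[::-1],  # (height, width)
    mode="bilinear",
    align_corners=False,
)
pred_mask = logits.argmax(dim=1)[0]  # class 1 is presumably "crack", per the metric names above
```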