---
license: other
base_model: nvidia/mit-b5
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b5-miic-tl
  results: []
---

# segformer-b5-miic-tl

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the yijisuk/ic-chip-sample dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2247
- Mean Iou: 0.4565
- Mean Accuracy: 0.9129
- Overall Accuracy: 0.9129
- Accuracy Unlabeled: nan
- Accuracy Circuit: 0.9129
- Iou Unlabeled: 0.0
- Iou Circuit: 0.9129
- Dice Coefficient: 0.8406

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit | Dice Coefficient |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|:----------------:|
| 0.2801        | 3.12  | 250  | 0.2305          | 0.4832   | 0.9663        | 0.9663           | nan                | 0.9663           | 0.0           | 0.9663      | 0.8527           |
| 0.2785        | 6.25  | 500  | 0.2715          | 0.4800   | 0.9601        | 0.9601           | nan                | 0.9601           | 0.0           | 0.9601      | 0.8511           |
| 0.208         | 9.38  | 750  | 0.2681          | 0.4811   | 0.9622        | 0.9622           | nan                | 0.9622           | 0.0           | 0.9622      | 0.8538           |
| 0.2042        | 12.5  | 1000 | 0.2959          | 0.4650   | 0.9299        | 0.9299           | nan                | 0.9299           | 0.0           | 0.9299      | 0.7879           |
| 0.1649        | 15.62 | 1250 | 0.2407          | 0.4340   | 0.8679        | 0.8679           | nan                | 0.8679           | 0.0           | 0.8679      | 0.8150           |
| 0.1353        | 18.75 | 1500 | 0.2530          | 0.4543   | 0.9085        | 0.9085           | nan                | 0.9085           | 0.0           | 0.9085      | 0.8336           |
| 0.126         | 21.88 | 1750 | 0.4934          | 0.4559   | 0.9119        | 0.9119           | nan                | 0.9119           | 0.0           | 0.9119      | 0.7678           |
| 0.1196        | 25.0  | 2000 | 0.2896          | 0.4604   | 0.9209        | 0.9209           | nan                | 0.9209           | 0.0           | 0.9209      | 0.7807           |
| 0.1149        | 28.12 | 2250 | 0.2210          | 0.4634   | 0.9268        | 0.9268           | nan                | 0.9268           | 0.0           | 0.9268      | 0.8470           |
| 0.1095        | 31.25 | 2500 | 0.2215          | 0.4534   | 0.9067        | 0.9067           | nan                | 0.9067           | 0.0           | 0.9067      | 0.8380           |
| 0.109         | 34.38 | 2750 | 0.2256          | 0.4243   | 0.8487        | 0.8487           | nan                | 0.8487           | 0.0           | 0.8487      | 0.8077           |
| 0.1062        | 37.5  | 3000 | 0.2172          | 0.4497   | 0.8994        | 0.8994           | nan                | 0.8994           | 0.0           | 0.8994      | 0.8363           |
| 0.1046        | 40.62 | 3250 | 0.2401          | 0.4551   | 0.9102        | 0.9102           | nan                | 0.9102           | 0.0           | 0.9102      | 0.8387           |
| 0.1096        | 43.75 | 3500 | 0.2157          | 0.4582   | 0.9164        | 0.9164           | nan                | 0.9164           | 0.0           | 0.9164      | 0.8425           |
| 0.1014        | 46.88 | 3750 | 0.2344          | 0.4573   | 0.9146        | 0.9146           | nan                | 0.9146           | 0.0           | 0.9146      | 0.8411           |
| 0.1036        | 50.0  | 4000 | 0.2247          | 0.4565   | 0.9129        | 0.9129           | nan                | 0.9129           | 0.0           | 0.9129      | 0.8406           |

### Framework versions

- Transformers 4.36.2
- Pytorch 1.11.0+cu115
- Datasets 2.15.0
- Tokenizers 0.15.0
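
### Inference example (sketch)

The snippet below is a minimal inference sketch, not part of the original card. It shows the usual way a SegFormer checkpoint like this one is loaded with the 🤗 Transformers `SegformerImageProcessor` and `SegformerForSemanticSegmentation` classes. The repo id `yijisuk/segformer-b5-miic-tl` and the image path are placeholders; substitute the actual checkpoint location and an IC-chip image.

```python
from PIL import Image
import torch
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholder repo id; replace with the actual location of this checkpoint.
checkpoint = "yijisuk/segformer-b5-miic-tl"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

# Placeholder image path; any RGB IC-chip image works here.
image = Image.open("chip_sample.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```

The logits come out at a quarter of the input resolution, so the bilinear upsampling step is needed before taking the argmax to obtain a full-resolution segmentation mask.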