---
license: apache-2.0
base_model: microsoft/beit-base-finetuned-ade-640-640
tags:
  - generated_from_trainer
model-index:
  - name: BEiT_beit-base-finetuned-ade-640-640_Clean-Set3_RGB
    results: []
pipeline_tag: image-segmentation
---

# BEiT_beit-base-finetuned-ade-640-640_Clean-Set3_RGB

This model is a fine-tuned version of microsoft/beit-base-finetuned-ade-640-640 on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 0.0216
- Loss: 0.0336
- Mean Iou: 0.9671
- Mean Accuracy: 0.9806
- Overall Accuracy: 0.9926
- Accuracy Background: 0.9956
- Accuracy Melt: 0.9505
- Accuracy Substrate: 0.9957
- Iou Background: 0.9916
- Iou Melt: 0.9208
- Iou Substrate: 0.9888
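
As a minimal sketch (not the evaluation code used for this card), the per-class IoU and accuracy figures above can be computed from predicted and ground-truth label maps as follows. The class ids here are hypothetical (0 = background, 1 = melt, 2 = substrate):

```python
import numpy as np

def segmentation_metrics(pred, target, num_classes):
    """Per-class IoU, per-class accuracy, and overall pixel accuracy."""
    ious, accs = [], []
    for c in range(num_classes):
        pred_c = pred == c
        target_c = target == c
        intersection = np.logical_and(pred_c, target_c).sum()
        union = np.logical_or(pred_c, target_c).sum()
        # IoU for class c: |pred ∩ target| / |pred ∪ target|
        ious.append(intersection / union if union else float("nan"))
        # Accuracy for class c: fraction of true class-c pixels predicted as c
        accs.append(intersection / target_c.sum() if target_c.sum() else float("nan"))
    overall = (pred == target).mean()
    return ious, accs, overall

# Toy 3x3 label maps for illustration (one mislabeled pixel at row 1, col 1).
pred   = np.array([[0, 0, 1], [2, 1, 1], [2, 2, 0]])
target = np.array([[0, 0, 1], [2, 2, 1], [2, 2, 0]])
ious, accs, overall = segmentation_metrics(pred, target, num_classes=3)
```

Mean Iou and Mean Accuracy in the table are then simple averages of the per-class values.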

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 200
- num_epochs: 50
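
The schedule these hyperparameters describe can be sketched as follows: linear warmup over the first 200 optimizer steps, then cosine decay from the peak rate of 2e-05. This is an illustrative reimplementation, not the trainer's code, and the total step count is a hypothetical placeholder (the card does not state it):

```python
import math

def cosine_lr(step, base_lr=2e-05, warmup_steps=200, total_steps=2650):
    """Learning rate at a given optimizer step: linear warmup, then cosine decay.

    total_steps is a hypothetical value for illustration only.
    """
    if step < warmup_steps:
        # Linear warmup from 0 to base_lr over warmup_steps.
        return base_lr * step / warmup_steps
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

In the Transformers ecosystem this behavior corresponds to `lr_scheduler_type: cosine` with `warmup_steps=200`.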

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:|
| 0.4042        | 0.9434  | 50   | 0.3272          | 0.8363   | 0.8672        | 0.9671           | 0.9931              | 0.6175        | 0.9911             | 0.9836         | 0.5790   | 0.9463        |
| 0.1649        | 1.8868  | 100  | 0.0973          | 0.9371   | 0.9572        | 0.9867           | 0.9959              | 0.8833        | 0.9926             | 0.9881         | 0.8437   | 0.9795        |
| 0.1439        | 2.8302  | 150  | 0.0724          | 0.9495   | 0.9800        | 0.9887           | 0.9946              | 0.9575        | 0.9879             | 0.9898         | 0.8770   | 0.9818        |
| 0.1275        | 3.7736  | 200  | 0.0656          | 0.9443   | 0.9778        | 0.9877           | 0.9969              | 0.9515        | 0.9850             | 0.9903         | 0.8627   | 0.9799        |
| 0.1522        | 4.7170  | 250  | 0.0585          | 0.9567   | 0.9737        | 0.9899           | 0.9971              | 0.9325        | 0.9915             | 0.9887         | 0.8976   | 0.9839        |
| 0.1292        | 5.6604  | 300  | 0.0594          | 0.9502   | 0.9748        | 0.9877           | 0.9934              | 0.9418        | 0.9890             | 0.9857         | 0.8850   | 0.9801        |
| 0.097         | 6.6038  | 350  | 0.0450          | 0.9634   | 0.9775        | 0.9912           | 0.9949              | 0.9432        | 0.9943             | 0.9883         | 0.9154   | 0.9866        |
| 0.1125        | 7.5472  | 400  | 0.0451          | 0.9605   | 0.9757        | 0.9905           | 0.9953              | 0.9384        | 0.9934             | 0.9877         | 0.9080   | 0.9857        |
| 0.102         | 8.4906  | 450  | 0.0518          | 0.9531   | 0.9798        | 0.9876           | 0.9921              | 0.9596        | 0.9876             | 0.9824         | 0.8960   | 0.9808        |
| 0.0878        | 9.4340  | 500  | 0.0411          | 0.9639   | 0.9820        | 0.9911           | 0.9947              | 0.9592        | 0.9922             | 0.9885         | 0.9172   | 0.9859        |
| 0.1198        | 10.3774 | 550  | 0.0679          | 0.9398   | 0.9655        | 0.9821           | 0.9873              | 0.9237        | 0.9855             | 0.9708         | 0.8768   | 0.9719        |
| 0.055         | 11.3208 | 600  | 0.0521          | 0.9518   | 0.9791        | 0.9867           | 0.9846              | 0.9610        | 0.9917             | 0.9780         | 0.8966   | 0.9810        |
| 0.086         | 12.2642 | 650  | 0.0402          | 0.9631   | 0.9791        | 0.9903           | 0.9920              | 0.9514        | 0.9940             | 0.9861         | 0.9185   | 0.9848        |
| 0.058         | 13.2075 | 700  | 0.0455          | 0.9590   | 0.9768        | 0.9892           | 0.9908              | 0.9463        | 0.9934             | 0.9837         | 0.9096   | 0.9836        |
| 0.0494        | 14.1509 | 750  | 0.0441          | 0.9588   | 0.9796        | 0.9895           | 0.9926              | 0.9547        | 0.9914             | 0.9842         | 0.9076   | 0.9846        |
| 0.0599        | 15.0943 | 800  | 0.0401          | 0.9622   | 0.9787        | 0.9904           | 0.9925              | 0.9496        | 0.9939             | 0.9865         | 0.9149   | 0.9851        |
| 0.0422        | 16.0377 | 850  | 0.0393          | 0.9619   | 0.9807        | 0.9906           | 0.9946              | 0.9556        | 0.9919             | 0.9880         | 0.9123   | 0.9853        |
| 0.0454        | 16.9811 | 900  | 0.0429          | 0.9579   | 0.9742        | 0.9897           | 0.9918              | 0.9360        | 0.9948             | 0.9857         | 0.9033   | 0.9846        |
| 0.0806        | 17.9245 | 950  | 0.0377          | 0.9640   | 0.9779        | 0.9915           | 0.9928              | 0.9445        | 0.9964             | 0.9892         | 0.9157   | 0.9869        |
| 0.0677        | 18.8679 | 1000 | 0.0380          | 0.9602   | 0.9797        | 0.9910           | 0.9941              | 0.9513        | 0.9937             | 0.9882         | 0.9047   | 0.9877        |
| 0.036         | 19.8113 | 1050 | 0.0388          | 0.9618   | 0.9799        | 0.9906           | 0.9942              | 0.9529        | 0.9925             | 0.9868         | 0.9127   | 0.9860        |
| 0.0424        | 20.7547 | 1100 | 0.0375          | 0.9601   | 0.9753        | 0.9905           | 0.9934              | 0.9376        | 0.9949             | 0.9868         | 0.9071   | 0.9863        |
| 0.0274        | 21.6981 | 1150 | 0.0322          | 0.9675   | 0.9795        | 0.9927           | 0.9955              | 0.9464        | 0.9965             | 0.9917         | 0.9218   | 0.9890        |
| 0.0622        | 22.6415 | 1200 | 0.0360          | 0.9648   | 0.9798        | 0.9913           | 0.9932              | 0.9512        | 0.9949             | 0.9881         | 0.9197   | 0.9868        |
| 0.0296        | 23.5849 | 1250 | 0.0334          | 0.9670   | 0.9823        | 0.9925           | 0.9953              | 0.9567        | 0.9950             | 0.9917         | 0.9207   | 0.9885        |
| 0.0222        | 24.5283 | 1300 | 0.0326          | 0.9674   | 0.9823        | 0.9925           | 0.9948              | 0.9569        | 0.9953             | 0.9912         | 0.9222   | 0.9887        |
| 0.0719        | 25.4717 | 1350 | 0.0328          | 0.9671   | 0.9832        | 0.9923           | 0.9945              | 0.9603        | 0.9947             | 0.9907         | 0.9223   | 0.9883        |
| 0.0197        | 26.4151 | 1400 | 0.0311          | 0.9681   | 0.9817        | 0.9929           | 0.9962              | 0.9537        | 0.9954             | 0.9922         | 0.9230   | 0.9893        |
| 0.0223        | 27.3585 | 1450 | 0.0324          | 0.9664   | 0.9811        | 0.9925           | 0.9956              | 0.9527        | 0.9950             | 0.9916         | 0.9191   | 0.9885        |
| 0.024         | 28.3019 | 1500 | 0.0340          | 0.9657   | 0.9808        | 0.9920           | 0.9950              | 0.9528        | 0.9947             | 0.9902         | 0.9190   | 0.9880        |
| 0.0242        | 29.2453 | 1550 | 0.0325          | 0.9672   | 0.9810        | 0.9926           | 0.9953              | 0.9522        | 0.9957             | 0.9915         | 0.9212   | 0.9888        |
| 0.0371        | 30.1887 | 1600 | 0.0315          | 0.9681   | 0.9826        | 0.9928           | 0.9957              | 0.9569        | 0.9952             | 0.9920         | 0.9232   | 0.9891        |
| 0.0235        | 31.1321 | 1650 | 0.0370          | 0.9632   | 0.9799        | 0.9911           | 0.9937              | 0.9520        | 0.9941             | 0.9880         | 0.9150   | 0.9868        |
| 0.0266        | 32.0755 | 1700 | 0.0335          | 0.9664   | 0.9811        | 0.9925           | 0.9951              | 0.9527        | 0.9954             | 0.9913         | 0.9193   | 0.9887        |
| 0.0216        | 33.0189 | 1750 | 0.0344          | 0.9656   | 0.9800        | 0.9921           | 0.9946              | 0.9497        | 0.9956             | 0.9904         | 0.9182   | 0.9883        |
| 0.0382        | 33.9623 | 1800 | 0.0319          | 0.9680   | 0.9819        | 0.9929           | 0.9954              | 0.9544        | 0.9959             | 0.9922         | 0.9224   | 0.9893        |
| 0.0161        | 34.9057 | 1850 | 0.0336          | 0.9672   | 0.9799        | 0.9927           | 0.9955              | 0.9479        | 0.9963             | 0.9920         | 0.9206   | 0.9890        |
| 0.0216        | 35.8491 | 1900 | 0.0336          | 0.9671   | 0.9806        | 0.9926           | 0.9956              | 0.9505        | 0.9957             | 0.9916         | 0.9208   | 0.9888        |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.0.1+cu117
- Datasets 2.19.2
- Tokenizers 0.19.1