yijisuk committed on
Commit
e2428a7
1 Parent(s): 59b5dbb

End of training

README.md CHANGED
@@ -17,14 +17,15 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the yijisuk/ic-chip-sample dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.1940
- - Mean Iou: 0.4787
- - Mean Accuracy: 0.9574
- - Overall Accuracy: 0.9574
  - Accuracy Unlabeled: nan
- - Accuracy Circuit: 0.9574
  - Iou Unlabeled: 0.0
- - Iou Circuit: 0.9574
 
 
  ## Model description
 
@@ -53,58 +54,48 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|
- | 0.8077 | 1.0 | 20 | 0.5839 | 0.2821 | 0.5643 | 0.5643 | nan | 0.5643 | 0.0 | 0.5643 |
- | 0.684 | 2.0 | 40 | 0.4740 | 0.4665 | 0.9330 | 0.9330 | nan | 0.9330 | 0.0 | 0.9330 |
- | 0.3467 | 3.0 | 60 | 0.3114 | 0.4051 | 0.8102 | 0.8102 | nan | 0.8102 | 0.0 | 0.8102 |
- | 0.239 | 4.0 | 80 | 0.3108 | 0.4844 | 0.9687 | 0.9687 | nan | 0.9687 | 0.0 | 0.9687 |
- | 0.1678 | 5.0 | 100 | 0.2848 | 0.4748 | 0.9497 | 0.9497 | nan | 0.9497 | 0.0 | 0.9497 |
- | 0.9866 | 6.0 | 120 | 0.2794 | 0.4705 | 0.9409 | 0.9409 | nan | 0.9409 | 0.0 | 0.9409 |
- | 0.1223 | 7.0 | 140 | 0.2850 | 0.4179 | 0.8357 | 0.8357 | nan | 0.8357 | 0.0 | 0.8357 |
- | 0.3873 | 8.0 | 160 | 0.2800 | 0.4667 | 0.9335 | 0.9335 | nan | 0.9335 | 0.0 | 0.9335 |
- | 0.1514 | 9.0 | 180 | 0.2811 | 0.4535 | 0.9070 | 0.9070 | nan | 0.9070 | 0.0 | 0.9070 |
- | 0.3953 | 10.0 | 200 | 0.2771 | 0.4598 | 0.9196 | 0.9196 | nan | 0.9196 | 0.0 | 0.9196 |
- | 0.2172 | 11.0 | 220 | 0.2631 | 0.4474 | 0.8948 | 0.8948 | nan | 0.8948 | 0.0 | 0.8948 |
- | 0.1756 | 12.0 | 240 | 0.2316 | 0.4846 | 0.9691 | 0.9691 | nan | 0.9691 | 0.0 | 0.9691 |
- | 0.4101 | 13.0 | 260 | 0.2348 | 0.4817 | 0.9635 | 0.9635 | nan | 0.9635 | 0.0 | 0.9635 |
- | 0.2755 | 14.0 | 280 | 0.2662 | 0.4819 | 0.9639 | 0.9639 | nan | 0.9639 | 0.0 | 0.9639 |
- | 0.1457 | 15.0 | 300 | 0.2304 | 0.4678 | 0.9355 | 0.9355 | nan | 0.9355 | 0.0 | 0.9355 |
- | 0.1103 | 16.0 | 320 | 0.2102 | 0.4577 | 0.9153 | 0.9153 | nan | 0.9153 | 0.0 | 0.9153 |
- | 1.3035 | 17.0 | 340 | 0.2057 | 0.4694 | 0.9387 | 0.9387 | nan | 0.9387 | 0.0 | 0.9387 |
- | 0.2107 | 18.0 | 360 | 0.2025 | 0.4805 | 0.9610 | 0.9610 | nan | 0.9610 | 0.0 | 0.9610 |
- | 0.0762 | 19.0 | 380 | 0.2112 | 0.4870 | 0.9740 | 0.9740 | nan | 0.9740 | 0.0 | 0.9740 |
- | 1.0615 | 20.0 | 400 | 0.2027 | 0.4678 | 0.9355 | 0.9355 | nan | 0.9355 | 0.0 | 0.9355 |
- | 0.0695 | 21.0 | 420 | 0.2110 | 0.4716 | 0.9432 | 0.9432 | nan | 0.9432 | 0.0 | 0.9432 |
- | 0.182 | 22.0 | 440 | 0.2218 | 0.4834 | 0.9668 | 0.9668 | nan | 0.9668 | 0.0 | 0.9668 |
- | 0.1604 | 23.0 | 460 | 0.2026 | 0.4766 | 0.9532 | 0.9532 | nan | 0.9532 | 0.0 | 0.9532 |
- | 0.2332 | 24.0 | 480 | 0.2009 | 0.4763 | 0.9526 | 0.9526 | nan | 0.9526 | 0.0 | 0.9526 |
- | 1.0035 | 25.0 | 500 | 0.2065 | 0.4846 | 0.9692 | 0.9692 | nan | 0.9692 | 0.0 | 0.9692 |
- | 0.128 | 26.0 | 520 | 0.2007 | 0.4802 | 0.9603 | 0.9603 | nan | 0.9603 | 0.0 | 0.9603 |
- | 0.2214 | 27.0 | 540 | 0.1976 | 0.4818 | 0.9636 | 0.9636 | nan | 0.9636 | 0.0 | 0.9636 |
- | 0.1785 | 28.0 | 560 | 0.1963 | 0.4724 | 0.9447 | 0.9447 | nan | 0.9447 | 0.0 | 0.9447 |
- | 0.2044 | 29.0 | 580 | 0.2010 | 0.4831 | 0.9663 | 0.9663 | nan | 0.9663 | 0.0 | 0.9663 |
- | 0.1978 | 30.0 | 600 | 0.2000 | 0.4825 | 0.9650 | 0.9650 | nan | 0.9650 | 0.0 | 0.9650 |
- | 0.1721 | 31.0 | 620 | 0.1995 | 0.4827 | 0.9655 | 0.9655 | nan | 0.9655 | 0.0 | 0.9655 |
- | 0.1347 | 32.0 | 640 | 0.1930 | 0.4764 | 0.9528 | 0.9528 | nan | 0.9528 | 0.0 | 0.9528 |
- | 0.2605 | 33.0 | 660 | 0.1947 | 0.4787 | 0.9575 | 0.9575 | nan | 0.9575 | 0.0 | 0.9575 |
- | 1.2252 | 34.0 | 680 | 0.1956 | 0.4758 | 0.9516 | 0.9516 | nan | 0.9516 | 0.0 | 0.9516 |
- | 0.1057 | 35.0 | 700 | 0.1949 | 0.4827 | 0.9654 | 0.9654 | nan | 0.9654 | 0.0 | 0.9654 |
- | 0.0855 | 36.0 | 720 | 0.1964 | 0.4826 | 0.9652 | 0.9652 | nan | 0.9652 | 0.0 | 0.9652 |
- | 0.4844 | 37.0 | 740 | 0.1939 | 0.4712 | 0.9424 | 0.9424 | nan | 0.9424 | 0.0 | 0.9424 |
- | 0.1405 | 38.0 | 760 | 0.1926 | 0.4725 | 0.9450 | 0.9450 | nan | 0.9450 | 0.0 | 0.9450 |
- | 0.2812 | 39.0 | 780 | 0.1940 | 0.4792 | 0.9584 | 0.9584 | nan | 0.9584 | 0.0 | 0.9584 |
- | 0.977 | 40.0 | 800 | 0.1966 | 0.4766 | 0.9532 | 0.9532 | nan | 0.9532 | 0.0 | 0.9532 |
- | 0.1408 | 41.0 | 820 | 0.1964 | 0.4806 | 0.9612 | 0.9612 | nan | 0.9612 | 0.0 | 0.9612 |
- | 0.0365 | 42.0 | 840 | 0.1914 | 0.4731 | 0.9463 | 0.9463 | nan | 0.9463 | 0.0 | 0.9463 |
- | 0.0654 | 43.0 | 860 | 0.1934 | 0.4742 | 0.9484 | 0.9484 | nan | 0.9484 | 0.0 | 0.9484 |
- | 0.0635 | 44.0 | 880 | 0.1940 | 0.4791 | 0.9582 | 0.9582 | nan | 0.9582 | 0.0 | 0.9582 |
- | 0.0344 | 45.0 | 900 | 0.1921 | 0.4739 | 0.9478 | 0.9478 | nan | 0.9478 | 0.0 | 0.9478 |
- | 0.0973 | 46.0 | 920 | 0.1930 | 0.4773 | 0.9546 | 0.9546 | nan | 0.9546 | 0.0 | 0.9546 |
- | 0.1797 | 47.0 | 940 | 0.1928 | 0.4795 | 0.9590 | 0.9590 | nan | 0.9590 | 0.0 | 0.9590 |
- | 0.1562 | 48.0 | 960 | 0.1931 | 0.4802 | 0.9604 | 0.9604 | nan | 0.9604 | 0.0 | 0.9604 |
- | 0.0819 | 49.0 | 980 | 0.1934 | 0.4786 | 0.9572 | 0.9572 | nan | 0.9572 | 0.0 | 0.9572 |
- | 0.0518 | 50.0 | 1000 | 0.1940 | 0.4787 | 0.9574 | 0.9574 | nan | 0.9574 | 0.0 | 0.9574 |
 
 
  ### Framework versions
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the yijisuk/ic-chip-sample dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.2252
+ - Mean Iou: 0.4213
+ - Mean Accuracy: 0.8427
+ - Overall Accuracy: 0.8427
  - Accuracy Unlabeled: nan
+ - Accuracy Circuit: 0.8427
  - Iou Unlabeled: 0.0
+ - Iou Circuit: 0.8427
+ - Dice Coefficient: 0.8060
 
  ## Model description
 
 
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit | Dice Coefficient |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|:----------------:|
+ | 0.564 | 1.25 | 100 | 0.3976 | 0.2158 | 0.4316 | 0.4316 | nan | 0.4316 | 0.0 | 0.4316 | 0.3032 |
+ | 0.523 | 2.5 | 200 | 0.3853 | 0.2051 | 0.4102 | 0.4102 | nan | 0.4102 | 0.0 | 0.4102 | 0.2797 |
+ | 0.5447 | 3.75 | 300 | 0.3570 | 0.1866 | 0.3731 | 0.3731 | nan | 0.3731 | 0.0 | 0.3731 | 0.2145 |
+ | 0.5087 | 5.0 | 400 | 0.3325 | 0.2632 | 0.5264 | 0.5264 | nan | 0.5264 | 0.0 | 0.5264 | 0.4352 |
+ | 0.5064 | 6.25 | 500 | 0.3596 | 0.3047 | 0.6094 | 0.6094 | nan | 0.6094 | 0.0 | 0.6094 | 0.5244 |
+ | 0.4947 | 7.5 | 600 | 0.3153 | 0.3062 | 0.6124 | 0.6124 | nan | 0.6124 | 0.0 | 0.6124 | 0.5797 |
+ | 0.4703 | 8.75 | 700 | 0.2752 | 0.4433 | 0.8866 | 0.8866 | nan | 0.8866 | 0.0 | 0.8866 | 0.8004 |
+ | 0.4679 | 10.0 | 800 | 0.2900 | 0.3833 | 0.7666 | 0.7666 | nan | 0.7666 | 0.0 | 0.7666 | 0.7333 |
+ | 0.4691 | 11.25 | 900 | 0.3102 | 0.4024 | 0.8048 | 0.8048 | nan | 0.8048 | 0.0 | 0.8048 | 0.7452 |
+ | 0.4648 | 12.5 | 1000 | 0.2768 | 0.3698 | 0.7396 | 0.7396 | nan | 0.7396 | 0.0 | 0.7396 | 0.7157 |
+ | 0.4459 | 13.75 | 1100 | 0.2575 | 0.4120 | 0.8239 | 0.8239 | nan | 0.8239 | 0.0 | 0.8239 | 0.7781 |
+ | 0.446 | 15.0 | 1200 | 0.2927 | 0.4653 | 0.9306 | 0.9306 | nan | 0.9306 | 0.0 | 0.9306 | 0.8262 |
+ | 0.4299 | 16.25 | 1300 | 0.2682 | 0.3375 | 0.6749 | 0.6749 | nan | 0.6749 | 0.0 | 0.6749 | 0.6881 |
+ | 0.4464 | 17.5 | 1400 | 0.2379 | 0.4282 | 0.8563 | 0.8563 | nan | 0.8563 | 0.0 | 0.8563 | 0.8051 |
+ | 0.4241 | 18.75 | 1500 | 0.2479 | 0.3996 | 0.7993 | 0.7993 | nan | 0.7993 | 0.0 | 0.7993 | 0.7770 |
+ | 0.4154 | 20.0 | 1600 | 0.2441 | 0.4133 | 0.8265 | 0.8265 | nan | 0.8265 | 0.0 | 0.8265 | 0.7923 |
+ | 0.428 | 21.25 | 1700 | 0.2505 | 0.4258 | 0.8515 | 0.8515 | nan | 0.8515 | 0.0 | 0.8515 | 0.8082 |
+ | 0.4126 | 22.5 | 1800 | 0.2419 | 0.4549 | 0.9097 | 0.9097 | nan | 0.9097 | 0.0 | 0.9097 | 0.8370 |
+ | 0.3986 | 23.75 | 1900 | 0.2364 | 0.3863 | 0.7726 | 0.7726 | nan | 0.7726 | 0.0 | 0.7726 | 0.7577 |
+ | 0.4053 | 25.0 | 2000 | 0.2419 | 0.3752 | 0.7504 | 0.7504 | nan | 0.7504 | 0.0 | 0.7504 | 0.7367 |
+ | 0.4018 | 26.25 | 2100 | 0.2310 | 0.4299 | 0.8598 | 0.8598 | nan | 0.8598 | 0.0 | 0.8598 | 0.8078 |
+ | 0.4048 | 27.5 | 2200 | 0.2292 | 0.4288 | 0.8577 | 0.8577 | nan | 0.8577 | 0.0 | 0.8577 | 0.8095 |
+ | 0.3838 | 28.75 | 2300 | 0.2294 | 0.4185 | 0.8371 | 0.8371 | nan | 0.8371 | 0.0 | 0.8371 | 0.7979 |
+ | 0.389 | 30.0 | 2400 | 0.2255 | 0.4337 | 0.8675 | 0.8675 | nan | 0.8675 | 0.0 | 0.8675 | 0.8181 |
+ | 0.3889 | 31.25 | 2500 | 0.2247 | 0.4307 | 0.8613 | 0.8613 | nan | 0.8613 | 0.0 | 0.8613 | 0.8180 |
+ | 0.4058 | 32.5 | 2600 | 0.2290 | 0.3806 | 0.7611 | 0.7611 | nan | 0.7611 | 0.0 | 0.7611 | 0.7493 |
+ | 0.3822 | 33.75 | 2700 | 0.2301 | 0.4023 | 0.8046 | 0.8046 | nan | 0.8046 | 0.0 | 0.8046 | 0.7794 |
+ | 0.3807 | 35.0 | 2800 | 0.2261 | 0.3952 | 0.7904 | 0.7904 | nan | 0.7904 | 0.0 | 0.7904 | 0.7691 |
+ | 0.3993 | 36.25 | 2900 | 0.2199 | 0.4163 | 0.8326 | 0.8326 | nan | 0.8326 | 0.0 | 0.8326 | 0.7997 |
+ | 0.3586 | 37.5 | 3000 | 0.2238 | 0.4098 | 0.8195 | 0.8195 | nan | 0.8195 | 0.0 | 0.8195 | 0.7897 |
+ | 0.3894 | 38.75 | 3100 | 0.2334 | 0.3539 | 0.7077 | 0.7077 | nan | 0.7077 | 0.0 | 0.7077 | 0.7093 |
+ | 0.3627 | 40.0 | 3200 | 0.2311 | 0.3646 | 0.7292 | 0.7292 | nan | 0.7292 | 0.0 | 0.7292 | 0.7249 |
+ | 0.3704 | 41.25 | 3300 | 0.2266 | 0.3876 | 0.7751 | 0.7751 | nan | 0.7751 | 0.0 | 0.7751 | 0.7621 |
+ | 0.3808 | 42.5 | 3400 | 0.2227 | 0.3996 | 0.7993 | 0.7993 | nan | 0.7993 | 0.0 | 0.7993 | 0.7793 |
+ | 0.3631 | 43.75 | 3500 | 0.2222 | 0.3910 | 0.7820 | 0.7820 | nan | 0.7820 | 0.0 | 0.7820 | 0.7655 |
+ | 0.367 | 45.0 | 3600 | 0.2253 | 0.4118 | 0.8237 | 0.8237 | nan | 0.8237 | 0.0 | 0.8237 | 0.7939 |
+ | 0.3609 | 46.25 | 3700 | 0.2225 | 0.4082 | 0.8165 | 0.8165 | nan | 0.8165 | 0.0 | 0.8165 | 0.7897 |
+ | 0.3515 | 47.5 | 3800 | 0.2226 | 0.4210 | 0.8420 | 0.8420 | nan | 0.8420 | 0.0 | 0.8420 | 0.8064 |
+ | 0.3888 | 48.75 | 3900 | 0.2283 | 0.3815 | 0.7630 | 0.7630 | nan | 0.7630 | 0.0 | 0.7630 | 0.7509 |
+ | 0.3503 | 50.0 | 4000 | 0.2252 | 0.4213 | 0.8427 | 0.8427 | nan | 0.8427 | 0.0 | 0.8427 | 0.8060 |
 
 
  ### Framework versions
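The updated card adds a Dice Coefficient column alongside the IoU metrics. For binary segmentation like this circuit/unlabeled setup, both follow directly from the overlap of the predicted and ground-truth masks; note also that with Iou Unlabeled pinned at 0.0, the reported Mean Iou is simply Iou Circuit / 2 (0.8427 / 2 ≈ 0.4213). A minimal sketch of the two metrics, using toy NumPy masks rather than the actual evaluation data:

```python
import numpy as np

def iou_and_dice(pred: np.ndarray, target: np.ndarray):
    """IoU and Dice coefficient for a pair of binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    total = pred.sum() + target.sum()
    iou = inter / union if union else float("nan")
    dice = 2 * inter / total if total else float("nan")
    return iou, dice

# Toy masks: prediction matches 3 of 4 target pixels and adds 1 false positive,
# so intersection = 3, union = 5, mask sizes 4 + 4 = 8.
pred = np.array([[1, 1, 0], [1, 1, 0]])
target = np.array([[1, 1, 0], [1, 0, 1]])
iou, dice = iou_and_dice(pred, target)  # iou = 3/5 = 0.6, dice = 6/8 = 0.75
```

Dice weights the intersection twice, so it is always at least as large as IoU (0.8060 vs. 0.8427 per-class IoU above is consistent with that ordering when computed over the whole dataset).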
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:392f9f21cb116956c28d2b4f15a3f6d4dd2661dfc20ecd64182361dc5001ac3b
+ oid sha256:e983fd62f81c11facb080d4db37b737dc8d507778fef1089fa324359f9bd0d40
  size 14884776
runs/Jun24_20-00-51_Centauri/events.out.tfevents.1719226862.Centauri.28756.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:51fba25eeff37617debfc1b31c67c934538213fb8ff127ba6f2dc785f5148de0
+ size 40855
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:6156365ce941a0c2f117b84455af1157f692341051ecf12f5e076a6c92a21170
+ oid sha256:d501eefa41ee838fc210abd28bb078b5378eae044846c656caf297d973b98968
  size 4271