fydhfzh committed on
Commit fe3d1ea
1 Parent(s): ad1f3fb

End of training

README.md CHANGED
@@ -20,12 +20,12 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.5796
- - Accuracy: 0.8464
- - Precision: 0.8583
- - Recall: 0.8464
- - F1: 0.8439
- - Binary: 0.8937
+ - Loss: 0.5994
+ - Accuracy: 0.8908
+ - Precision: 0.9054
+ - Recall: 0.8908
+ - F1: 0.8902
+ - Binary: 0.9252
 
 ## Model description
 
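In both the old and new metric blocks, recall is identical to accuracy at every checkpoint, which is the pattern weighted-average multiclass metrics produce. Below is a minimal sketch of how such numbers are commonly computed with scikit-learn, assuming weighted averaging; the card's non-standard `Binary` column is never defined in this diff, so it is left out.

```python
# A minimal sketch, assuming the card's precision/recall/F1 use weighted
# averaging (suggested by recall exactly matching accuracy). The labels
# below are illustrative only, and the "Binary" metric is omitted because
# its definition is not given anywhere in the card.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = [0, 1, 2, 2, 1]  # illustrative ground-truth class ids
y_pred = [0, 1, 2, 1, 1]  # illustrative predictions

accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted", zero_division=0
)
# With average="weighted", recall always equals accuracy, matching the card.
print(f"accuracy={accuracy:.4f}  precision={precision:.4f}  "
      f"recall={recall:.4f}  f1={f1:.4f}")
```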
@@ -52,6 +52,7 @@ The following hyperparameters were used during training:
 - total_train_batch_size: 128
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 500
 - num_epochs: 30
 - mixed_precision_training: Native AMP
 
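This hunk shows only part of the hyperparameter list, so the sketch below is a hedged reconstruction using `transformers.TrainingArguments`: the per-device batch size and gradient-accumulation split, the learning rate, and the output directory are assumptions (they sit outside the hunk); only the options listed above come from the card.

```python
# A hedged reconstruction of the listed options via transformers'
# TrainingArguments. Only values shown in the hunk above are taken from
# the card; everything marked "assumption" is illustrative.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-finetuned",    # assumption: name not in the card
    per_device_train_batch_size=32,   # assumption: 32 * 4 accumulation
    gradient_accumulation_steps=4,    #   steps = 128 total, as listed
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_steps=500,                 # the newly added warmup steps
    num_train_epochs=30,
    fp16=True,                        # "Native AMP" mixed precision
)
```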
@@ -59,46 +60,73 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Binary |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
- | No log | 0.13 | 50 | 3.8789 | 0.0553 | 0.0144 | 0.0553 | 0.0181 | 0.3219 |
- | No log | 0.27 | 100 | 3.4174 | 0.0904 | 0.0361 | 0.0904 | 0.0326 | 0.3583 |
- | No log | 0.4 | 150 | 3.1570 | 0.1350 | 0.0469 | 0.1350 | 0.0581 | 0.3893 |
- | No log | 0.54 | 200 | 2.9433 | 0.1930 | 0.0910 | 0.1930 | 0.1033 | 0.4305 |
- | No log | 0.67 | 250 | 2.6737 | 0.2132 | 0.1403 | 0.2132 | 0.1321 | 0.4498 |
- | No log | 0.81 | 300 | 2.4275 | 0.3266 | 0.2467 | 0.3266 | 0.2383 | 0.5259 |
- | No log | 0.94 | 350 | 2.1296 | 0.3941 | 0.3415 | 0.3941 | 0.3172 | 0.5744 |
- | 3.1825 | 1.08 | 400 | 2.0224 | 0.4642 | 0.4041 | 0.4642 | 0.3980 | 0.6205 |
- | 3.1825 | 1.21 | 450 | 1.7576 | 0.5304 | 0.4684 | 0.5304 | 0.4669 | 0.6719 |
- | 3.1825 | 1.35 | 500 | 1.5524 | 0.5830 | 0.5418 | 0.5830 | 0.5269 | 0.7086 |
- | 3.1825 | 1.48 | 550 | 1.4485 | 0.6248 | 0.6632 | 0.6248 | 0.5989 | 0.7383 |
- | 3.1825 | 1.62 | 600 | 1.3192 | 0.6383 | 0.6408 | 0.6383 | 0.5987 | 0.7476 |
- | 3.1825 | 1.75 | 650 | 1.1634 | 0.6964 | 0.7105 | 0.6964 | 0.6794 | 0.7877 |
- | 3.1825 | 1.89 | 700 | 1.1105 | 0.7301 | 0.7413 | 0.7301 | 0.7195 | 0.8093 |
- | 1.6698 | 2.02 | 750 | 1.0674 | 0.7314 | 0.7423 | 0.7314 | 0.7180 | 0.8115 |
- | 1.6698 | 2.16 | 800 | 1.0295 | 0.7314 | 0.7376 | 0.7314 | 0.7169 | 0.8128 |
- | 1.6698 | 2.29 | 850 | 0.9788 | 0.7355 | 0.7508 | 0.7355 | 0.7235 | 0.8148 |
- | 1.6698 | 2.43 | 900 | 0.9171 | 0.7503 | 0.7680 | 0.7503 | 0.7409 | 0.8250 |
- | 1.6698 | 2.56 | 950 | 0.7782 | 0.7908 | 0.7975 | 0.7908 | 0.7824 | 0.8533 |
- | 1.6698 | 2.7 | 1000 | 0.8368 | 0.7719 | 0.7872 | 0.7719 | 0.7624 | 0.8428 |
- | 1.6698 | 2.83 | 1050 | 0.8154 | 0.7692 | 0.7821 | 0.7692 | 0.7618 | 0.8393 |
- | 1.6698 | 2.96 | 1100 | 0.7900 | 0.7773 | 0.7894 | 0.7773 | 0.7686 | 0.8451 |
- | 1.0229 | 3.1 | 1150 | 0.7467 | 0.7949 | 0.8151 | 0.7949 | 0.7896 | 0.8582 |
- | 1.0229 | 3.23 | 1200 | 0.8610 | 0.7787 | 0.7916 | 0.7787 | 0.7693 | 0.8467 |
- | 1.0229 | 3.37 | 1250 | 0.7699 | 0.8057 | 0.8184 | 0.8057 | 0.8022 | 0.8655 |
- | 1.0229 | 3.5 | 1300 | 0.7546 | 0.8124 | 0.8338 | 0.8124 | 0.8108 | 0.8699 |
- | 1.0229 | 3.64 | 1350 | 0.7968 | 0.8003 | 0.8238 | 0.8003 | 0.7962 | 0.8626 |
- | 1.0229 | 3.77 | 1400 | 0.6990 | 0.8300 | 0.8471 | 0.8300 | 0.8271 | 0.8829 |
- | 1.0229 | 3.91 | 1450 | 0.7244 | 0.8219 | 0.8423 | 0.8219 | 0.8210 | 0.8750 |
- | 0.7454 | 4.04 | 1500 | 0.7213 | 0.8246 | 0.8426 | 0.8246 | 0.8219 | 0.8779 |
- | 0.7454 | 4.18 | 1550 | 0.8174 | 0.7922 | 0.8085 | 0.7922 | 0.7882 | 0.8549 |
- | 0.7454 | 4.31 | 1600 | 0.7212 | 0.8286 | 0.8475 | 0.8286 | 0.8243 | 0.8802 |
- | 0.7454 | 4.45 | 1650 | 0.6948 | 0.8327 | 0.8487 | 0.8327 | 0.8286 | 0.8830 |
- | 0.7454 | 4.58 | 1700 | 0.7873 | 0.8043 | 0.8237 | 0.8043 | 0.7998 | 0.8652 |
- | 0.7454 | 4.72 | 1750 | 0.7593 | 0.8124 | 0.8409 | 0.8124 | 0.8070 | 0.8709 |
- | 0.7454 | 4.85 | 1800 | 0.7766 | 0.8192 | 0.8362 | 0.8192 | 0.8154 | 0.8746 |
- | 0.7454 | 4.99 | 1850 | 0.7740 | 0.8205 | 0.8347 | 0.8205 | 0.8150 | 0.8767 |
- | 0.6044 | 5.12 | 1900 | 0.7932 | 0.8138 | 0.8279 | 0.8138 | 0.8071 | 0.8718 |
- | 0.6044 | 5.26 | 1950 | 0.8338 | 0.8205 | 0.8458 | 0.8205 | 0.8141 | 0.8756 |
- | 0.6044 | 5.39 | 2000 | 0.7471 | 0.8192 | 0.8341 | 0.8192 | 0.8123 | 0.8741 |
+ | No log | 0.13 | 50 | 4.4211 | 0.0256 | 0.0066 | 0.0256 | 0.0094 | 0.2032 |
+ | No log | 0.27 | 100 | 4.3448 | 0.0499 | 0.0239 | 0.0499 | 0.0181 | 0.2625 |
+ | No log | 0.4 | 150 | 3.9589 | 0.1120 | 0.0582 | 0.1120 | 0.0534 | 0.3738 |
+ | No log | 0.54 | 200 | 3.6032 | 0.1727 | 0.0912 | 0.1727 | 0.0968 | 0.4177 |
+ | No log | 0.67 | 250 | 3.2149 | 0.2321 | 0.1419 | 0.2321 | 0.1452 | 0.4592 |
+ | No log | 0.81 | 300 | 2.8786 | 0.3441 | 0.2742 | 0.3441 | 0.2581 | 0.5387 |
+ | No log | 0.94 | 350 | 2.5253 | 0.4211 | 0.3438 | 0.4211 | 0.3380 | 0.5941 |
+ | 3.7437 | 1.08 | 400 | 2.1778 | 0.4588 | 0.4083 | 0.4588 | 0.3896 | 0.6201 |
+ | 3.7437 | 1.21 | 450 | 1.8620 | 0.5709 | 0.5259 | 0.5709 | 0.5166 | 0.6992 |
+ | 3.7437 | 1.35 | 500 | 1.6172 | 0.5803 | 0.5498 | 0.5803 | 0.5168 | 0.7062 |
+ | 3.7437 | 1.48 | 550 | 1.3691 | 0.6640 | 0.6471 | 0.6640 | 0.6287 | 0.7633 |
+ | 3.7437 | 1.62 | 600 | 1.2425 | 0.6910 | 0.6704 | 0.6910 | 0.6541 | 0.7837 |
+ | 3.7437 | 1.75 | 650 | 1.1155 | 0.7193 | 0.7205 | 0.7193 | 0.6936 | 0.8038 |
+ | 3.7437 | 1.89 | 700 | 0.9569 | 0.7463 | 0.7599 | 0.7463 | 0.7287 | 0.8225 |
+ | 1.7895 | 2.02 | 750 | 0.9260 | 0.7584 | 0.7657 | 0.7584 | 0.7389 | 0.8321 |
+ | 1.7895 | 2.16 | 800 | 0.8667 | 0.7787 | 0.8008 | 0.7787 | 0.7639 | 0.8452 |
+ | 1.7895 | 2.29 | 850 | 0.7438 | 0.8138 | 0.8159 | 0.8138 | 0.8047 | 0.8696 |
+ | 1.7895 | 2.43 | 900 | 0.7958 | 0.8016 | 0.8175 | 0.8016 | 0.7917 | 0.8602 |
+ | 1.7895 | 2.56 | 950 | 0.6627 | 0.8327 | 0.8449 | 0.8327 | 0.8296 | 0.8829 |
+ | 1.7895 | 2.7 | 1000 | 0.7242 | 0.7976 | 0.8152 | 0.7976 | 0.7882 | 0.8592 |
+ | 1.7895 | 2.83 | 1050 | 0.6745 | 0.8165 | 0.8337 | 0.8165 | 0.8123 | 0.8719 |
+ | 1.7895 | 2.96 | 1100 | 0.6795 | 0.8192 | 0.8388 | 0.8192 | 0.8158 | 0.8761 |
+ | 1.0205 | 3.1 | 1150 | 0.6546 | 0.8354 | 0.8575 | 0.8354 | 0.8319 | 0.8835 |
+ | 1.0205 | 3.23 | 1200 | 0.6165 | 0.8394 | 0.8489 | 0.8394 | 0.8365 | 0.8868 |
+ | 1.0205 | 3.37 | 1250 | 0.7041 | 0.8232 | 0.8490 | 0.8232 | 0.8202 | 0.8775 |
+ | 1.0205 | 3.5 | 1300 | 0.5767 | 0.8516 | 0.8626 | 0.8516 | 0.8485 | 0.8957 |
+ | 1.0205 | 3.64 | 1350 | 0.5831 | 0.8448 | 0.8609 | 0.8448 | 0.8404 | 0.8910 |
+ | 1.0205 | 3.77 | 1400 | 0.5623 | 0.8650 | 0.8761 | 0.8650 | 0.8624 | 0.9051 |
+ | 1.0205 | 3.91 | 1450 | 0.5696 | 0.8650 | 0.8757 | 0.8650 | 0.8630 | 0.9047 |
+ | 0.7175 | 4.04 | 1500 | 0.5455 | 0.8543 | 0.8756 | 0.8543 | 0.8522 | 0.8981 |
+ | 0.7175 | 4.18 | 1550 | 0.5209 | 0.8650 | 0.8785 | 0.8650 | 0.8592 | 0.9053 |
+ | 0.7175 | 4.31 | 1600 | 0.6185 | 0.8435 | 0.8606 | 0.8435 | 0.8415 | 0.8908 |
+ | 0.7175 | 4.45 | 1650 | 0.5434 | 0.8677 | 0.8797 | 0.8677 | 0.8644 | 0.9066 |
+ | 0.7175 | 4.58 | 1700 | 0.6622 | 0.8489 | 0.8728 | 0.8489 | 0.8444 | 0.8945 |
+ | 0.7175 | 4.72 | 1750 | 0.5668 | 0.8677 | 0.8798 | 0.8677 | 0.8662 | 0.9070 |
+ | 0.7175 | 4.85 | 1800 | 0.5375 | 0.8812 | 0.8934 | 0.8812 | 0.8804 | 0.9179 |
+ | 0.7175 | 4.99 | 1850 | 0.5550 | 0.8677 | 0.8780 | 0.8677 | 0.8640 | 0.9080 |
+ | 0.5694 | 5.12 | 1900 | 0.5739 | 0.8691 | 0.8811 | 0.8691 | 0.8647 | 0.9089 |
+ | 0.5694 | 5.26 | 1950 | 0.5325 | 0.8826 | 0.8923 | 0.8826 | 0.8818 | 0.9174 |
+ | 0.5694 | 5.39 | 2000 | 0.5496 | 0.8772 | 0.8885 | 0.8772 | 0.8747 | 0.9147 |
+ | 0.5694 | 5.53 | 2050 | 0.6038 | 0.8745 | 0.8854 | 0.8745 | 0.8726 | 0.9123 |
+ | 0.5694 | 5.66 | 2100 | 0.5606 | 0.8826 | 0.8936 | 0.8826 | 0.8816 | 0.9194 |
+ | 0.5694 | 5.8 | 2150 | 0.5655 | 0.8745 | 0.8885 | 0.8745 | 0.8741 | 0.9128 |
+ | 0.5694 | 5.93 | 2200 | 0.5588 | 0.8785 | 0.8912 | 0.8785 | 0.8775 | 0.9157 |
+ | 0.4761 | 6.06 | 2250 | 0.6021 | 0.8637 | 0.8802 | 0.8637 | 0.8617 | 0.9047 |
+ | 0.4761 | 6.2 | 2300 | 0.5785 | 0.8839 | 0.8956 | 0.8839 | 0.8840 | 0.9194 |
+ | 0.4761 | 6.33 | 2350 | 0.6397 | 0.8691 | 0.8831 | 0.8691 | 0.8677 | 0.9090 |
+ | 0.4761 | 6.47 | 2400 | 0.5376 | 0.8880 | 0.8998 | 0.8880 | 0.8866 | 0.9238 |
+ | 0.4761 | 6.6 | 2450 | 0.5669 | 0.8920 | 0.9025 | 0.8920 | 0.8904 | 0.9255 |
+ | 0.4761 | 6.74 | 2500 | 0.6968 | 0.8543 | 0.8723 | 0.8543 | 0.8522 | 0.8987 |
+ | 0.4761 | 6.87 | 2550 | 0.5628 | 0.8839 | 0.8952 | 0.8839 | 0.8829 | 0.9194 |
+ | 0.4178 | 7.01 | 2600 | 0.5975 | 0.8772 | 0.8861 | 0.8772 | 0.8755 | 0.9167 |
+ | 0.4178 | 7.14 | 2650 | 0.5967 | 0.8853 | 0.8919 | 0.8853 | 0.8834 | 0.9219 |
+ | 0.4178 | 7.28 | 2700 | 0.6271 | 0.8799 | 0.8921 | 0.8799 | 0.8783 | 0.9166 |
+ | 0.4178 | 7.41 | 2750 | 0.6047 | 0.8799 | 0.8916 | 0.8799 | 0.8784 | 0.9170 |
+ | 0.4178 | 7.55 | 2800 | 0.5336 | 0.8853 | 0.8978 | 0.8853 | 0.8829 | 0.9204 |
+ | 0.4178 | 7.68 | 2850 | 0.5722 | 0.8988 | 0.9097 | 0.8988 | 0.8988 | 0.9298 |
+ | 0.4178 | 7.82 | 2900 | 0.5478 | 0.8866 | 0.8987 | 0.8866 | 0.8853 | 0.9213 |
+ | 0.4178 | 7.95 | 2950 | 0.5176 | 0.8907 | 0.9016 | 0.8907 | 0.8897 | 0.9242 |
+ | 0.3642 | 8.09 | 3000 | 0.5172 | 0.8947 | 0.9030 | 0.8947 | 0.8938 | 0.9279 |
+ | 0.3642 | 8.22 | 3050 | 0.6341 | 0.8799 | 0.8932 | 0.8799 | 0.8787 | 0.9157 |
+ | 0.3642 | 8.36 | 3100 | 0.6011 | 0.8812 | 0.8897 | 0.8812 | 0.8797 | 0.9181 |
+ | 0.3642 | 8.49 | 3150 | 0.5807 | 0.8745 | 0.8864 | 0.8745 | 0.8733 | 0.9132 |
+ | 0.3642 | 8.63 | 3200 | 0.5931 | 0.8799 | 0.8942 | 0.8799 | 0.8795 | 0.9157 |
+ | 0.3642 | 8.76 | 3250 | 0.6045 | 0.8812 | 0.8955 | 0.8812 | 0.8818 | 0.9175 |
+ | 0.3642 | 8.89 | 3300 | 0.5473 | 0.8934 | 0.9047 | 0.8934 | 0.8927 | 0.9260 |
+ | 0.3326 | 9.03 | 3350 | 0.5111 | 0.8934 | 0.9058 | 0.8934 | 0.8924 | 0.9266 |
 
 
 ### Framework versions
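The card describes a fine-tuned `facebook/hubert-base-ls960` audio classifier but includes no usage snippet. A minimal inference sketch under that assumption follows; the repo id is a hypothetical placeholder (the diff never names the final repository), and the silent waveform is purely illustrative.

```python
# A minimal inference sketch. "fydhfzh/<repo-name>" is a hypothetical
# placeholder for this model's Hub id; the input is 1 s of silence at the
# 16 kHz sampling rate hubert-base-ls960 expects.
import numpy as np
import torch
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

repo_id = "fydhfzh/<repo-name>"  # hypothetical: substitute the real repo id
extractor = AutoFeatureExtractor.from_pretrained(repo_id)
model = AutoModelForAudioClassification.from_pretrained(repo_id)

waveform = np.zeros(16_000, dtype=np.float32)  # illustrative placeholder audio
inputs = extractor(waveform, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax(-1))])  # predicted class label
```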
runs/Jul21_05-58-12_LAPTOP-1GID9RGH/events.out.tfevents.1721516294.LAPTOP-1GID9RGH.8660.8 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:2ed59e7c4b8855809de70b5324c67068ee619049ae1a08f36e5408c9a56b16a8
- size 42068
+ oid sha256:3ebcdbdd3077236bbb90273f59f22ba1fc2a8faf63b4fb7933cd65cca02894cc
+ size 46498
runs/Jul21_05-58-12_LAPTOP-1GID9RGH/events.out.tfevents.1721518308.LAPTOP-1GID9RGH.8660.9 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2846c5a8816befd580ada478b44f97cdff3261fca04525461b2afcd4762a5f2d
+ size 610
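The `runs/...` entries above are Git LFS pointer files (version, oid, size), not the TensorBoard logs themselves; `git lfs pull` fetches the actual event files. Once fetched, they can be read with TensorBoard's `EventAccumulator` as sketched below; the `eval/loss` tag is an assumption, since the real tag names depend on what the Trainer logged.

```python
# A sketch of reading the fetched tfevents logs with TensorBoard's
# EventAccumulator; "eval/loss" is an assumed tag name.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Jul21_05-58-12_LAPTOP-1GID9RGH")
acc.Reload()                            # parse all event files in the directory
print(acc.Tags()["scalars"])            # list the scalar tags actually present
for event in acc.Scalars("eval/loss"):  # assumed tag; pick one from Tags()
    print(event.step, event.value)
```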