Mengkedalai committed on
Commit a83ab7f
1 Parent(s): 04dbd40

End of training

README.md CHANGED
@@ -1,18 +1,18 @@
 ---
+license: apache-2.0
 base_model: facebook/wav2vec2-large-xlsr-53
+tags:
+- generated_from_trainer
 datasets:
 - common_voice_17_0
-license: apache-2.0
 metrics:
 - wer
-tags:
-- generated_from_trainer
 model-index:
 - name: wav2vec2-large-xlsr-Mongolian-cv17-base
   results:
   - task:
-      type: automatic-speech-recognition
       name: Automatic Speech Recognition
+      type: automatic-speech-recognition
     dataset:
       name: common_voice_17_0
       type: common_voice_17_0
@@ -20,9 +20,9 @@ model-index:
       split: validation
       args: mn
     metrics:
-    - type: wer
-      value: 0.7902951968892054
-      name: Wer
+    - name: Wer
+      type: wer
+      value: 0.6570458404074703
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the common_voice_17_0 dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.9632
-- Wer: 0.7903
+- Loss: 1.1550
+- Wer: 0.6570
 
 ## Model description
 
@@ -53,37 +53,155 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 0.0003
-- train_batch_size: 16
+- train_batch_size: 32
 - eval_batch_size: 8
 - seed: 42
-- gradient_accumulation_steps: 2
-- total_train_batch_size: 32
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 500
-- num_epochs: 20
+- num_epochs: 80
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Wer |
 |:-------------:|:-------:|:----:|:---------------:|:------:|
-| No log | 1.1940 | 40 | 13.1486 | 1.0009 |
-| No log | 2.3881 | 80 | 7.6639 | 1.0 |
-| No log | 3.5821 | 120 | 3.4345 | 1.0 |
-| No log | 4.7761 | 160 | 3.1527 | 1.0 |
-| No log | 5.9701 | 200 | 3.1223 | 1.0 |
-| No log | 7.1642 | 240 | 3.1137 | 1.0 |
-| No log | 8.3582 | 280 | 3.1017 | 1.0 |
-| No log | 9.5522 | 320 | 3.0909 | 1.0 |
-| No log | 10.7463 | 360 | 3.0363 | 1.0 |
-| 5.1112 | 11.9403 | 400 | 2.8364 | 1.0 |
-| 5.1112 | 13.1343 | 440 | 2.0134 | 1.0078 |
-| 5.1112 | 14.3284 | 480 | 1.3866 | 1.0511 |
-| 5.1112 | 15.5224 | 520 | 1.1292 | 0.9320 |
-| 5.1112 | 16.7164 | 560 | 1.0117 | 0.9017 |
-| 5.1112 | 17.9104 | 600 | 0.9756 | 0.8339 |
-| 5.1112 | 19.1045 | 640 | 0.9632 | 0.7903 |
+| No log | 0.5882 | 20 | 12.9741 | 1.0 |
+| No log | 1.1765 | 40 | 12.7936 | 1.0002 |
+| No log | 1.7647 | 60 | 12.3703 | 1.0 |
+| No log | 2.3529 | 80 | 5.6880 | 1.0 |
+| No log | 2.9412 | 100 | 3.6853 | 1.0 |
+| No log | 3.5294 | 120 | 3.3076 | 1.0 |
+| No log | 4.1176 | 140 | 3.2023 | 1.0 |
+| No log | 4.7059 | 160 | 3.1422 | 1.0 |
+| No log | 5.2941 | 180 | 3.1331 | 1.0 |
+| No log | 5.8824 | 200 | 3.1183 | 1.0 |
+| No log | 6.4706 | 220 | 3.1175 | 1.0 |
+| No log | 7.0588 | 240 | 3.1132 | 1.0 |
+| No log | 7.6471 | 260 | 3.1111 | 1.0 |
+| No log | 8.2353 | 280 | 3.1101 | 1.0 |
+| No log | 8.8235 | 300 | 3.1135 | 1.0 |
+| No log | 9.4118 | 320 | 3.1039 | 1.0 |
+| No log | 10.0 | 340 | 3.0961 | 1.0 |
+| No log | 10.5882 | 360 | 3.0809 | 1.0 |
+| No log | 11.1765 | 380 | 3.0651 | 1.0 |
+| 4.9312 | 11.7647 | 400 | 3.0478 | 1.0 |
+| 4.9312 | 12.3529 | 420 | 3.0584 | 1.0 |
+| 4.9312 | 12.9412 | 440 | 3.0064 | 1.0 |
+| 4.9312 | 13.5294 | 460 | 2.8224 | 1.0 |
+| 4.9312 | 14.1176 | 480 | 2.5811 | 1.0 |
+| 4.9312 | 14.7059 | 500 | 2.1769 | 1.0032 |
+| 4.9312 | 15.2941 | 520 | 1.7646 | 1.0742 |
+| 4.9312 | 15.8824 | 540 | 1.4124 | 1.0159 |
+| 4.9312 | 16.4706 | 560 | 1.2848 | 0.9538 |
+| 4.9312 | 17.0588 | 580 | 1.2267 | 0.9808 |
+| 4.9312 | 17.6471 | 600 | 1.1108 | 0.9423 |
+| 4.9312 | 18.2353 | 620 | 1.1815 | 0.9678 |
+| 4.9312 | 18.8235 | 640 | 1.0553 | 0.8896 |
+| 4.9312 | 19.4118 | 660 | 1.0977 | 0.8884 |
+| 4.9312 | 20.0 | 680 | 0.9775 | 0.8532 |
+| 4.9312 | 20.5882 | 700 | 0.9972 | 0.8340 |
+| 4.9312 | 21.1765 | 720 | 1.0438 | 0.8009 |
+| 4.9312 | 21.7647 | 740 | 0.9990 | 0.7850 |
+| 4.9312 | 22.3529 | 760 | 0.9693 | 0.7595 |
+| 4.9312 | 22.9412 | 780 | 1.0659 | 0.7699 |
+| 1.3568 | 23.5294 | 800 | 0.9913 | 0.7610 |
+| 1.3568 | 24.1176 | 820 | 1.0340 | 0.7547 |
+| 1.3568 | 24.7059 | 840 | 1.0347 | 0.7337 |
+| 1.3568 | 25.2941 | 860 | 1.0703 | 0.7437 |
+| 1.3568 | 25.8824 | 880 | 1.0441 | 0.7350 |
+| 1.3568 | 26.4706 | 900 | 1.0683 | 0.7261 |
+| 1.3568 | 27.0588 | 920 | 1.0231 | 0.7296 |
+| 1.3568 | 27.6471 | 940 | 1.0517 | 0.7291 |
+| 1.3568 | 28.2353 | 960 | 1.1089 | 0.7417 |
+| 1.3568 | 28.8235 | 980 | 1.0957 | 0.7223 |
+| 1.3568 | 29.4118 | 1000 | 1.1120 | 0.7258 |
+| 1.3568 | 30.0 | 1020 | 1.0992 | 0.7396 |
+| 1.3568 | 30.5882 | 1040 | 1.1502 | 0.7190 |
+| 1.3568 | 31.1765 | 1060 | 1.0743 | 0.7225 |
+| 1.3568 | 31.7647 | 1080 | 1.0548 | 0.7178 |
+| 1.3568 | 32.3529 | 1100 | 1.0534 | 0.7104 |
+| 1.3568 | 32.9412 | 1120 | 1.0752 | 0.7083 |
+| 1.3568 | 33.5294 | 1140 | 1.1574 | 0.7160 |
+| 1.3568 | 34.1176 | 1160 | 1.1471 | 0.7190 |
+| 1.3568 | 34.7059 | 1180 | 1.1077 | 0.7093 |
+| 0.2559 | 35.2941 | 1200 | 1.0737 | 0.7004 |
+| 0.2559 | 35.8824 | 1220 | 1.0822 | 0.6905 |
+| 0.2559 | 36.4706 | 1240 | 1.0836 | 0.6889 |
+| 0.2559 | 37.0588 | 1260 | 1.1399 | 0.6975 |
+| 0.2559 | 37.6471 | 1280 | 1.0981 | 0.6880 |
+| 0.2559 | 38.2353 | 1300 | 1.0887 | 0.6938 |
+| 0.2559 | 38.8235 | 1320 | 1.0870 | 0.7112 |
+| 0.2559 | 39.4118 | 1340 | 1.1324 | 0.6978 |
+| 0.2559 | 40.0 | 1360 | 1.1170 | 0.6834 |
+| 0.2559 | 40.5882 | 1380 | 1.1032 | 0.6761 |
+| 0.2559 | 41.1765 | 1400 | 1.1361 | 0.7035 |
+| 0.2559 | 41.7647 | 1420 | 1.0855 | 0.6965 |
+| 0.2559 | 42.3529 | 1440 | 1.1320 | 0.6933 |
+| 0.2559 | 42.9412 | 1460 | 1.1194 | 0.6849 |
+| 0.2559 | 43.5294 | 1480 | 1.0870 | 0.6912 |
+| 0.2559 | 44.1176 | 1500 | 1.1434 | 0.6785 |
+| 0.2559 | 44.7059 | 1520 | 1.1434 | 0.6926 |
+| 0.2559 | 45.2941 | 1540 | 1.1703 | 0.6839 |
+| 0.2559 | 45.8824 | 1560 | 1.1275 | 0.6762 |
+| 0.2559 | 46.4706 | 1580 | 1.1511 | 0.6840 |
+| 0.1626 | 47.0588 | 1600 | 1.1336 | 0.6771 |
+| 0.1626 | 47.6471 | 1620 | 1.1421 | 0.6785 |
+| 0.1626 | 48.2353 | 1640 | 1.1084 | 0.6831 |
+| 0.1626 | 48.8235 | 1660 | 1.1682 | 0.6831 |
+| 0.1626 | 49.4118 | 1680 | 1.1349 | 0.6763 |
+| 0.1626 | 50.0 | 1700 | 1.1561 | 0.6793 |
+| 0.1626 | 50.5882 | 1720 | 1.1117 | 0.6660 |
+| 0.1626 | 51.1765 | 1740 | 1.1875 | 0.6834 |
+| 0.1626 | 51.7647 | 1760 | 1.1453 | 0.6782 |
+| 0.1626 | 52.3529 | 1780 | 1.1040 | 0.6744 |
+| 0.1626 | 52.9412 | 1800 | 1.1213 | 0.6711 |
+| 0.1626 | 53.5294 | 1820 | 1.1454 | 0.6689 |
+| 0.1626 | 54.1176 | 1840 | 1.1659 | 0.6706 |
+| 0.1626 | 54.7059 | 1860 | 1.1616 | 0.6823 |
+| 0.1626 | 55.2941 | 1880 | 1.2440 | 0.6817 |
+| 0.1626 | 55.8824 | 1900 | 1.1472 | 0.6753 |
+| 0.1626 | 56.4706 | 1920 | 1.1588 | 0.6691 |
+| 0.1626 | 57.0588 | 1940 | 1.1590 | 0.6731 |
+| 0.1626 | 57.6471 | 1960 | 1.1649 | 0.6712 |
+| 0.1626 | 58.2353 | 1980 | 1.1990 | 0.6680 |
+| 0.123 | 58.8235 | 2000 | 1.1282 | 0.6681 |
+| 0.123 | 59.4118 | 2020 | 1.1609 | 0.6686 |
+| 0.123 | 60.0 | 2040 | 1.1722 | 0.6703 |
+| 0.123 | 60.5882 | 2060 | 1.1538 | 0.6739 |
+| 0.123 | 61.1765 | 2080 | 1.1679 | 0.6727 |
+| 0.123 | 61.7647 | 2100 | 1.1747 | 0.6687 |
+| 0.123 | 62.3529 | 2120 | 1.1716 | 0.6691 |
+| 0.123 | 62.9412 | 2140 | 1.1785 | 0.6655 |
+| 0.123 | 63.5294 | 2160 | 1.1485 | 0.6658 |
+| 0.123 | 64.1176 | 2180 | 1.1578 | 0.6626 |
+| 0.123 | 64.7059 | 2200 | 1.1694 | 0.6648 |
+| 0.123 | 65.2941 | 2220 | 1.1711 | 0.6677 |
+| 0.123 | 65.8824 | 2240 | 1.1581 | 0.6624 |
+| 0.123 | 66.4706 | 2260 | 1.1650 | 0.6723 |
+| 0.123 | 67.0588 | 2280 | 1.1789 | 0.6637 |
+| 0.123 | 67.6471 | 2300 | 1.1705 | 0.6624 |
+| 0.123 | 68.2353 | 2320 | 1.1071 | 0.6615 |
+| 0.123 | 68.8235 | 2340 | 1.1300 | 0.6654 |
+| 0.123 | 69.4118 | 2360 | 1.1616 | 0.6672 |
+| 0.123 | 70.0 | 2380 | 1.1671 | 0.6568 |
+| 0.0991 | 70.5882 | 2400 | 1.1493 | 0.6587 |
+| 0.0991 | 71.1765 | 2420 | 1.1476 | 0.6575 |
+| 0.0991 | 71.7647 | 2440 | 1.1691 | 0.6582 |
+| 0.0991 | 72.3529 | 2460 | 1.1867 | 0.6609 |
+| 0.0991 | 72.9412 | 2480 | 1.1427 | 0.6519 |
+| 0.0991 | 73.5294 | 2500 | 1.1635 | 0.6558 |
+| 0.0991 | 74.1176 | 2520 | 1.1503 | 0.6553 |
+| 0.0991 | 74.7059 | 2540 | 1.1487 | 0.6562 |
+| 0.0991 | 75.2941 | 2560 | 1.1552 | 0.6576 |
+| 0.0991 | 75.8824 | 2580 | 1.1638 | 0.6586 |
+| 0.0991 | 76.4706 | 2600 | 1.1601 | 0.6566 |
+| 0.0991 | 77.0588 | 2620 | 1.1603 | 0.6558 |
+| 0.0991 | 77.6471 | 2640 | 1.1564 | 0.6547 |
+| 0.0991 | 78.2353 | 2660 | 1.1560 | 0.6556 |
+| 0.0991 | 78.8235 | 2680 | 1.1550 | 0.6564 |
+| 0.0991 | 79.4118 | 2700 | 1.1550 | 0.6567 |
+| 0.0991 | 80.0 | 2720 | 1.1550 | 0.6570 |
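As a sanity check on the schedule in this diff: with train_batch_size 32, num_epochs 80, and a final step count of 2720, the table implies 34 optimizer steps per epoch, i.e. roughly 1088 training utterances. A minimal sketch (all inputs are read from the card above; the example count is only an estimate that assumes no dropped last batch):

```python
# Back-of-envelope check of the training schedule reported in the card.
train_batch_size = 32
num_epochs = 80
total_steps = 2720  # final "Step" in the results table

steps_per_epoch = total_steps // num_epochs                  # 34
approx_train_examples = steps_per_epoch * train_batch_size   # ~1088

# The fractional "Epoch" column is step / steps_per_epoch:
epoch_at_step_20 = round(20 / steps_per_epoch, 4)            # 0.5882, matching the first row

print(steps_per_epoch, approx_train_examples, epoch_at_step_20)
```

The match between 20 / 34 and the 0.5882 in the first table row is what ties the batch size and step count together.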
 
 
 ### Framework versions
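The Wer values throughout this card are word error rates. A minimal, self-contained sketch of how WER is computed (word-level Levenshtein distance divided by the reference word count; this helper is illustrative, not necessarily the implementation the Trainer used here):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # prev[j] holds the edit distance between the processed prefix of ref and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, 1):
            cur[j] = min(
                prev[j] + 1,             # delete r
                cur[j - 1] + 1,          # insert h
                prev[j - 1] + (r != h),  # substitute (free if the words match)
            )
        prev = cur
    return prev[-1] / len(ref)

# A Wer of 1.0 (as in the early epochs above) means the edit distance equals
# the reference length, e.g. when the model emits nothing at all:
print(wer("энэ бол жишээ", ""))  # 1.0
```

This also explains why Wer can exceed 1.0 (e.g. the 1.0742 at step 520): insertions can push the edit distance past the reference length.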
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:0d5aad1a36c16b6f1420b05fcec8a498a84bc4899ca469e40aff790559355910
+oid sha256:c1cf71f92a4c41e49e7427f967b2bcdc9d50f1e17428802e5c5485387e6501c4
 size 1261971480
runs/Sep02_16-44-53_edictate-System-Product-Name/events.out.tfevents.1725288306.edictate-System-Product-Name.844875.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:2938f2934e734f59e0871b75586fd46803809e8aecc984839463e26a32f940b3
-size 50746
+oid sha256:a838fd272b33f5545d85bbdee01f4926c6522f21ecf982862598d2072a35878e
+size 51418