hiba2 committed on
Commit
0da561b
1 Parent(s): 03ed96c

End of training

Files changed (2)
  1. README.md +196 -0
  2. preprocessor_config.json +1 -0
README.md ADDED
@@ -0,0 +1,196 @@
---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
datasets:
- fleurs
metrics:
- wer
model-index:
- name: wav2vec2_fleurs
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: fleurs
      type: fleurs
      config: ar_eg
      split: test
      args: ar_eg
    metrics:
    - name: Wer
      type: wer
      value: 0.3367091772943236
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2_fleurs

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the fleurs dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4033
- Wer: 0.3367

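As a quick sanity check, the checkpoint can be tried with the `transformers` ASR pipeline. This is a minimal sketch, assuming the model is published under the repo id `hiba2/wav2vec2_fleurs` (inferred from this card, not stated in it) and that the input is 16 kHz mono audio, matching the preprocessor configuration; the audio path is hypothetical.

```python
# Minimal inference sketch; the repo id and the audio path are assumptions, not part of the card.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="hiba2/wav2vec2_fleurs",  # assumed repo id
)
print(asr("example_arabic_16khz.wav")["text"])  # hypothetical 16 kHz mono file
```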
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 4e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 20
- mixed_precision_training: Native AMP

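The card lists the hyperparameters but not the full `Trainer` setup. The sketch below shows one way to express them as `TrainingArguments`; `output_dir`, the evaluation/logging cadence, and `fp16` as the AMP mechanism are assumptions (the cadence is inferred from the 100-step evaluation interval in the results table below), not values recorded in the card.

```python
# Illustrative TrainingArguments mirroring the hyperparameters listed above.
# Fields marked "assumed" are not recorded in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2_fleurs",   # assumed
    learning_rate=4e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="steps",    # assumed from the 100-step eval cadence below
    eval_steps=100,
    logging_steps=100,              # assumed
)
```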
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 18.6221 | 0.17 | 100 | 10.1383 | 1.0 |
| 6.5078 | 0.33 | 200 | 4.0182 | 1.0 |
| 3.632 | 0.5 | 300 | 3.2678 | 1.0 |
| 3.2359 | 0.67 | 400 | 3.1984 | 1.0 |
| 3.2014 | 0.83 | 500 | 3.1752 | 1.0 |
| 3.1857 | 1.0 | 600 | 3.1671 | 1.0 |
| 3.1816 | 1.17 | 700 | 3.1657 | 1.0 |
| 3.1912 | 1.33 | 800 | 3.1570 | 1.0 |
| 3.186 | 1.5 | 900 | 3.1548 | 1.0 |
| 3.1554 | 1.67 | 1000 | 3.1478 | 1.0 |
| 3.1521 | 1.83 | 1100 | 3.1442 | 1.0 |
| 3.1584 | 2.0 | 1200 | 3.1369 | 1.0 |
| 3.1554 | 2.17 | 1300 | 3.1340 | 1.0 |
| 3.172 | 2.33 | 1400 | 3.1304 | 1.0 |
| 3.1479 | 2.5 | 1500 | 3.1303 | 1.0 |
| 3.1359 | 2.67 | 1600 | 3.0864 | 1.0 |
| 3.0757 | 2.83 | 1700 | 2.9191 | 1.0 |
| 2.8491 | 3.0 | 1800 | 2.5490 | 1.0 |
| 2.4969 | 3.17 | 1900 | 1.9998 | 0.9785 |
| 2.048 | 3.33 | 2000 | 1.5004 | 0.9297 |
| 1.7632 | 3.5 | 2100 | 1.2369 | 0.8613 |
| 1.5885 | 3.67 | 2200 | 1.0752 | 0.7953 |
| 1.3712 | 3.83 | 2300 | 0.9573 | 0.7519 |
| 1.2916 | 4.0 | 2400 | 0.9038 | 0.7089 |
| 1.2559 | 4.17 | 2500 | 0.8269 | 0.6853 |
| 1.1625 | 4.33 | 2600 | 0.7781 | 0.6539 |
| 1.1264 | 4.5 | 2700 | 0.7555 | 0.6337 |
| 1.032 | 4.67 | 2800 | 0.7215 | 0.6032 |
| 1.0592 | 4.83 | 2900 | 0.6883 | 0.5734 |
| 0.9682 | 5.0 | 3000 | 0.6657 | 0.5504 |
| 0.9851 | 5.17 | 3100 | 0.6518 | 0.5448 |
| 0.9515 | 5.33 | 3200 | 0.6382 | 0.5403 |
| 0.9009 | 5.5 | 3300 | 0.6226 | 0.5296 |
| 0.9048 | 5.67 | 3400 | 0.6123 | 0.5161 |
| 0.8882 | 5.83 | 3500 | 0.6047 | 0.5098 |
| 0.8749 | 6.0 | 3600 | 0.5909 | 0.5006 |
| 0.7939 | 6.17 | 3700 | 0.5804 | 0.4931 |
| 0.8363 | 6.33 | 3800 | 0.5744 | 0.4877 |
| 0.8605 | 6.5 | 3900 | 0.5776 | 0.4884 |
| 0.8358 | 6.67 | 4000 | 0.5497 | 0.4745 |
| 0.7744 | 6.83 | 4100 | 0.5549 | 0.4664 |
| 0.7867 | 7.0 | 4200 | 0.5429 | 0.4629 |
| 0.7166 | 7.17 | 4300 | 0.5306 | 0.4465 |
| 0.7347 | 7.33 | 4400 | 0.5363 | 0.4521 |
| 0.7173 | 7.5 | 4500 | 0.5289 | 0.4429 |
| 0.7653 | 7.67 | 4600 | 0.5240 | 0.4389 |
| 0.7388 | 7.83 | 4700 | 0.5062 | 0.4304 |
| 0.7326 | 8.0 | 4800 | 0.5073 | 0.4290 |
| 0.6622 | 8.17 | 4900 | 0.5049 | 0.4236 |
| 0.7495 | 8.33 | 5000 | 0.5094 | 0.4254 |
| 0.6898 | 8.5 | 5100 | 0.4874 | 0.4216 |
| 0.6664 | 8.67 | 5200 | 0.4948 | 0.4225 |
| 0.6783 | 8.83 | 5300 | 0.4879 | 0.4131 |
| 0.7205 | 9.0 | 5400 | 0.4751 | 0.4136 |
| 0.6182 | 9.17 | 5500 | 0.4795 | 0.4085 |
| 0.6895 | 9.33 | 5600 | 0.4730 | 0.4099 |
| 0.6503 | 9.5 | 5700 | 0.4713 | 0.4029 |
| 0.624 | 9.67 | 5800 | 0.4699 | 0.4024 |
| 0.6268 | 9.83 | 5900 | 0.4726 | 0.4069 |
| 0.6525 | 10.0 | 6000 | 0.4593 | 0.3953 |
| 0.6112 | 10.17 | 6100 | 0.4558 | 0.3922 |
| 0.657 | 10.33 | 6200 | 0.4621 | 0.3940 |
| 0.6445 | 10.5 | 6300 | 0.4579 | 0.3906 |
| 0.5869 | 10.67 | 6400 | 0.4548 | 0.3903 |
| 0.5855 | 10.83 | 6500 | 0.4433 | 0.3840 |
| 0.5538 | 11.0 | 6600 | 0.4514 | 0.3897 |
| 0.5599 | 11.17 | 6700 | 0.4403 | 0.3786 |
| 0.5691 | 11.33 | 6800 | 0.4411 | 0.3800 |
| 0.5731 | 11.5 | 6900 | 0.4396 | 0.3768 |
| 0.5707 | 11.67 | 7000 | 0.4492 | 0.3770 |
| 0.5504 | 11.83 | 7100 | 0.4391 | 0.3690 |
| 0.6058 | 12.0 | 7200 | 0.4344 | 0.3717 |
| 0.5676 | 12.17 | 7300 | 0.4354 | 0.3758 |
| 0.5684 | 12.33 | 7400 | 0.4351 | 0.3656 |
| 0.5404 | 12.5 | 7500 | 0.4324 | 0.3636 |
| 0.5504 | 12.67 | 7600 | 0.4313 | 0.3658 |
| 0.5596 | 12.83 | 7700 | 0.4268 | 0.3632 |
| 0.5246 | 13.0 | 7800 | 0.4316 | 0.3633 |
| 0.5441 | 13.17 | 7900 | 0.4233 | 0.3648 |
| 0.5318 | 13.33 | 8000 | 0.4260 | 0.3597 |
| 0.5116 | 13.5 | 8100 | 0.4279 | 0.3591 |
| 0.5299 | 13.67 | 8200 | 0.4233 | 0.3606 |
| 0.5519 | 13.83 | 8300 | 0.4166 | 0.3567 |
| 0.5452 | 14.0 | 8400 | 0.4233 | 0.3573 |
| 0.5111 | 14.17 | 8500 | 0.4203 | 0.3580 |
| 0.5365 | 14.33 | 8600 | 0.4163 | 0.3577 |
| 0.5023 | 14.5 | 8700 | 0.4135 | 0.3552 |
| 0.5189 | 14.67 | 8800 | 0.4133 | 0.3485 |
| 0.5492 | 14.83 | 8900 | 0.4133 | 0.3478 |
| 0.5128 | 15.0 | 9000 | 0.4114 | 0.3478 |
| 0.486 | 15.17 | 9100 | 0.4222 | 0.3472 |
| 0.5015 | 15.33 | 9200 | 0.4129 | 0.3515 |
| 0.4871 | 15.5 | 9300 | 0.4132 | 0.3430 |
| 0.5267 | 15.67 | 9400 | 0.4109 | 0.3481 |
| 0.4814 | 15.83 | 9500 | 0.4109 | 0.3461 |
| 0.4801 | 16.0 | 9600 | 0.4140 | 0.3453 |
| 0.4894 | 16.17 | 9700 | 0.4074 | 0.3433 |
| 0.4756 | 16.33 | 9800 | 0.4070 | 0.3410 |
| 0.4446 | 16.5 | 9900 | 0.4088 | 0.3412 |
| 0.4838 | 16.67 | 10000 | 0.4070 | 0.3407 |
| 0.5087 | 16.83 | 10100 | 0.4048 | 0.3422 |
| 0.4994 | 17.0 | 10200 | 0.4043 | 0.3442 |
| 0.5421 | 17.17 | 10300 | 0.4088 | 0.3483 |
| 0.489 | 17.33 | 10400 | 0.4097 | 0.3450 |
| 0.4618 | 17.5 | 10500 | 0.4077 | 0.3430 |
| 0.4734 | 17.67 | 10600 | 0.4028 | 0.3433 |
| 0.4882 | 17.83 | 10700 | 0.4040 | 0.3393 |
| 0.4804 | 18.0 | 10800 | 0.4045 | 0.3385 |
| 0.483 | 18.17 | 10900 | 0.4055 | 0.3366 |
| 0.4916 | 18.33 | 11000 | 0.4077 | 0.3375 |
| 0.4933 | 18.5 | 11100 | 0.4056 | 0.3365 |
| 0.4881 | 18.67 | 11200 | 0.4023 | 0.3375 |
| 0.4869 | 18.83 | 11300 | 0.4031 | 0.3378 |
| 0.4649 | 19.0 | 11400 | 0.4026 | 0.3382 |
| 0.4793 | 19.17 | 11500 | 0.4035 | 0.3376 |
| 0.5252 | 19.33 | 11600 | 0.4019 | 0.3375 |
| 0.4681 | 19.5 | 11700 | 0.4026 | 0.3382 |
| 0.4311 | 19.67 | 11800 | 0.4026 | 0.3368 |
| 0.4799 | 19.83 | 11900 | 0.4034 | 0.3372 |
| 0.4323 | 20.0 | 12000 | 0.4033 | 0.3367 |

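The final row corresponds to the reported evaluation WER of 0.3367. The sketch below outlines one way to recompute WER on the FLEURS `ar_eg` test split with the `datasets` and `evaluate` libraries; the repo id is assumed as above, the reference column is assumed to be `transcription`, and the exact figure may not reproduce unless the same text normalization used during training is applied.

```python
# Sketch of a WER evaluation on FLEURS ar_eg (test). Assumptions: the repo id, and that
# references come from the "transcription" column without extra normalization.
import torch
import evaluate
from datasets import load_dataset, Audio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "hiba2/wav2vec2_fleurs"  # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()
wer = evaluate.load("wer")

ds = load_dataset("google/fleurs", "ar_eg", split="test")
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))  # ensure 16 kHz decoding

predictions, references = [], []
for example in ds:
    inputs = processor(example["audio"]["array"], sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predictions.append(processor.batch_decode(torch.argmax(logits, dim=-1))[0])
    references.append(example["transcription"])

print("WER:", wer.compute(predictions=predictions, references=references))
```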
### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
preprocessor_config.json CHANGED
@@ -4,6 +4,7 @@
  "feature_size": 1,
  "padding_side": "right",
  "padding_value": 0.0,
+ "processor_class": "Wav2Vec2Processor",
  "return_attention_mask": true,
  "sampling_rate": 16000
  }
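For context on the change above: once `processor_class` is recorded in `preprocessor_config.json`, `AutoProcessor` can resolve the repo to a `Wav2Vec2Processor` without the caller naming the class. A small sketch, again assuming the repo id `hiba2/wav2vec2_fleurs`:

```python
# AutoProcessor reads "processor_class" from preprocessor_config.json; repo id assumed.
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("hiba2/wav2vec2_fleurs")
print(type(processor).__name__)  # expected: Wav2Vec2Processor
```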