---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: XLS-R_Synthesis_LG_v1
  results: []
---
# XLS-R_Synthesis_LG_v1

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2280
- Wer: 0.1054
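For reference, word error rate (WER) is the word-level edit distance between the hypothesis and the reference transcript, divided by the number of reference words, so a WER of 0.1054 means roughly one word error per ten reference words. A minimal pure-Python sketch of the metric (the reported score was computed by the training framework, not by this code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j].
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # i deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j  # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

# 1 substitution ("the" -> "a") over 6 reference words ≈ 0.167
print(wer("the cat sat on the mat", "the cat sat on a mat"))
```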
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 28
- eval_batch_size: 14
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- gradient_accumulation_steps: 2
- total_train_batch_size: 112
- total_eval_batch_size: 28
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100.0
- mixed_precision_training: Native AMP
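The listed totals follow from the per-device settings: the effective training batch size is the per-device size times the number of devices times the gradient accumulation steps, while evaluation uses no accumulation. A quick check of the arithmetic:

```python
# Per-device settings from the hyperparameter list above.
train_batch_size = 28   # per device
eval_batch_size = 14    # per device
num_devices = 2
grad_accum_steps = 2    # training only; evaluation does no accumulation

total_train_batch_size = train_batch_size * num_devices * grad_accum_steps
total_eval_batch_size = eval_batch_size * num_devices

print(total_train_batch_size)  # 112
print(total_eval_batch_size)   # 28
```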
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
5.8288 | 1.0 | 738 | 3.2777 | 1.0 |
2.5551 | 2.0 | 1476 | 0.9521 | 0.8132 |
0.7636 | 3.0 | 2214 | 0.4213 | 0.4841 |
0.4823 | 4.0 | 2952 | 0.3358 | 0.3664 |
0.3808 | 5.0 | 3690 | 0.2608 | 0.3030 |
0.3183 | 6.0 | 4428 | 0.2175 | 0.2567 |
0.2771 | 7.0 | 5166 | 0.1981 | 0.2330 |
0.2475 | 8.0 | 5904 | 0.1878 | 0.2101 |
0.2267 | 9.0 | 6642 | 0.1812 | 0.2042 |
0.2091 | 10.0 | 7380 | 0.1721 | 0.1907 |
0.1938 | 11.0 | 8118 | 0.1691 | 0.1816 |
0.1776 | 12.0 | 8856 | 0.1597 | 0.1730 |
0.1679 | 13.0 | 9594 | 0.1558 | 0.1706 |
0.1575 | 14.0 | 10332 | 0.1610 | 0.1586 |
0.1498 | 15.0 | 11070 | 0.1532 | 0.1535 |
0.1421 | 16.0 | 11808 | 0.1544 | 0.1520 |
0.1356 | 17.0 | 12546 | 0.1488 | 0.1491 |
0.1287 | 18.0 | 13284 | 0.1542 | 0.1470 |
0.1239 | 19.0 | 14022 | 0.1484 | 0.1464 |
0.1198 | 20.0 | 14760 | 0.1523 | 0.1437 |
0.1154 | 21.0 | 15498 | 0.1540 | 0.1390 |
0.1103 | 22.0 | 16236 | 0.1484 | 0.1416 |
0.107 | 23.0 | 16974 | 0.1506 | 0.1421 |
0.1037 | 24.0 | 17712 | 0.1472 | 0.1386 |
0.0997 | 25.0 | 18450 | 0.1525 | 0.1382 |
0.0966 | 26.0 | 19188 | 0.1541 | 0.1400 |
0.095 | 27.0 | 19926 | 0.1595 | 0.1335 |
0.0914 | 28.0 | 20664 | 0.1650 | 0.1341 |
0.0893 | 29.0 | 21402 | 0.1631 | 0.1375 |
0.0855 | 30.0 | 22140 | 0.1700 | 0.1378 |
0.083 | 31.0 | 22878 | 0.1629 | 0.1320 |
0.0819 | 32.0 | 23616 | 0.1658 | 0.1316 |
0.079 | 33.0 | 24354 | 0.1674 | 0.1303 |
0.0769 | 34.0 | 25092 | 0.1749 | 0.1304 |
0.0755 | 35.0 | 25830 | 0.1755 | 0.1325 |
0.0744 | 36.0 | 26568 | 0.1703 | 0.1292 |
0.072 | 37.0 | 27306 | 0.1687 | 0.1298 |
0.0705 | 38.0 | 28044 | 0.1683 | 0.1298 |
0.069 | 39.0 | 28782 | 0.1712 | 0.1299 |
0.0668 | 40.0 | 29520 | 0.1798 | 0.1250 |
0.0654 | 41.0 | 30258 | 0.1811 | 0.1281 |
0.065 | 42.0 | 30996 | 0.1858 | 0.1274 |
0.0638 | 43.0 | 31734 | 0.1826 | 0.1250 |
0.0614 | 44.0 | 32472 | 0.1822 | 0.1292 |
0.0615 | 45.0 | 33210 | 0.1798 | 0.1278 |
0.0602 | 46.0 | 33948 | 0.1943 | 0.1237 |
0.0596 | 47.0 | 34686 | 0.1793 | 0.1267 |
0.0583 | 48.0 | 35424 | 0.1969 | 0.1261 |
0.0569 | 49.0 | 36162 | 0.1927 | 0.1215 |
0.0558 | 50.0 | 36900 | 0.1974 | 0.1238 |
0.0555 | 51.0 | 37638 | 0.1897 | 0.1231 |
0.0545 | 52.0 | 38376 | 0.1894 | 0.1242 |
0.0534 | 53.0 | 39114 | 0.1937 | 0.1221 |
0.0529 | 54.0 | 39852 | 0.1933 | 0.1203 |
0.0521 | 55.0 | 40590 | 0.1938 | 0.1221 |
0.0507 | 56.0 | 41328 | 0.1897 | 0.1210 |
0.0502 | 57.0 | 42066 | 0.2000 | 0.1196 |
0.0482 | 58.0 | 42804 | 0.1974 | 0.1185 |
0.0481 | 59.0 | 43542 | 0.2058 | 0.1163 |
0.0465 | 60.0 | 44280 | 0.1950 | 0.1162 |
0.0467 | 61.0 | 45018 | 0.2007 | 0.1170 |
0.0457 | 62.0 | 45756 | 0.1955 | 0.1174 |
0.0454 | 63.0 | 46494 | 0.2063 | 0.1171 |
0.0442 | 64.0 | 47232 | 0.1993 | 0.1170 |
0.0438 | 65.0 | 47970 | 0.2038 | 0.1159 |
0.0426 | 66.0 | 48708 | 0.2078 | 0.1156 |
0.0418 | 67.0 | 49446 | 0.2092 | 0.1161 |
0.0415 | 68.0 | 50184 | 0.2108 | 0.1131 |
0.0405 | 69.0 | 50922 | 0.2080 | 0.1132 |
0.0398 | 70.0 | 51660 | 0.2122 | 0.1127 |
0.0391 | 71.0 | 52398 | 0.2117 | 0.1146 |
0.0384 | 72.0 | 53136 | 0.2125 | 0.1130 |
0.038 | 73.0 | 53874 | 0.2129 | 0.1127 |
0.0372 | 74.0 | 54612 | 0.2150 | 0.1130 |
0.0365 | 75.0 | 55350 | 0.2110 | 0.1123 |
0.0364 | 76.0 | 56088 | 0.2121 | 0.1122 |
0.0359 | 77.0 | 56826 | 0.2118 | 0.1125 |
0.0354 | 78.0 | 57564 | 0.2146 | 0.1105 |
0.035 | 79.0 | 58302 | 0.2192 | 0.1113 |
0.0337 | 80.0 | 59040 | 0.2204 | 0.1103 |
0.0335 | 81.0 | 59778 | 0.2142 | 0.1095 |
0.0333 | 82.0 | 60516 | 0.2154 | 0.1096 |
0.0327 | 83.0 | 61254 | 0.2235 | 0.1082 |
0.0324 | 84.0 | 61992 | 0.2192 | 0.1087 |
0.0319 | 85.0 | 62730 | 0.2221 | 0.1075 |
0.0309 | 86.0 | 63468 | 0.2210 | 0.1083 |
0.0309 | 87.0 | 64206 | 0.2211 | 0.1080 |
0.0307 | 88.0 | 64944 | 0.2226 | 0.1075 |
0.0297 | 89.0 | 65682 | 0.2259 | 0.1065 |
0.0292 | 90.0 | 66420 | 0.2242 | 0.1070 |
0.0291 | 91.0 | 67158 | 0.2233 | 0.1066 |
0.0288 | 92.0 | 67896 | 0.2287 | 0.1067 |
0.0283 | 93.0 | 68634 | 0.2288 | 0.1068 |
0.0283 | 94.0 | 69372 | 0.2269 | 0.1061 |
0.0279 | 95.0 | 70110 | 0.2272 | 0.1065 |
0.0276 | 96.0 | 70848 | 0.2290 | 0.1064 |
0.0273 | 97.0 | 71586 | 0.2284 | 0.1061 |
0.0274 | 98.0 | 72324 | 0.2272 | 0.1059 |
0.0272 | 99.0 | 73062 | 0.2276 | 0.1056 |
0.0268 | 100.0 | 73800 | 0.2280 | 0.1054 |
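As a sanity check on the table, every epoch adds 738 optimizer steps, so 100 epochs end at step 73,800, matching the final row. With a total train batch size of 112 this implies a training set of roughly 738 × 112 ≈ 82,600 examples (a back-of-the-envelope estimate; the exact count depends on how the last partial batch was handled):

```python
# Consistency check on the step counts in the training-results table.
steps_per_epoch = 738
num_epochs = 100
total_train_batch_size = 112

final_step = steps_per_epoch * num_epochs
print(final_step)  # 73800, matching the last table row

# Approximate training-set size implied by the step count (assumes full batches).
approx_train_examples = steps_per_epoch * total_train_batch_size
print(approx_train_examples)  # 82656
```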
### Framework versions
- Transformers 4.36.1
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0