---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-200hrs-v1
  results: []
---
# wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-200hrs-v1
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on a combined Swahili speech corpus of roughly 200 hours (Common Voice, FLEURS, AMMI, and ALFFA, per the model name). It achieves the following results on the evaluation set:
- Loss: 0.6946
- WER: 0.1373
- CER: 0.0454
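
WER and CER are edit-distance rates over words and characters respectively. They can be recomputed with the `evaluate` library; a minimal sketch (the transcripts below are placeholders, not outputs of this model):

```python
# Recompute WER/CER with the evaluate library (requires: pip install evaluate jiwer).
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder decoded transcripts and references; substitute real ones.
predictions = ["habari ya asubuhi", "karibu tena"]
references = ["habari za asubuhi", "karibu tena"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```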
## Model description
More information needed
## Intended uses & limitations
More information needed
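
As a starting point, the checkpoint can be used for Swahili speech-to-text through the standard transformers ASR pipeline. A minimal sketch follows; the model id mirrors this card's name (use the full Hub repo id or a local path) and the audio file is a placeholder, assumed to be 16 kHz mono:

```python
# Minimal Swahili transcription sketch using the transformers ASR pipeline.
# Model id and audio path are placeholders; 16 kHz mono audio assumed.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-200hrs-v1",
)
print(asr("sample_swahili.wav")["text"])
```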
## Training and evaluation data
More information needed
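
The model name points to Swahili speech from Common Voice, FLEURS, AMMI, and ALFFA (roughly 200 hours), though this card does not confirm the exact sources. The dataset ids and configs below are assumptions for illustration only:

```python
# Sketch: loading Swahili splits of two publicly hosted candidate datasets.
# Dataset ids/configs are assumptions, not confirmed by this card; Common Voice
# is gated and requires accepting its terms on the Hub first.
from datasets import load_dataset

fleurs_sw = load_dataset("google/fleurs", "sw_ke", split="train")
cv_sw = load_dataset("mozilla-foundation/common_voice_13_0", "sw", split="train")
```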
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
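
Expressed as transformers `TrainingArguments`, these settings look roughly like the sketch below (not the author's exact script; `output_dir` is a placeholder):

```python
# Sketch of TrainingArguments matching the hyperparameters above.
# output_dir is a placeholder; this is not the author's exact training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-sw-200hrs",
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    optim="adamw_torch",            # betas=(0.9, 0.999), epsilon=1e-08 (defaults)
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # native AMP mixed precision
)
```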
### Training results
| Training Loss | Epoch | Step   | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
1.6189 | 1.0 | 8064 | 0.5376 | 0.4012 | 0.1324 |
0.7423 | 2.0 | 16128 | 0.4487 | 0.2992 | 0.0890 |
0.6218 | 3.0 | 24192 | 0.4022 | 0.2694 | 0.0811 |
0.5548 | 4.0 | 32256 | 0.3769 | 0.2610 | 0.0793 |
0.5081 | 5.0 | 40320 | 0.3396 | 0.2463 | 0.0758 |
0.4712 | 6.0 | 48384 | 0.3460 | 0.2382 | 0.0744 |
0.4421 | 7.0 | 56448 | 0.3419 | 0.2297 | 0.0711 |
0.418 | 8.0 | 64512 | 0.3165 | 0.2230 | 0.0686 |
0.3977 | 9.0 | 72576 | 0.3380 | 0.2389 | 0.0843 |
0.3748 | 10.0 | 80640 | 0.3046 | 0.2174 | 0.0691 |
0.3587 | 11.0 | 88704 | 0.3301 | 0.2114 | 0.0663 |
0.3423 | 12.0 | 96768 | 0.3172 | 0.2048 | 0.0629 |
0.3282 | 13.0 | 104832 | 0.3118 | 0.2056 | 0.0672 |
0.3132 | 14.0 | 112896 | 0.3213 | 0.1971 | 0.0607 |
0.3009 | 15.0 | 120960 | 0.3292 | 0.1926 | 0.0619 |
0.2887 | 16.0 | 129024 | 0.2949 | 0.1963 | 0.0603 |
0.2778 | 17.0 | 137088 | 0.3066 | 0.1884 | 0.0590 |
0.2649 | 18.0 | 145152 | 0.3105 | 0.1883 | 0.0589 |
0.254 | 19.0 | 153216 | 0.3161 | 0.1851 | 0.0583 |
0.2444 | 20.0 | 161280 | 0.3183 | 0.1812 | 0.0567 |
0.2364 | 21.0 | 169344 | 0.3306 | 0.1803 | 0.0579 |
0.2281 | 22.0 | 177408 | 0.3106 | 0.1818 | 0.0573 |
0.219 | 23.0 | 185472 | 0.3367 | 0.1826 | 0.0573 |
0.2102 | 24.0 | 193536 | 0.3485 | 0.1747 | 0.0551 |
0.2027 | 25.0 | 201600 | 0.3619 | 0.1765 | 0.0558 |
0.1962 | 26.0 | 209664 | 0.3609 | 0.1729 | 0.0547 |
0.1907 | 27.0 | 217728 | 0.3344 | 0.1754 | 0.0554 |
0.1839 | 28.0 | 225792 | 0.3001 | 0.1770 | 0.0565 |
0.1784 | 29.0 | 233856 | 0.3524 | 0.1696 | 0.0550 |
0.1727 | 30.0 | 241920 | 0.3270 | 0.1767 | 0.0558 |
0.1653 | 31.0 | 249984 | 0.3251 | 0.1732 | 0.0554 |
0.161 | 32.0 | 258048 | 0.3885 | 0.1672 | 0.0539 |
0.1575 | 33.0 | 266112 | 0.3292 | 0.1675 | 0.0536 |
0.1532 | 34.0 | 274176 | 0.3686 | 0.1682 | 0.0541 |
0.1483 | 35.0 | 282240 | 0.3920 | 0.1641 | 0.0531 |
0.1449 | 36.0 | 290304 | 0.4157 | 0.1626 | 0.0527 |
0.1411 | 37.0 | 298368 | 0.3790 | 0.1706 | 0.0544 |
0.1366 | 38.0 | 306432 | 0.3723 | 0.1690 | 0.0546 |
0.1342 | 39.0 | 314496 | 0.3982 | 0.1645 | 0.0528 |
0.1311 | 40.0 | 322560 | 0.4210 | 0.1623 | 0.0522 |
0.127 | 41.0 | 330624 | 0.3935 | 0.1688 | 0.0541 |
0.1235 | 42.0 | 338688 | 0.3883 | 0.1603 | 0.0516 |
0.1215 | 43.0 | 346752 | 0.4329 | 0.1631 | 0.0521 |
0.1179 | 44.0 | 354816 | 0.3834 | 0.1691 | 0.0554 |
0.1145 | 45.0 | 362880 | 0.3790 | 0.1639 | 0.0523 |
0.1121 | 46.0 | 370944 | 0.4199 | 0.1618 | 0.0520 |
0.1103 | 47.0 | 379008 | 0.4275 | 0.1605 | 0.0516 |
0.1067 | 48.0 | 387072 | 0.4024 | 0.1605 | 0.0520 |
0.1061 | 49.0 | 395136 | 0.4334 | 0.1569 | 0.0516 |
0.1023 | 50.0 | 403200 | 0.4152 | 0.1566 | 0.0509 |
0.0999 | 51.0 | 411264 | 0.4638 | 0.1571 | 0.0509 |
0.0988 | 52.0 | 419328 | 0.4478 | 0.1577 | 0.0512 |
0.0956 | 53.0 | 427392 | 0.4565 | 0.1558 | 0.0505 |
0.0935 | 54.0 | 435456 | 0.4681 | 0.1595 | 0.0514 |
0.0911 | 55.0 | 443520 | 0.4740 | 0.1558 | 0.0503 |
0.0894 | 56.0 | 451584 | 0.4746 | 0.1535 | 0.0500 |
0.0881 | 57.0 | 459648 | 0.4513 | 0.1550 | 0.0503 |
0.0861 | 58.0 | 467712 | 0.5096 | 0.1538 | 0.0498 |
0.0831 | 59.0 | 475776 | 0.4405 | 0.1561 | 0.0509 |
0.0828 | 60.0 | 483840 | 0.4725 | 0.1507 | 0.0492 |
0.0811 | 61.0 | 491904 | 0.4770 | 0.1527 | 0.0500 |
0.079 | 62.0 | 499968 | 0.5079 | 0.1511 | 0.0495 |
0.0767 | 63.0 | 508032 | 0.4888 | 0.1511 | 0.0494 |
0.0752 | 64.0 | 516096 | 0.4707 | 0.1522 | 0.0494 |
0.0737 | 65.0 | 524160 | 0.4891 | 0.1537 | 0.0497 |
0.0721 | 66.0 | 532224 | 0.5608 | 0.1488 | 0.0488 |
0.0701 | 67.0 | 540288 | 0.5018 | 0.1527 | 0.0497 |
0.0685 | 68.0 | 548352 | 0.5504 | 0.1476 | 0.0482 |
0.0675 | 69.0 | 556416 | 0.5235 | 0.1473 | 0.0481 |
0.0652 | 70.0 | 564480 | 0.5468 | 0.1499 | 0.0488 |
0.0634 | 71.0 | 572544 | 0.5204 | 0.1488 | 0.0485 |
0.0623 | 72.0 | 580608 | 0.5535 | 0.1485 | 0.0483 |
0.0617 | 73.0 | 588672 | 0.5842 | 0.1456 | 0.0479 |
0.0598 | 74.0 | 596736 | 0.5706 | 0.1490 | 0.0481 |
0.0582 | 75.0 | 604800 | 0.5647 | 0.1469 | 0.0478 |
0.0568 | 76.0 | 612864 | 0.5678 | 0.1475 | 0.0484 |
0.0558 | 77.0 | 620928 | 0.5805 | 0.1469 | 0.0483 |
0.0549 | 78.0 | 628992 | 0.5655 | 0.1448 | 0.0474 |
0.0538 | 79.0 | 637056 | 0.5573 | 0.1446 | 0.0476 |
0.0524 | 80.0 | 645120 | 0.5953 | 0.1425 | 0.0472 |
0.0514 | 81.0 | 653184 | 0.6070 | 0.1422 | 0.0475 |
0.0503 | 82.0 | 661248 | 0.5991 | 0.1427 | 0.0468 |
0.0496 | 83.0 | 669312 | 0.6211 | 0.1421 | 0.0469 |
0.0479 | 84.0 | 677376 | 0.5988 | 0.1431 | 0.0470 |
0.0458 | 85.0 | 685440 | 0.6471 | 0.1418 | 0.0468 |
0.0463 | 86.0 | 693504 | 0.6437 | 0.1415 | 0.0469 |
0.0447 | 87.0 | 701568 | 0.6472 | 0.1415 | 0.0464 |
0.0449 | 88.0 | 709632 | 0.6418 | 0.1407 | 0.0465 |
0.043 | 89.0 | 717696 | 0.6302 | 0.1391 | 0.0461 |
0.0419 | 90.0 | 725760 | 0.6287 | 0.1417 | 0.0464 |
0.0402 | 91.0 | 733824 | 0.6573 | 0.1404 | 0.0464 |
0.0403 | 92.0 | 741888 | 0.6369 | 0.1397 | 0.0458 |
0.0397 | 93.0 | 749952 | 0.6820 | 0.1391 | 0.0459 |
0.0385 | 94.0 | 758016 | 0.6853 | 0.1380 | 0.0456 |
0.038 | 95.0 | 766080 | 0.6592 | 0.1384 | 0.0459 |
0.0372 | 96.0 | 774144 | 0.6826 | 0.1373 | 0.0454 |
0.0361 | 97.0 | 782208 | 0.6724 | 0.1371 | 0.0456 |
0.0355 | 98.0 | 790272 | 0.6876 | 0.1372 | 0.0455 |
0.0354 | 99.0 | 798336 | 0.6837 | 0.1372 | 0.0454 |
0.0354 | 100.0 | 806400 | 0.6946 | 0.1373 | 0.0454 |
### Framework versions
- Transformers 4.46.1
- PyTorch 2.1.0+cu118
- Datasets 3.1.0
- Tokenizers 0.20.1