---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-400hrs-v1
  results: []
---
# wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-400hrs-v1
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on a combined Swahili speech corpus, drawn (per the model name) from Common Voice, FLEURS, AMMI, and ALFFA, totaling roughly 400 hours. It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 0.4397
- Wer: 0.1390
- Cer: 0.0457
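
As a quick start, here is a minimal transcription sketch with 🤗 Transformers and greedy CTC decoding. The Hub repo id (inferred from the model name) and the `audio.wav` path are assumptions, not taken from this card:

```python
import librosa
import torch
from transformers import AutoModelForCTC, AutoProcessor

# Hypothetical repo id, assumed from the card's model name; adjust to the actual Hub path.
model_id = "wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-400hrs-v1"

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# XLS-R expects 16 kHz mono audio; "audio.wav" is a placeholder path.
speech, _ = librosa.load("audio.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the argmax token at each frame; batch_decode
# collapses repeats and removes blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```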
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
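
A minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments`; the `output_dir` is a hypothetical placeholder, and the data/collator wiring is omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-sw",  # hypothetical path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,        # effective train batch size: 8 * 2 = 16
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                            # "Native AMP" mixed precision
)
```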
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
| 0.9438 | 1.0 | 16876 | 0.5068 | 0.2899 | 0.0911 |
| 0.5501 | 2.0 | 33752 | 0.3912 | 0.2672 | 0.0823 |
| 0.4835 | 3.0 | 50628 | 0.3540 | 0.2436 | 0.0787 |
| 0.4402 | 4.0 | 67504 | 0.3511 | 0.2350 | 0.0727 |
| 0.4052 | 5.0 | 84380 | 0.3013 | 0.2234 | 0.0700 |
| 0.3786 | 6.0 | 101256 | 0.3334 | 0.2205 | 0.0725 |
| 0.3567 | 7.0 | 118132 | 0.3268 | 0.2080 | 0.0653 |
| 0.3347 | 8.0 | 135008 | 0.3131 | 0.2018 | 0.0646 |
| 0.3184 | 9.0 | 151884 | 0.2761 | 0.1928 | 0.0623 |
| 0.3035 | 10.0 | 168760 | 0.2957 | 0.1899 | 0.0594 |
| 0.288 | 11.0 | 185636 | 0.2986 | 0.1969 | 0.0677 |
| 0.2741 | 12.0 | 202512 | 0.2925 | 0.1833 | 0.0576 |
| 0.2636 | 13.0 | 219388 | 0.3275 | 0.1812 | 0.0593 |
| 0.2568 | 14.0 | 236264 | 0.2794 | 0.1791 | 0.0568 |
| 0.2413 | 15.0 | 253140 | 0.2805 | 0.1828 | 0.0594 |
| 0.2311 | 16.0 | 270016 | 0.3014 | 0.1716 | 0.0576 |
| 0.2218 | 17.0 | 286892 | 0.2842 | 0.1718 | 0.0556 |
| 0.2136 | 18.0 | 303768 | 0.2858 | 0.1692 | 0.0553 |
| 0.2036 | 19.0 | 320644 | 0.2833 | 0.1675 | 0.0536 |
| 0.1967 | 20.0 | 337520 | 0.2876 | 0.1628 | 0.0533 |
| 0.1878 | 21.0 | 354396 | 0.3176 | 0.1597 | 0.0519 |
| 0.1811 | 22.0 | 371272 | 0.2850 | 0.1643 | 0.0545 |
| 0.1739 | 23.0 | 388148 | 0.2613 | 0.1633 | 0.0530 |
| 0.1681 | 24.0 | 405024 | 0.2933 | 0.1571 | 0.0511 |
| 0.1623 | 25.0 | 421900 | 0.2958 | 0.1581 | 0.0509 |
| 0.157 | 26.0 | 438776 | 0.3099 | 0.1564 | 0.0507 |
| 0.1511 | 27.0 | 455652 | 0.3269 | 0.1572 | 0.0517 |
| 0.1465 | 28.0 | 472528 | 0.2733 | 0.1546 | 0.0501 |
| 0.1412 | 29.0 | 489404 | 0.2930 | 0.1549 | 0.0501 |
| 0.1371 | 30.0 | 506280 | 0.2686 | 0.1543 | 0.0501 |
| 0.1325 | 31.0 | 523156 | 0.3153 | 0.1531 | 0.0494 |
| 0.1283 | 32.0 | 540032 | 0.2964 | 0.1517 | 0.0493 |
| 0.1245 | 33.0 | 556908 | 0.3827 | 0.1507 | 0.0492 |
| 0.1215 | 34.0 | 573784 | 0.3416 | 0.1503 | 0.0493 |
| 0.1174 | 35.0 | 590660 | 0.3232 | 0.1499 | 0.0480 |
| 0.1149 | 36.0 | 607536 | 0.3119 | 0.1511 | 0.0500 |
| 0.1115 | 37.0 | 624412 | 0.3400 | 0.1506 | 0.0485 |
| 0.1076 | 38.0 | 641288 | 0.3502 | 0.1500 | 0.0493 |
| 0.1051 | 39.0 | 658164 | 0.3428 | 0.1479 | 0.0487 |
| 0.1022 | 40.0 | 675040 | 0.3596 | 0.1451 | 0.0484 |
| 0.1002 | 41.0 | 691916 | 0.3349 | 0.1492 | 0.0494 |
| 0.0978 | 42.0 | 708792 | 0.3514 | 0.1415 | 0.0468 |
| 0.0951 | 43.0 | 725668 | 0.3689 | 0.1447 | 0.0475 |
| 0.0925 | 44.0 | 742544 | 0.3720 | 0.1427 | 0.0472 |
| 0.0907 | 45.0 | 759420 | 0.3896 | 0.1448 | 0.0481 |
| 0.0882 | 46.0 | 776296 | 0.3680 | 0.1454 | 0.0486 |
| 0.0865 | 47.0 | 793172 | 0.3689 | 0.1445 | 0.0476 |
| 0.0849 | 48.0 | 810048 | 0.3505 | 0.1400 | 0.0460 |
| 0.0826 | 49.0 | 826924 | 0.3827 | 0.1392 | 0.0467 |
| 0.0804 | 50.0 | 843800 | 0.3828 | 0.1460 | 0.0481 |
| 0.0789 | 51.0 | 860676 | 0.4070 | 0.1419 | 0.0474 |
| 0.0774 | 52.0 | 877552 | 0.4041 | 0.1426 | 0.0473 |
| 0.075 | 53.0 | 894428 | 0.4212 | 0.1418 | 0.0474 |
| 0.073 | 54.0 | 911304 | 0.4072 | 0.1400 | 0.0468 |
| 0.0718 | 55.0 | 928180 | 0.4032 | 0.1388 | 0.0460 |
| 0.0694 | 56.0 | 945056 | 0.4591 | 0.1432 | 0.0474 |
| 0.0685 | 57.0 | 961932 | 0.4426 | 0.1384 | 0.0460 |
| 0.0671 | 58.0 | 978808 | 0.4397 | 0.1390 | 0.0457 |
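
The Wer and Cer columns above are word and character error rates (lower is better). A minimal sketch of how such scores can be computed with the `evaluate` library; the `predictions` and `references` strings below are invented placeholders, not samples from the evaluation set:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical decoded outputs and ground-truth transcripts.
predictions = ["rais alihutubia taifa jana"]
references = ["rais alihutubia taifa jana jioni"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```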
### Framework versions
- Transformers 4.46.1
- Pytorch 2.1.0+cu118
- Datasets 3.1.0
- Tokenizers 0.20.1