---
base_model: facebook/wav2vec2-large-xlsr-53
library_name: transformers
license: apache-2.0
metrics:
- wer
tags:
- generated_from_trainer
model-index:
- name: xlsr-nomi-nmcpc
  results: []
---
# xlsr-nomi-nmcpc
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Wer: 0.2574
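For reference, a minimal inference sketch with `transformers`. The repo id `username/xlsr-nomi-nmcpc` and the input file `sample.wav` are placeholders, not part of this card; the 16 kHz sampling rate is what wav2vec2-large-xlsr-53 expects.

```python
# A minimal sketch, assuming the checkpoint and its processor files are published
# under the hypothetical repo id "username/xlsr-nomi-nmcpc".
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "username/xlsr-nomi-nmcpc"  # hypothetical repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate the base model was pre-trained on.
waveform, sr = torchaudio.load("sample.wav")
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# CTC greedy decoding: argmax over the vocabulary at each frame, then collapse.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```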
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 200
- mixed_precision_training: Native AMP
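
These settings map onto `transformers.TrainingArguments` roughly as sketched below. The model, datasets, and data collator are omitted placeholders; only the listed values come from this card, and the Adam betas/epsilon match the `TrainingArguments` defaults.

```python
# A minimal sketch, mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-nomi-nmcpc",
    learning_rate=4e-4,             # learning_rate: 0.0004
    per_device_train_batch_size=8,  # train_batch_size: 8
    per_device_eval_batch_size=8,   # eval_batch_size: 8
    seed=42,
    gradient_accumulation_steps=2,  # total_train_batch_size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=200,
    fp16=True,                      # mixed_precision_training: Native AMP
)
```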
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:--------:|:-----:|:---------------:|:------:|
| 4.7569 | 3.8835 | 200 | 3.0327 | 1.0 |
| 2.9033 | 7.7670 | 400 | 2.5777 | 1.0 |
| 2.006 | 11.6505 | 600 | 0.7689 | 0.7298 |
| 0.7612 | 15.5340 | 800 | 0.2026 | 0.4191 |
| 0.3581 | 19.4175 | 1000 | 0.0966 | 0.3489 |
| 0.2432 | 23.3010 | 1200 | 0.0406 | 0.3128 |
| 0.1625 | 27.1845 | 1400 | 0.0479 | 0.3106 |
| 0.1391 | 31.0680 | 1600 | 0.0172 | 0.2915 |
| 0.1157 | 34.9515 | 1800 | 0.0110 | 0.2851 |
| 0.1016 | 38.8350 | 2000 | 0.0096 | 0.2766 |
| 0.0807 | 42.7184 | 2200 | 0.0162 | 0.2723 |
| 0.0673 | 46.6019 | 2400 | 0.0093 | 0.2681 |
| 0.0695 | 50.4854 | 2600 | 0.0305 | 0.2851 |
| 0.0589 | 54.3689 | 2800 | 0.0058 | 0.2638 |
| 0.0508 | 58.2524 | 3000 | 0.0172 | 0.2660 |
| 0.0503 | 62.1359 | 3200 | 0.0070 | 0.2660 |
| 0.0516 | 66.0194 | 3400 | 0.0017 | 0.2660 |
| 0.0435 | 69.9029 | 3600 | 0.0037 | 0.2617 |
| 0.0378 | 73.7864 | 3800 | 0.0020 | 0.2638 |
| 0.0325 | 77.6699 | 4000 | 0.0061 | 0.2702 |
| 0.031 | 81.5534 | 4200 | 0.0036 | 0.2617 |
| 0.0292 | 85.4369 | 4400 | 0.0075 | 0.2660 |
| 0.0281 | 89.3204 | 4600 | 0.0006 | 0.2532 |
| 0.0334 | 93.2039 | 4800 | 0.0007 | 0.2574 |
| 0.0307 | 97.0874 | 5000 | 0.0029 | 0.2596 |
| 0.0232 | 100.9709 | 5200 | 0.0025 | 0.2553 |
| 0.0228 | 104.8544 | 5400 | 0.0010 | 0.2596 |
| 0.0216 | 108.7379 | 5600 | 0.0014 | 0.2638 |
| 0.0207 | 112.6214 | 5800 | 0.0005 | 0.2596 |
| 0.0163 | 116.5049 | 6000 | 0.0021 | 0.2596 |
| 0.0232 | 120.3883 | 6200 | 0.0173 | 0.2638 |
| 0.0247 | 124.2718 | 6400 | 0.0102 | 0.2596 |
| 0.0198 | 128.1553 | 6600 | 0.0003 | 0.2553 |
| 0.0124 | 132.0388 | 6800 | 0.0001 | 0.2574 |
| 0.0117 | 135.9223 | 7000 | 0.0001 | 0.2574 |
| 0.0131 | 139.8058 | 7200 | 0.0002 | 0.2574 |
| 0.0119 | 143.6893 | 7400 | 0.0001 | 0.2574 |
| 0.0103 | 147.5728 | 7600 | 0.0001 | 0.2574 |
| 0.0083 | 151.4563 | 7800 | 0.0001 | 0.2574 |
| 0.0115 | 155.3398 | 8000 | 0.0001 | 0.2574 |
| 0.0112 | 159.2233 | 8200 | 0.0001 | 0.2574 |
| 0.0128 | 163.1068 | 8400 | 0.0001 | 0.2574 |
| 0.0062 | 166.9903 | 8600 | 0.0001 | 0.2574 |
| 0.0057 | 170.8738 | 8800 | 0.0000 | 0.2574 |
| 0.0072 | 174.7573 | 9000 | 0.0000 | 0.2574 |
| 0.0061 | 178.6408 | 9200 | 0.0000 | 0.2574 |
| 0.0047 | 182.5243 | 9400 | 0.0000 | 0.2574 |
| 0.0063 | 186.4078 | 9600 | 0.0000 | 0.2574 |
| 0.0043 | 190.2913 | 9800 | 0.0000 | 0.2574 |
| 0.006 | 194.1748 | 10000 | 0.0000 | 0.2574 |
| 0.0042 | 198.0583 | 10200 | 0.0000 | 0.2574 |
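
The Wer column above is word error rate (lower is better). A minimal sketch of computing it with the `evaluate` library; the transcript strings are illustrative, not from the actual evaluation set.

```python
# Word error rate: edit distance between word sequences, normalized by reference length.
import evaluate

wer_metric = evaluate.load("wer")
references = ["this is a reference transcript"]
predictions = ["this is a predicted transcript"]
print(wer_metric.compute(predictions=predictions, references=references))  # 0.2 (1 of 5 words wrong)
```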
### Framework versions
- Transformers 4.45.0.dev0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1