|
--- |
|
library_name: transformers |
|
license: apache-2.0 |
|
base_model: facebook/wav2vec2-large-xlsr-53 |
|
tags: |
|
- generated_from_trainer |
|
metrics: |
|
- wer |
|
model-index: |
|
- name: xlsr-big-kznnn |
|
results: [] |
|
--- |
|
|
|
|
|
|
# xlsr-big-kznnn |
|
|
|
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset.
|
It achieves the following results on the evaluation set (a short sketch of how the WER metric is computed follows the list):
|
- Loss: 0.0000 |
|
- Wer: 0.0559 |
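
WER (word error rate) is the number of word-level substitutions, insertions, and deletions divided by the number of words in the reference transcripts. A minimal sketch of computing it with the `evaluate` library (the transcripts below are placeholders, not data from this run):

```python
import evaluate

# Load the word-error-rate metric (requires: pip install evaluate jiwer).
wer_metric = evaluate.load("wer")

# Placeholder transcripts; real evaluation compares model output against
# the reference transcriptions of the evaluation set.
predictions = ["the quick brown fox", "hello world"]
references = ["the quick brown fox jumps", "hello world"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 1 missed word out of 7 reference words -> ~0.1429
```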
|
|
|
## Model description |
|
|
|
More information needed |
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
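
Since this is a CTC fine-tune of wav2vec2-large-xlsr-53, it can be used for automatic speech recognition with the standard `transformers` model classes. The snippet below is a minimal inference sketch, not a tested recipe; the repository id, the audio file name, and the 16 kHz mono input are assumptions:

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Repository id is an assumption; replace with the actual model path.
model_id = "xlsr-big-kznnn"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2 XLSR expects 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```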
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
|
- learning_rate: 0.0004 |
|
- train_batch_size: 8 |
|
- eval_batch_size: 8 |
|
- seed: 42 |
|
- gradient_accumulation_steps: 2 |
|
- total_train_batch_size: 16 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- lr_scheduler_warmup_steps: 132 |
|
- num_epochs: 100 |
|
- mixed_precision_training: Native AMP |
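
A sketch of how these settings could be expressed as `TrainingArguments`; only the listed values are taken from the run, while the output directory and the evaluation/save cadence are assumptions:

```python
from transformers import TrainingArguments

# Values below mirror the hyperparameter list above; output_dir and the
# eval/save cadence are placeholder assumptions.
training_args = TrainingArguments(
    output_dir="xlsr-big-kznnn",
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=100,
    fp16=True,                       # Native AMP mixed precision
    eval_strategy="steps",
    eval_steps=200,                  # matches the 200-step cadence in the results table
    save_steps=200,
)
```

The Adam betas (0.9, 0.999) and epsilon 1e-08 listed above correspond to the optimizer defaults, so they are not set explicitly in this sketch.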
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Wer | |
|
|:-------------:|:-------:|:-----:|:---------------:|:------:| |
|
| 2.1554 | 1.7167 | 200 | 0.9501 | 0.7058 | |
|
| 0.6034 | 3.4335 | 400 | 0.0748 | 0.1296 | |
|
| 0.1588 | 5.1502 | 600 | 0.0226 | 0.0679 | |
|
| 0.0906 | 6.8670 | 800 | 0.0070 | 0.0569 | |
|
| 0.0597 | 8.5837 | 1000 | 0.0181 | 0.0977 | |
|
| 0.0523 | 10.3004 | 1200 | 0.0038 | 0.0547 | |
|
| 0.0441 | 12.0172 | 1400 | 0.0028 | 0.0533 | |
|
| 0.0362 | 13.7339 | 1600 | 0.0030 | 0.0593 | |
|
| 0.0272 | 15.4506 | 1800 | 0.0055 | 0.0611 | |
|
| 0.0286 | 17.1674 | 2000 | 0.0021 | 0.0565 | |
|
| 0.0224 | 18.8841 | 2200 | 0.0036 | 0.0748 | |
|
| 0.0227 | 20.6009 | 2400 | 0.0023 | 0.0543 | |
|
| 0.0175 | 22.3176 | 2600 | 0.0075 | 0.0569 | |
|
| 0.0172 | 24.0343 | 2800 | 0.0031 | 0.0547 | |
|
| 0.0226 | 25.7511 | 3000 | 0.0027 | 0.0593 | |
|
| 0.0132 | 27.4678 | 3200 | 0.0012 | 0.0535 | |
|
| 0.0179 | 29.1845 | 3400 | 0.0022 | 0.0565 | |
|
| 0.0149 | 30.9013 | 3600 | 0.0014 | 0.0531 | |
|
| 0.0141 | 32.6180 | 3800 | 0.0010 | 0.0533 | |
|
| 0.0146 | 34.3348 | 4000 | 0.0020 | 0.0565 | |
|
| 0.0143 | 36.0515 | 4200 | 0.0002 | 0.0605 | |
|
| 0.0117 | 37.7682 | 4400 | 0.0047 | 0.0567 | |
|
| 0.0138 | 39.4850 | 4600 | 0.0011 | 0.0561 | |
|
| 0.0095 | 41.2017 | 4800 | 0.0002 | 0.0738 | |
|
| 0.0096 | 42.9185 | 5000 | 0.0001 | 0.0697 | |
|
| 0.0083 | 44.6352 | 5200 | 0.0019 | 0.0601 | |
|
| 0.0101 | 46.3519 | 5400 | 0.0010 | 0.0695 | |
|
| 0.0078 | 48.0687 | 5600 | 0.0002 | 0.0571 | |
|
| 0.0105 | 49.7854 | 5800 | 0.0002 | 0.0537 | |
|
| 0.0065 | 51.5021 | 6000 | 0.0038 | 0.0681 | |
|
| 0.0083 | 53.2189 | 6200 | 0.0000 | 0.0645 | |
|
| 0.0065 | 54.9356 | 6400 | 0.0002 | 0.0543 | |
|
| 0.0076 | 56.6524 | 6600 | 0.0005 | 0.0609 | |
|
| 0.0055 | 58.3691 | 6800 | 0.0015 | 0.0557 | |
|
| 0.0054 | 60.0858 | 7000 | 0.0001 | 0.0529 | |
|
| 0.0078 | 61.8026 | 7200 | 0.0002 | 0.0525 | |
|
| 0.0059 | 63.5193 | 7400 | 0.0002 | 0.0531 | |
|
| 0.0051 | 65.2361 | 7600 | 0.0000 | 0.0535 | |
|
| 0.0047 | 66.9528 | 7800 | 0.0000 | 0.0537 | |
|
| 0.0045 | 68.6695 | 8000 | 0.0000 | 0.0549 | |
|
| 0.0044 | 70.3863 | 8200 | 0.0014 | 0.0599 | |
|
| 0.0051 | 72.1030 | 8400 | 0.0000 | 0.0551 | |
|
| 0.0030        | 73.8197 | 8600  | 0.0003          | 0.0547 |
|
| 0.0028 | 75.5365 | 8800 | 0.0000 | 0.0518 | |
|
| 0.0027 | 77.2532 | 9000 | 0.0000 | 0.0520 | |
|
| 0.0024 | 78.9700 | 9200 | 0.0000 | 0.0569 | |
|
| 0.0022 | 80.6867 | 9400 | 0.0000 | 0.0565 | |
|
| 0.0027 | 82.4034 | 9600 | 0.0000 | 0.0516 | |
|
| 0.0018 | 84.1202 | 9800 | 0.0000 | 0.0520 | |
|
| 0.0028 | 85.8369 | 10000 | 0.0000 | 0.0569 | |
|
| 0.0020        | 87.5536 | 10200 | 0.0000          | 0.0545 |
|
| 0.0014 | 89.2704 | 10400 | 0.0000 | 0.0535 | |
|
| 0.0012 | 90.9871 | 10600 | 0.0000 | 0.0525 | |
|
| 0.0011 | 92.7039 | 10800 | 0.0000 | 0.0531 | |
|
| 0.0011 | 94.4206 | 11000 | 0.0000 | 0.0539 | |
|
| 0.0009 | 96.1373 | 11200 | 0.0000 | 0.0555 | |
|
| 0.0016 | 97.8541 | 11400 | 0.0000 | 0.0563 | |
|
| 0.0011 | 99.5708 | 11600 | 0.0000 | 0.0559 | |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.45.0.dev0 |
|
- Pytorch 2.1.2 |
|
- Datasets 2.20.0 |
|
- Tokenizers 0.19.1 |
|
|