---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-ai-nose
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# whisper-ai-nose

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Wer: 14.3032
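
Being a fine-tuned Whisper checkpoint, the model can be loaded with the standard `transformers` automatic-speech-recognition pipeline. The snippet below is an illustrative sketch only: the repo id `your-username/whisper-ai-nose` is a placeholder and assumes the checkpoint has been pushed to the Hugging Face Hub (a local directory path works as well).

```python
from transformers import pipeline

# Placeholder repo id -- point this at wherever the checkpoint is actually hosted,
# or at a local directory containing the saved model.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-ai-nose",
)

# Whisper operates on 16 kHz audio; the pipeline decodes and resamples common
# audio formats (ffmpeg is required for compressed files).
print(asr("sample.wav")["text"])
```

For audio longer than 30 seconds, the pipeline's `chunk_length_s` argument enables chunked long-form transcription.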
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 30
- mixed_precision_training: Native AMP
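
The original training script is not included in this card. As a rough guide, the sketch below shows how the values listed above would typically be expressed with `transformers.Seq2SeqTrainingArguments`; the `output_dir` and anything not listed above (evaluation/save cadence, generation settings, data collator) are placeholders rather than details of the actual run.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: mirrors the hyperparameters listed above; values not listed
# there (e.g. output_dir) are placeholders.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-ai-nose",   # placeholder
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,    # effective train batch size: 8 x 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=30,
    fp16=True,                        # mixed precision (native AMP)
)
```

The Adam settings (betas 0.9/0.999, epsilon 1e-08) match the library defaults, so they are not repeated in the sketch.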
### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer      |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.8578        | 0.8163  | 100  | 0.0827          | 9.4132   |
| 0.2317        | 1.6327  | 200  | 0.0545          | 18.9487  |
| 0.079         | 2.4490  | 300  | 0.0488          | 204.5232 |
| 0.0586        | 3.2653  | 400  | 0.0493          | 227.9951 |
| 0.0396        | 4.0816  | 500  | 0.0277          | 23.9609  |
| 0.1368        | 4.8980  | 600  | 0.2826          | 94.6210  |
| 0.1721        | 5.7143  | 700  | 0.0024          | 145.1100 |
| 0.0246        | 6.5306  | 800  | 0.0024          | 22.9829  |
| 0.0143        | 7.3469  | 900  | 0.0008          | 25.1834  |
| 0.0185        | 8.1633  | 1000 | 0.0026          | 69.4377  |
| 0.0171        | 8.9796  | 1100 | 0.0069          | 22.4939  |
| 0.0229        | 9.7959  | 1200 | 0.0004          | 43.6430  |
| 0.0033        | 10.6122 | 1300 | 0.0018          | 16.8704  |
| 0.0073        | 11.4286 | 1400 | 0.0076          | 14.6699  |
| 0.0053        | 12.2449 | 1500 | 0.0030          | 14.4254  |
| 0.0038        | 13.0612 | 1600 | 0.0027          | 13.0807  |
| 0.0004        | 13.8776 | 1700 | 0.0000          | 13.4474  |
| 0.0008        | 14.6939 | 1800 | 0.0001          | 11.4914  |
| 0.0014        | 15.5102 | 1900 | 0.0002          | 11.9804  |
| 0.0047        | 16.3265 | 2000 | 0.0002          | 14.9144  |
| 0.0031        | 17.1429 | 2100 | 0.0001          | 15.1589  |
| 0.0018        | 17.9592 | 2200 | 0.0002          | 15.7702  |
| 0.0019        | 18.7755 | 2300 | 0.0000          | 15.1589  |
| 0.0006        | 19.5918 | 2400 | 0.0000          | 15.0367  |
| 0.0001        | 20.4082 | 2500 | 0.0000          | 14.0587  |
| 0.0001        | 21.2245 | 2600 | 0.0000          | 14.4254  |
| 0.0           | 22.0408 | 2700 | 0.0000          | 14.1809  |
| 0.0           | 22.8571 | 2800 | 0.0000          | 14.5477  |
| 0.0           | 23.6735 | 2900 | 0.0000          | 14.4254  |
| 0.0           | 24.4898 | 3000 | 0.0000          | 14.4254  |
| 0.0           | 25.3061 | 3100 | 0.0000          | 14.4254  |
| 0.0           | 26.1224 | 3200 | 0.0000          | 14.3032  |
| 0.0           | 26.9388 | 3300 | 0.0000          | 14.3032  |
| 0.0           | 27.7551 | 3400 | 0.0000          | 14.4254  |
| 0.0           | 28.5714 | 3500 | 0.0000          | 14.3032  |
| 0.0           | 29.3878 | 3600 | 0.0000          | 14.3032  |
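
Wer above is the word error rate expressed as a percentage; WER can exceed 100 when the hypothesis contains many insertions, which is why some intermediate checkpoints report values above 100. The evaluation code for this run is not included in the card; the sketch below shows one common way to compute such a score, assuming the `evaluate` library with hypothetical example strings.

```python
import evaluate

# Hypothetical example pair -- the actual evaluation data for this run is not published.
predictions = ["hello world this is a test"]
references = ["hello world this is the test"]

wer = evaluate.load("wer").compute(predictions=predictions, references=references)
print(f"WER: {100 * wer:.2f}%")  # multiply by 100 to match the percentages reported above
```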
### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1