---
license: apache-2.0
base_model: openai/whisper-tiny.en
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-tiny-finetune
  results: []
---
# whisper-tiny-finetune
This model is a fine-tuned version of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5924
- Wer: 20.5257
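The reported Wer is the word error rate, in percent: the word-level edit distance between the reference transcript and the model's hypothesis, divided by the number of reference words. As a rough illustration of what the metric measures (not the exact implementation used during evaluation), a minimal version looks like this:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level Levenshtein distance / reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between the first i reference words and first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution (or match)
            )
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)
```

So a Wer of 20.5257 means roughly one word in five is substituted, inserted, or deleted relative to the reference transcripts.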
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 128
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 1000
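With `lr_scheduler_type: linear` and 500 warmup steps over 1000 total steps, the learning rate ramps linearly from 0 to 1e-05 during the first half of training and then decays linearly back to 0. A minimal sketch of that schedule (the function name is illustrative; it mirrors the linear schedule used by `transformers`):

```python
def linear_schedule_lr(step, base_lr=1e-5, warmup_steps=500, total_steps=1000):
    """Learning rate at a given optimizer step: linear warmup, then linear decay to 0."""
    if step < warmup_steps:
        # Warmup: scale from 0 up to base_lr over the first warmup_steps steps.
        return base_lr * step / warmup_steps
    # Decay: scale from base_lr down to 0 over the remaining steps.
    return base_lr * max(0.0, total_steps - step) / (total_steps - warmup_steps)
```

For example, the rate peaks at 1e-05 at step 500 and reaches 0 at step 1000, which is consistent with the loss flattening in the last rows of the results table below.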
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
3.9976 | 0.2778 | 10 | 3.9664 | 44.1176 |
3.9022 | 0.5556 | 20 | 3.8748 | 43.6170 |
3.7582 | 0.8333 | 30 | 3.7266 | 43.0538 |
3.5744 | 1.1111 | 40 | 3.5294 | 39.1427 |
3.2991 | 1.3889 | 50 | 3.2828 | 35.3567 |
3.0818 | 1.6667 | 60 | 2.9793 | 34.7935 |
2.6849 | 1.9444 | 70 | 2.5935 | 37.0150 |
2.2809 | 2.2222 | 80 | 2.0883 | 37.7034 |
1.6882 | 2.5 | 90 | 1.5027 | 38.6733 |
1.2127 | 2.7778 | 100 | 1.0275 | 39.0488 |
0.8696 | 3.0556 | 110 | 0.7946 | 29.4431 |
0.7336 | 3.3333 | 120 | 0.7152 | 28.1915 |
0.6793 | 3.6111 | 130 | 0.6666 | 27.2215 |
0.6489 | 3.8889 | 140 | 0.6346 | 26.0638 |
0.6353 | 4.1667 | 150 | 0.6115 | 24.8748 |
0.583 | 4.4444 | 160 | 0.5928 | 24.7497 |
0.5455 | 4.7222 | 170 | 0.5775 | 23.8110 |
0.487 | 5.0 | 180 | 0.5647 | 23.3417 |
0.4925 | 5.2778 | 190 | 0.5541 | 22.7159 |
0.4952 | 5.5556 | 200 | 0.5444 | 22.6533 |
0.4481 | 5.8333 | 210 | 0.5359 | 22.2153 |
0.4827 | 6.1111 | 220 | 0.5263 | 22.4030 |
0.3897 | 6.3889 | 230 | 0.5196 | 21.8085 |
0.3834 | 6.6667 | 240 | 0.5121 | 21.8711 |
0.3906 | 6.9444 | 250 | 0.5073 | 21.2140 |
0.3705 | 7.2222 | 260 | 0.5055 | 21.3705 |
0.3518 | 7.5 | 270 | 0.4980 | 21.2140 |
0.354 | 7.7778 | 280 | 0.4934 | 20.7447 |
0.3202 | 8.0556 | 290 | 0.4914 | 20.4318 |
0.2997 | 8.3333 | 300 | 0.4859 | 20.0563 |
0.2699 | 8.6111 | 310 | 0.4852 | 26.9399 |
0.2724 | 8.8889 | 320 | 0.4809 | 27.0338 |
0.2844 | 9.1667 | 330 | 0.4802 | 26.4393 |
0.2332 | 9.4444 | 340 | 0.4801 | 24.6558 |
0.2337 | 9.7222 | 350 | 0.4810 | 20.2753 |
0.2542 | 10.0 | 360 | 0.4731 | 20.5882 |
0.1986 | 10.2778 | 370 | 0.4779 | 20.1189 |
0.2023 | 10.5556 | 380 | 0.4767 | 24.6558 |
0.1864 | 10.8333 | 390 | 0.4763 | 20.3379 |
0.1873 | 11.1111 | 400 | 0.4765 | 20.6195 |
0.1595 | 11.3889 | 410 | 0.4831 | 20.4631 |
0.1581 | 11.6667 | 420 | 0.4872 | 20.2128 |
0.1663 | 11.9444 | 430 | 0.4851 | 20.0563 |
0.1282 | 12.2222 | 440 | 0.4864 | 19.9625 |
0.1138 | 12.5 | 450 | 0.4918 | 19.9937 |
0.1283 | 12.7778 | 460 | 0.4931 | 19.9312 |
0.0847 | 13.0556 | 470 | 0.4891 | 20.4944 |
0.0902 | 13.3333 | 480 | 0.5027 | 19.8999 |
0.0719 | 13.6111 | 490 | 0.5056 | 20.6821 |
0.1011 | 13.8889 | 500 | 0.5023 | 19.9937 |
0.0676 | 14.1667 | 510 | 0.5113 | 20.4005 |
0.0632 | 14.4444 | 520 | 0.5154 | 24.7184 |
0.0643 | 14.7222 | 530 | 0.5207 | 20.1502 |
0.053 | 15.0 | 540 | 0.5184 | 20.2753 |
0.0389 | 15.2778 | 550 | 0.5295 | 20.4631 |
0.0467 | 15.5556 | 560 | 0.5286 | 20.3066 |
0.0414 | 15.8333 | 570 | 0.5403 | 20.2753 |
0.0334 | 16.1111 | 580 | 0.5334 | 20.0876 |
0.0283 | 16.3889 | 590 | 0.5514 | 20.2441 |
0.0282 | 16.6667 | 600 | 0.5415 | 20.1815 |
0.0267 | 16.9444 | 610 | 0.5451 | 20.7447 |
0.019 | 17.2222 | 620 | 0.5483 | 20.3379 |
0.0202 | 17.5 | 630 | 0.5551 | 19.9625 |
0.0179 | 17.7778 | 640 | 0.5574 | 20.3066 |
0.0186 | 18.0556 | 650 | 0.5621 | 20.6821 |
0.0123 | 18.3333 | 660 | 0.5634 | 20.6195 |
0.0138 | 18.6111 | 670 | 0.5648 | 20.2753 |
0.0133 | 18.8889 | 680 | 0.5655 | 20.4318 |
0.0114 | 19.1667 | 690 | 0.5666 | 20.5569 |
0.0112 | 19.4444 | 700 | 0.5721 | 20.3379 |
0.0108 | 19.7222 | 710 | 0.5714 | 20.8385 |
0.0106 | 20.0 | 720 | 0.5744 | 20.4944 |
0.0092 | 20.2778 | 730 | 0.5751 | 20.4318 |
0.0096 | 20.5556 | 740 | 0.5756 | 20.3692 |
0.009 | 20.8333 | 750 | 0.5779 | 20.1502 |
0.0084 | 21.1111 | 760 | 0.5790 | 20.4944 |
0.0077 | 21.3889 | 770 | 0.5820 | 20.4005 |
0.0083 | 21.6667 | 780 | 0.5822 | 20.4005 |
0.008 | 21.9444 | 790 | 0.5820 | 20.4005 |
0.0077 | 22.2222 | 800 | 0.5829 | 20.4318 |
0.0083 | 22.5 | 810 | 0.5843 | 20.4005 |
0.0073 | 22.7778 | 820 | 0.5856 | 20.4005 |
0.0069 | 23.0556 | 830 | 0.5869 | 20.4005 |
0.0067 | 23.3333 | 840 | 0.5886 | 20.5257 |
0.007 | 23.6111 | 850 | 0.5882 | 20.4944 |
0.0074 | 23.8889 | 860 | 0.5872 | 20.4631 |
0.0073 | 24.1667 | 870 | 0.5885 | 20.4631 |
0.0066 | 24.4444 | 880 | 0.5896 | 20.6195 |
0.0061 | 24.7222 | 890 | 0.5898 | 20.6195 |
0.0073 | 25.0 | 900 | 0.5902 | 20.5882 |
0.0067 | 25.2778 | 910 | 0.5901 | 20.6508 |
0.006 | 25.5556 | 920 | 0.5905 | 20.5257 |
0.0061 | 25.8333 | 930 | 0.5911 | 20.7447 |
0.0064 | 26.1111 | 940 | 0.5916 | 20.6821 |
0.0066 | 26.3889 | 950 | 0.5919 | 20.6195 |
0.0071 | 26.6667 | 960 | 0.5924 | 20.5569 |
0.006 | 26.9444 | 970 | 0.5923 | 20.5569 |
0.0068 | 27.2222 | 980 | 0.5923 | 20.5257 |
0.0061 | 27.5 | 990 | 0.5924 | 20.5257 |
0.0058 | 27.7778 | 1000 | 0.5924 | 20.5257 |
### Framework versions
- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.1.dev0
- Tokenizers 0.19.1