
Hanhpt23/whisper-small-engmed-free_ED3-11

This model is a fine-tuned version of openai/whisper-small on the pphuc25/EngMed dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0004
  • WER: 21.9876
  • CER: 17.0884
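
As a minimal, hedged usage sketch (it assumes the checkpoint is published as Hanhpt23/whisper-small-engmed-free_ED3-11 and that the standard Transformers ASR pipeline applies; the audio file name is a placeholder):

```python
# Minimal inference sketch, not an official usage example from the authors.
# Assumes the checkpoint id below and a local audio file readable by ffmpeg.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Hanhpt23/whisper-small-engmed-free_ED3-11",
)

# Transcribe a single recording (the file name is a placeholder).
result = asr("sample_medical_dictation.wav")
print(result["text"])
```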

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 20
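
The training script itself is not included in this card. As a hedged sketch, the values above would map onto Transformers `Seq2SeqTrainingArguments` roughly as follows (the output directory and the use of a seq2seq trainer are assumptions, not details from the card):

```python
# Hedged sketch: how the listed hyperparameters would typically be expressed
# as Transformers Seq2SeqTrainingArguments. The actual training script is not
# part of this card, so names such as output_dir are placeholders.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-engmed",   # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                      # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    evaluation_strategy="epoch",         # metrics in the table below are per epoch
    predict_with_generate=True,          # needed to compute WER/CER on generated text
)
```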

Training results

| Training Loss | Epoch | Step | Validation Loss | WER (%) | CER (%) |
|---------------|-------|------|-----------------|---------|---------|
| 0.7291 | 1.0  | 386  | 0.3415 | 25.0142 | 20.1560 |
| 0.472  | 2.0  | 772  | 0.2196 | 27.4412 | 24.5293 |
| 0.2716 | 3.0  | 1158 | 0.1239 | 31.2274 | 28.4146 |
| 0.1409 | 4.0  | 1544 | 0.0740 | 38.1165 | 34.4281 |
| 0.1283 | 5.0  | 1930 | 0.0465 | 40.1171 | 35.9613 |
| 0.0631 | 6.0  | 2316 | 0.0310 | 36.5868 | 30.1992 |
| 0.0582 | 7.0  | 2702 | 0.0177 | 32.1775 | 26.2129 |
| 0.0531 | 8.0  | 3088 | 0.0131 | 31.8391 | 27.8444 |
| 0.0238 | 9.0  | 3474 | 0.0091 | 24.2508 | 18.5448 |
| 0.0149 | 10.0 | 3860 | 0.0060 | 24.8696 | 19.3220 |
| 0.0057 | 11.0 | 4246 | 0.0050 | 26.8193 | 21.7715 |
| 0.0077 | 12.0 | 4632 | 0.0031 | 23.0677 | 19.1910 |
| 0.0073 | 13.0 | 5018 | 0.0028 | 24.7584 | 19.3135 |
| 0.0052 | 14.0 | 5404 | 0.0014 | 25.8657 | 19.1331 |
| 0.0031 | 15.0 | 5790 | 0.0009 | 21.5274 | 17.0940 |
| 0.0039 | 16.0 | 6176 | 0.0007 | 22.1520 | 17.1637 |
| 0.0013 | 17.0 | 6562 | 0.0006 | 22.9021 | 17.7620 |
| 0.0009 | 18.0 | 6948 | 0.0005 | 21.9899 | 17.3717 |
| 0.001  | 19.0 | 7334 | 0.0004 | 22.1923 | 17.4027 |
| 0.0004 | 20.0 | 7720 | 0.0004 | 21.9876 | 17.0884 |
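
The WER and CER values above appear to be reported as percentages. The evaluation code is not part of this card; below is a hedged sketch of how such metrics are commonly computed with the Hugging Face `evaluate` library (the example strings are illustrative only):

```python
# Hedged sketch of computing WER/CER with the `evaluate` library; the exact
# evaluation code used for this model is not included in the card.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["the patient denies chest pain"]                # model transcriptions (example)
references  = ["the patient denies chest pain and dyspnea"]    # ground-truth transcripts (example)

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```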

Framework versions

  • Transformers 4.41.1
  • PyTorch 2.3.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1
