
Hanhpt23/whisper-small-germanmed-free_E3-11

This model is a fine-tuned version of openai/whisper-small on the Hanhpt23/GermanMed-full dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6983
  • Wer: 23.1307
  • Cer: 14.7116
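
The snippet below is a minimal inference sketch, assuming the model is loaded from the Hanhpt23/whisper-small-germanmed-free_E3-11 repository through the standard transformers automatic-speech-recognition pipeline; the audio file path is a placeholder for a German speech recording.

```python
# Minimal inference sketch (assumption: transformers ASR pipeline;
# "audio.wav" is a placeholder path to a 16 kHz German speech recording).
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"

asr = pipeline(
    "automatic-speech-recognition",
    model="Hanhpt23/whisper-small-germanmed-free_E3-11",
    device=device,
)

result = asr("audio.wav")
print(result["text"])
```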

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 20
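
As a rough reproduction aid, the hyperparameters above map onto Seq2SeqTrainingArguments as sketched below. This is an assumption about how training was configured (the actual training script is not part of this card); the output directory name and the fp16 flag are illustrative, and per-epoch evaluation is inferred from the step counts in the results table.

```python
# Sketch only: maps the listed hyperparameters onto Seq2SeqTrainingArguments.
# Not the author's script; "whisper-small-germanmed" and fp16 are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-germanmed",   # illustrative name
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    evaluation_strategy="epoch",            # inferred from the results table
    fp16=True,                              # assumption: not stated in the card
)
```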

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     | Cer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| 0.543         | 1.0   | 194  | 0.5713          | 37.3547 | 23.8778 |
| 0.2814        | 2.0   | 388  | 0.5356          | 27.0287 | 17.3085 |
| 0.1367        | 3.0   | 582  | 0.5955          | 35.5446 | 23.1433 |
| 0.0903        | 4.0   | 776  | 0.6163          | 31.9037 | 20.3333 |
| 0.0768        | 5.0   | 970  | 0.6262          | 27.4607 | 16.8373 |
| 0.0585        | 6.0   | 1164 | 0.6513          | 29.8262 | 18.5455 |
| 0.0507        | 7.0   | 1358 | 0.6889          | 28.8388 | 18.3237 |
| 0.0388        | 8.0   | 1552 | 0.6816          | 28.1497 | 17.8282 |
| 0.0277        | 9.0   | 1746 | 0.6981          | 23.3673 | 14.2699 |
| 0.0167        | 10.0  | 1940 | 0.6921          | 24.5192 | 15.6108 |
| 0.0136        | 11.0  | 2134 | 0.7047          | 24.4678 | 15.5224 |
| 0.0092        | 12.0  | 2328 | 0.7167          | 24.5089 | 15.4271 |
| 0.004         | 13.0  | 2522 | 0.7071          | 24.9511 | 15.7251 |
| 0.0009        | 14.0  | 2716 | 0.7150          | 23.0999 | 14.4743 |
| 0.0009        | 15.0  | 2910 | 0.6984          | 23.5318 | 14.8035 |
| 0.0002        | 16.0  | 3104 | 0.7018          | 23.6758 | 15.0980 |
| 0.0004        | 17.0  | 3298 | 0.6949          | 22.8222 | 14.2993 |
| 0.0001        | 18.0  | 3492 | 0.6972          | 23.2130 | 14.7255 |
| 0.0001        | 19.0  | 3686 | 0.6980          | 23.2027 | 14.7601 |
| 0.0003        | 20.0  | 3880 | 0.6983          | 23.1307 | 14.7116 |
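
For reference, WER and CER scores of this kind are typically computed with the evaluate library. The snippet below is a small illustration with made-up reference and prediction strings, not the actual evaluation code used for this card.

```python
# Illustration of how WER/CER metrics like those above are computed with the
# `evaluate` library; the reference/prediction strings are made-up examples.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["der patient klagt über starke kopfschmerzen"]
predictions = ["der patient klagt über starke kopf schmerzen"]

wer = 100 * wer_metric.compute(references=references, predictions=predictions)
cer = 100 * cer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.2f}  CER: {cer:.2f}")
```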

Framework versions

  • Transformers 4.41.1
  • Pytorch 2.3.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1
