
Hanhpt23/whisper-small-germanmed-free_ED3-11

This model is a fine-tuned version of openai/whisper-small on the Hanhpt23/GermanMed-full dataset. It achieves the following results on the evaluation set (a short inference example follows the list):

  • Loss: 0.5327
  • WER: 22.8736
  • CER: 15.0096
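
Below is a minimal transcription sketch, assuming the checkpoint is published as Hanhpt23/whisper-small-germanmed-free_ED3-11 (the repository this card belongs to) and loaded through the Transformers automatic-speech-recognition pipeline; the audio path is a placeholder.

```python
# Minimal inference sketch for the fine-tuned checkpoint.
# "recording.wav" is a placeholder; supply a German speech file (ideally 16 kHz mono).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Hanhpt23/whisper-small-germanmed-free_ED3-11",
)

# Whisper models accept language/task hints via generate_kwargs.
result = asr("recording.wav", generate_kwargs={"language": "german", "task": "transcribe"})
print(result["text"])
```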

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 20
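
The original training script is not part of this card; the following is a hypothetical sketch of how the listed hyperparameters might map onto transformers.Seq2SeqTrainingArguments. The output_dir value is an assumption.

```python
# Hypothetical mapping of the listed hyperparameters onto Seq2SeqTrainingArguments.
# The actual training script is not included in this card; output_dir is an assumption.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-germanmed",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer.
)
```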

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     | CER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| 0.4351        | 1.0   | 194  | 0.4422          | 66.5535 | 48.8609 |
| 0.2391        | 2.0   | 388  | 0.4250          | 54.4071 | 37.4409 |
| 0.1426        | 3.0   | 582  | 0.4290          | 56.2172 | 35.4555 |
| 0.0877        | 4.0   | 776  | 0.4469          | 55.1682 | 35.9389 |
| 0.0456        | 5.0   | 970  | 0.4723          | 29.5691 | 19.8569 |
| 0.0374        | 6.0   | 1164 | 0.4664          | 34.9275 | 22.5023 |
| 0.015         | 7.0   | 1358 | 0.4901          | 24.7249 | 16.3800 |
| 0.0179        | 8.0   | 1552 | 0.4805          | 29.3325 | 19.8240 |
| 0.0112        | 9.0   | 1746 | 0.4858          | 25.3317 | 16.7420 |
| 0.0064        | 10.0  | 1940 | 0.4905          | 25.7534 | 16.3488 |
| 0.0037        | 11.0  | 2134 | 0.5068          | 26.1648 | 16.7871 |
| 0.0027        | 12.0  | 2328 | 0.5067          | 23.2747 | 15.2504 |
| 0.0013        | 13.0  | 2522 | 0.5167          | 25.7842 | 16.4181 |
| 0.0009        | 14.0  | 2716 | 0.5195          | 23.0690 | 15.1586 |
| 0.0008        | 15.0  | 2910 | 0.5206          | 23.1719 | 15.2106 |
| 0.0007        | 16.0  | 3104 | 0.5260          | 23.0382 | 14.9230 |
| 0.0007        | 17.0  | 3298 | 0.5288          | 23.1719 | 15.1118 |
| 0.0006        | 18.0  | 3492 | 0.5310          | 23.1821 | 15.1465 |
| 0.0006        | 19.0  | 3686 | 0.5323          | 22.8633 | 15.0079 |
| 0.0006        | 20.0  | 3880 | 0.5327          | 22.8736 | 15.0096 |
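
For reference, WER and CER figures like those above are commonly computed with the evaluate library (the exact evaluation code for this model is not included in the card). The sentences below are placeholders, and both metrics require the jiwer package.

```python
# Sketch of how WER/CER are typically computed with the `evaluate` library.
# Requires: pip install evaluate jiwer. The example strings are placeholders.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["der patient klagt über starke kopfschmerzen"]
references = ["die patientin klagt über starke kopfschmerzen"]

# compute() returns a fraction; the table above appears to report percentages.
print("WER:", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER:", 100 * cer_metric.compute(predictions=predictions, references=references))
```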

Framework versions

  • Transformers 4.41.1
  • PyTorch 2.3.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1
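
A quick environment check against the versions listed above; this is a sketch only, assuming the packages are importable under their usual module names.

```python
# Sanity-check that the local environment matches the versions listed above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected: 4.41.1
print(torch.__version__)         # expected: 2.3.0
print(datasets.__version__)      # expected: 2.19.1
print(tokenizers.__version__)    # expected: 0.19.1
```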