---
license: apache-2.0
base_model: openai/whisper-tiny.en
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: whisper-tiny-finetune
    results: []
---

# whisper-tiny-finetune

This model is a fine-tuned version of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.4696
- Wer: 19.7170
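
A minimal inference sketch using the Transformers `pipeline` API. The repository id below is an assumption (this card does not state where the weights are hosted); substitute the actual model path:

```python
# Minimal transcription sketch; the repository id is an assumption, not confirmed by this card.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="MahtaFetrat/whisper-tiny-finetune",  # hypothetical repo id
)

# Whisper expects 16 kHz audio; the pipeline decodes and resamples common formats via ffmpeg.
result = asr("sample.wav")
print(result["text"])
```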

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

- learning_rate: 1e-05
- train_batch_size: 128
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 90
- training_steps: 180
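
These values map directly onto `Seq2SeqTrainingArguments`; below is a hedged reconstruction. The output directory, eval cadence, and generation flag are assumptions (the 10-step eval interval is inferred from the results table below):

```python
# Hedged reconstruction of the training configuration from the hyperparameters above.
# output_dir, eval_strategy/eval_steps, and predict_with_generate are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-finetune",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=8,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
    lr_scheduler_type="linear",
    warmup_steps=90,
    max_steps=180,
    eval_strategy="steps",        # assumed; matches eval every 10 steps in the table
    eval_steps=10,
    predict_with_generate=True,   # assumed; needed to decode text for WER
)
```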

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 3.9026        | 0.2778 | 10   | 3.8176          | 40.1572 |
| 3.5226        | 0.5556 | 20   | 3.3623          | 34.1509 |
| 2.9307        | 0.8333 | 30   | 2.6468          | 32.2642 |
| 2.0034        | 1.1111 | 40   | 1.6171          | 33.7736 |
| 1.0812        | 1.3889 | 50   | 0.8580          | 29.9371 |
| 0.7061        | 1.6667 | 60   | 0.6809          | 27.0755 |
| 0.5945        | 1.9444 | 70   | 0.6080          | 25.1258 |
| 0.532         | 2.2222 | 80   | 0.5639          | 23.3962 |
| 0.5035        | 2.5    | 90   | 0.5357          | 22.2642 |
| 0.4955        | 2.7778 | 100  | 0.5125          | 21.0063 |
| 0.4208        | 3.0556 | 110  | 0.4960          | 20.6604 |
| 0.353         | 3.3333 | 120  | 0.4878          | 20.0    |
| 0.3453        | 3.6111 | 130  | 0.4817          | 19.8113 |
| 0.3945        | 3.8889 | 140  | 0.4768          | 19.8113 |
| 0.3273        | 4.1667 | 150  | 0.4727          | 19.8742 |
| 0.2941        | 4.4444 | 160  | 0.4713          | 19.7799 |
| 0.3131        | 4.7222 | 170  | 0.4708          | 19.2767 |
| 0.2948        | 5.0    | 180  | 0.4696          | 19.7170 |
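
The Wer column is a percentage. As a point of reference, here is a sketch of how such scores are typically computed with the `evaluate` library (the example strings are illustrative, not from this model's eval set):

```python
# Illustrative WER computation; evaluate's "wer" metric returns a fraction,
# so it is scaled by 100 to match the table above.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["the quick brown fox"]       # hypothetical model output
references = ["the quick brown fox jumps"]  # hypothetical ground truth
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 20.0000 -> one deleted word out of five
```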

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.3.dev0
- Tokenizers 0.19.1