---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
  - generated_from_trainer
datasets:
  - PolyAI/minds14
metrics:
  - wer
model-index:
  - name: whisper-tiny-en
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: PolyAI/minds14
          type: PolyAI/minds14
          config: en-US
          split: train
          args: en-US
        metrics:
          - name: Wer
            type: wer
            value: 32.99881936245573
---

# whisper-tiny-en

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the [PolyAI/minds14](https://huggingface.co/datasets/PolyAI/minds14) dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 0.8597
- Wer Ortho: 32.7576
- Wer: 32.9988
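
The checkpoint can be loaded through the `transformers` pipeline API. This is a minimal sketch; the repo id `arielcerdap/whisper-tiny-en` is an assumption based on this card's title and owner, so substitute your own checkpoint path if it differs.

```python
# Minimal inference sketch via the transformers ASR pipeline.
# NOTE: the repo id below is an assumption inferred from this card's
# title and owner, not something the card itself confirms.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="arielcerdap/whisper-tiny-en",  # assumed repo id
)

# The pipeline accepts a path to an audio file (decoded and resampled
# internally) or a dict with "array" and "sampling_rate" keys.
result = asr("sample.wav")
print(result["text"])
```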

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

The model was fine-tuned and evaluated on the en-US configuration of the PolyAI/minds14 dataset (see the card metadata). The dataset ships a single `train` split; the exact train/evaluation partition used for this run is not documented.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 50
- training_steps: 4000
- mixed_precision_training: Native AMP
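
For reference, the list above maps onto `Seq2SeqTrainingArguments` roughly as follows. This is a sketch reconstructed from the reported values, not the original training script; the output directory and the 500-step evaluation cadence (inferred from the results table below) are assumptions. The Adam betas and epsilon listed above are the `Trainer` defaults, so no explicit optimizer argument is needed.

```python
# Sketch of training arguments matching the reported hyperparameters.
# The original training script is not part of this card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-en",  # illustrative output path (assumption)
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="constant_with_warmup",
    warmup_steps=50,
    max_steps=4000,
    fp16=True,                      # "Native AMP" mixed precision
    eval_strategy="steps",          # every 500 steps, per the results table (assumption)
    eval_steps=500,
    predict_with_generate=True,     # required to compute WER during eval (assumption)
)
```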

### Training results

| Training Loss | Epoch    | Step | Validation Loss | Wer Ortho | Wer     |
|:-------------:|:--------:|:----:|:---------------:|:---------:|:-------:|
| 0.0007        | 17.2414  | 500  | 0.6479          | 32.3874   | 32.3495 |
| 0.0002        | 34.4828  | 1000 | 0.7071          | 32.8809   | 32.9988 |
| 0.0001        | 51.7241  | 1500 | 0.7428          | 32.7576   | 32.9988 |
| 0.0001        | 68.9655  | 2000 | 0.7709          | 32.6959   | 32.9398 |
| 0.0           | 86.2069  | 2500 | 0.7948          | 32.7576   | 33.0579 |
| 0.0           | 103.4483 | 3000 | 0.8179          | 33.0043   | 33.2349 |
| 0.0           | 120.6897 | 3500 | 0.8392          | 32.9426   | 33.1759 |
| 0.0           | 137.9310 | 4000 | 0.8597          | 32.7576   | 32.9988 |
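
The final WER of ~33.0 matches the value in the card metadata. Below is a sketch of how such a figure could be recomputed with the `evaluate` library; the repo id and the held-out slice of the single `train` split are assumptions, since the actual evaluation partition is undocumented, and the distinction between `Wer Ortho` and `Wer` suggests one is computed on normalized text, which this sketch does not replicate.

```python
# Sketch of recomputing WER on MINDS-14 (en-US) with the evaluate library.
# ASSUMPTIONS: the repo id, and holding out the tail of the single
# `train` split for evaluation (the actual partition is undocumented).
import evaluate
from datasets import Audio, load_dataset
from transformers import pipeline

wer_metric = evaluate.load("wer")

ds = load_dataset("PolyAI/minds14", "en-US", split="train[450:]")  # assumed eval slice
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))          # Whisper expects 16 kHz

asr = pipeline(
    "automatic-speech-recognition",
    model="arielcerdap/whisper-tiny-en",  # assumed repo id
)

predictions = [asr(sample["audio"])["text"] for sample in ds]
references = ds["transcription"]

# evaluate returns a fraction; the card reports a percentage.
print(100 * wer_metric.compute(predictions=predictions, references=references))
```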

### Framework versions

- Transformers 4.42.4
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1