---
language:
  - bn
license: apache-2.0
base_model: openai/whisper-base
tags:
  - generated_from_trainer
datasets:
  - mozilla-foundation/common_voice_11_0
metrics:
  - wer
model-index:
  - name: Whisper Base Bn - Raiyan Ahmed
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: Common Voice 11.0
          type: mozilla-foundation/common_voice_11_0
          config: bn
          split: None
          args: 'config: bn, split: test'
        metrics:
          - name: Wer
            type: wer
            value: 33.54106242324475
---

# Whisper Base Bn - Raiyan Ahmed

This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on the Common Voice 11.0 dataset. It achieves the following results on the evaluation set:

- Loss: 0.2026
- Wer: 33.5411
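
For reference, a minimal transcription sketch using the 🤗 Transformers `pipeline` API is shown below. The repository id `raiyan007/whisper-base-bn-f` is inferred from this repo's name, and the audio filename is a placeholder.

```python
# Minimal sketch, assuming the model is hosted at raiyan007/whisper-base-bn-f
# and that "sample_bn.wav" is a local 16 kHz Bengali audio clip (placeholder name).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="raiyan007/whisper-base-bn-f",  # assumed repo id
)

# Whisper models accept language/task hints through generate kwargs.
result = asr(
    "sample_bn.wav",
    generate_kwargs={"language": "bengali", "task": "transcribe"},
)
print(result["text"])
```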

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
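
No details are documented here, but the metadata above points to the Bengali (`bn`) config of Common Voice 11.0. A loading sketch with 🤗 Datasets might look like the following; the split and preprocessing shown are illustrative assumptions, not a record of the actual setup.

```python
# Sketch only: loads the Bengali config of Common Voice 11.0 and resamples to
# 16 kHz, the sampling rate Whisper expects. Accepting the dataset terms on the
# Hub (and authenticating) may be required.
from datasets import load_dataset, Audio

common_voice = load_dataset(
    "mozilla-foundation/common_voice_11_0", "bn", split="test"
)
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))
print(common_voice[0]["sentence"])
```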

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 3.75e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 16000
- mixed_precision_training: Native AMP
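
These settings map roughly onto a `Seq2SeqTrainingArguments` configuration like the sketch below. Only the values listed above come from this card; the output directory and any unlisted options are assumptions.

```python
# Sketch: Seq2SeqTrainingArguments mirroring the hyperparameters listed above.
# output_dir and other unlisted options are assumptions, not the actual config.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-base-bn-f",   # assumed
    learning_rate=3.75e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,      # effective train batch size of 32
    warmup_steps=500,
    max_steps=16000,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                          # "Native AMP" mixed precision
)
```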

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer     |
|:-------------:|:-------:|:-----:|:---------------:|:-------:|
| 0.2369        | 0.6365  | 1000  | 0.2433          | 62.1881 |
| 0.1242        | 1.2731  | 2000  | 0.1734          | 49.4369 |
| 0.1022        | 1.9096  | 3000  | 0.1197          | 39.0531 |
| 0.046         | 2.5461  | 4000  | 0.1067          | 34.5497 |
| 0.0777        | 3.1827  | 5000  | 0.1440          | 43.2194 |
| 0.0649        | 3.8192  | 6000  | 0.1266          | 38.6232 |
| 0.0367        | 4.4558  | 7000  | 0.1288          | 38.0392 |
| 0.0126        | 5.0923  | 8000  | 0.1382          | 35.0226 |
| 0.0108        | 5.7288  | 9000  | 0.1416          | 34.5340 |
| 0.0038        | 6.3654  | 10000 | 0.1611          | 33.3921 |
| 0.0023        | 7.0019  | 11000 | 0.1744          | 33.4875 |
| 0.0133        | 7.6384  | 12000 | 0.1625          | 36.0534 |
| 0.0066        | 8.2750  | 13000 | 0.1801          | 35.3936 |
| 0.004         | 8.9115  | 14000 | 0.1781          | 34.1577 |
| 0.0009        | 9.5481  | 15000 | 0.1918          | 33.6939 |
| 0.0003        | 10.1846 | 16000 | 0.2026          | 33.5411 |
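
The WER values above are percentages. A sketch of how such a score can be computed with the `evaluate` library is shown below; the reference and hypothesis strings are toy examples, not drawn from the actual evaluation set.

```python
# Sketch: computing word error rate (WER) with the `evaluate` library.
# The strings below are made-up examples for illustration only.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["the transcribed hypothesis"]
references = ["the reference transcription"]

# compute() returns a fraction; multiply by 100 to match the percentages above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}%")
```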

### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1