---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: whisper-ai-norm
    results: []
---

# whisper-ai-norm

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.0000
- Wer: 101.5892
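
The card does not yet include usage instructions, so here is a minimal inference sketch using the `transformers` pipeline API. The repository id `susmitabhatt/whisper-ai-norm` is an assumption inferred from the card name; adjust it to wherever the checkpoint actually lives.

```python
# Minimal inference sketch; the model id below is an assumption
# inferred from this card's name, not a confirmed repository.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="susmitabhatt/whisper-ai-norm",  # assumed repo id
    device=0 if torch.cuda.is_available() else -1,
)

# Transcribe a local audio file; the pipeline decodes and resamples
# the audio to the feature extractor's expected sampling rate.
result = asr("sample.wav")
print(result["text"])
```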

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 30
- mixed_precision_training: Native AMP
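
The `generated_from_trainer` tag indicates the run used the `transformers` Trainer. Below is a hedged sketch of how the hyperparameters above map onto `Seq2SeqTrainingArguments`; the model, processor, and dataset setup are omitted because the card does not specify them, and the `output_dir` is illustrative.

```python
# Sketch of the training configuration implied by the list above,
# assuming the standard Seq2SeqTrainingArguments interface.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-ai-norm",   # illustrative output path
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 8 * 2 = total train batch size 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=30,
    fp16=True,                      # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
)
```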

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer      |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.2583        | 0.8584  | 100  | 17.1932         | 305.9902 |
| 2.067         | 1.7167  | 200  | 0.3327          | 123.2274 |
| 0.3124        | 2.5751  | 300  | 0.2674          | 94.8655  |
| 0.2537        | 3.4335  | 400  | 0.2513          | 85.8191  |
| 0.2072        | 4.2918  | 500  | 0.2029          | 82.6406  |
| 0.212         | 5.1502  | 600  | 0.1995          | 88.6308  |
| 0.2077        | 6.0086  | 700  | 0.1327          | 69.8044  |
| 0.1414        | 6.8670  | 800  | 0.0846          | 41.0758  |
| 0.1015        | 7.7253  | 900  | 0.0917          | 41.6870  |
| 0.102         | 8.5837  | 1000 | 0.0298          | 27.2616  |
| 0.0559        | 9.4421  | 1100 | 0.0389          | 21.0269  |
| 0.054         | 10.3004 | 1200 | 0.0218          | 202.5672 |
| 0.0356        | 11.1588 | 1300 | 0.0103          | 231.9071 |
| 0.0304        | 12.0172 | 1400 | 0.0236          | 218.4597 |
| 0.0199        | 12.8755 | 1500 | 0.0060          | 313.6919 |
| 0.0204        | 13.7339 | 1600 | 0.0104          | 202.4450 |
| 0.018         | 14.5923 | 1700 | 0.0069          | 190.2200 |
| 0.0103        | 15.4506 | 1800 | 0.0029          | 28.3619  |
| 0.0078        | 16.3090 | 1900 | 0.0029          | 133.0073 |
| 0.0049        | 17.1674 | 2000 | 0.0001          | 194.7433 |
| 0.0021        | 18.0258 | 2100 | 0.0003          | 183.6186 |
| 0.0026        | 18.8841 | 2200 | 0.0008          | 143.7653 |
| 0.0014        | 19.7425 | 2300 | 0.0000          | 145.5990 |
| 0.0006        | 20.6009 | 2400 | 0.0002          | 65.0367  |
| 0.0013        | 21.4592 | 2500 | 0.0008          | 63.2029  |
| 0.0018        | 22.3176 | 2600 | 0.0001          | 84.2298  |
| 0.0003        | 23.1760 | 2700 | 0.0000          | 121.6381 |
| 0.0002        | 24.0343 | 2800 | 0.0000          | 99.5110  |
| 0.0           | 24.8927 | 2900 | 0.0000          | 100.3667 |
| 0.0           | 25.7511 | 3000 | 0.0000          | 100.4890 |
| 0.0           | 26.6094 | 3100 | 0.0000          | 101.1002 |
| 0.0           | 27.4678 | 3200 | 0.0000          | 101.3447 |
| 0.0           | 28.3262 | 3300 | 0.0000          | 101.3447 |
| 0.0           | 29.1845 | 3400 | 0.0000          | 101.5892 |
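
Several checkpoints report WER above 100. This is expected behavior of the metric rather than a logging error: WER = (S + D + I) / N counts substitutions, deletions, and insertions against the number of reference words N, so a hypothesis with many insertions can exceed 100%. A minimal sketch with the `evaluate` library (a common metric backend for `generated_from_trainer` runs; the example strings are illustrative, not taken from this model's data):

```python
# WER can exceed 100% because insertions count as errors:
# WER = 100 * (S + D + I) / N, with N the reference word count.
import evaluate

wer_metric = evaluate.load("wer")
wer = 100 * wer_metric.compute(
    predictions=["the cat sat on the mat tonight as well"],  # illustrative
    references=["the cat sat"],                              # illustrative
)
print(f"WER: {wer:.2f}")  # 200.00: six insertions against a three-word reference
```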

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1