---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: Whisper Small superU
    results: []
---

# Whisper Small superU

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 6.3374
- Wer: 57.2770
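
A minimal inference sketch using the `transformers` pipeline. The repo id `shreyasdesaisuperU/whisper-fine-tune` is assumed from this repository's path rather than stated in the card; substitute the actual model id:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for speech-to-text.
# NOTE: the repo id below is an assumption based on the repository path.
asr = pipeline(
    "automatic-speech-recognition",
    model="shreyasdesaisuperU/whisper-fine-tune",
)

# Transcribe a local audio file (Whisper expects 16 kHz mono audio).
result = asr("sample.wav")
print(result["text"])
```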

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 128
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
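
These values map onto `Seq2SeqTrainingArguments` roughly as below. This is an illustrative reconstruction, not the original training script; `output_dir` is hypothetical, and the evaluation cadence is inferred from the 100-step intervals in the results table:

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative reconstruction of the listed hyperparameters.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-superu",  # hypothetical path
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=128,      # 2 x 128 = 256 effective batch size
    learning_rate=1e-5,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=4000,
    seed=42,
    fp16=True,                            # Native AMP mixed precision
    eval_strategy="steps",
    eval_steps=100,                       # matches the results table below
    predict_with_generate=True,           # generate transcripts so WER can be computed
)
```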

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.0           | 100.0  | 100  | 2.1912          | 59.6244 |
| 0.0           | 200.0  | 200  | 2.3111          | 58.2160 |
| 0.0           | 300.0  | 300  | 2.3886          | 58.2160 |
| 0.0           | 400.0  | 400  | 2.5319          | 60.5634 |
| 0.0           | 500.0  | 500  | 2.7160          | 60.0939 |
| 0.0           | 600.0  | 600  | 2.9609          | 61.0329 |
| 0.0           | 700.0  | 700  | 3.2141          | 61.0329 |
| 0.0           | 800.0  | 800  | 3.4591          | 61.5023 |
| 0.0           | 900.0  | 900  | 3.7213          | 62.4413 |
| 0.0           | 1000.0 | 1000 | 3.9804          | 61.0329 |
| 0.0           | 1100.0 | 1100 | 4.2234          | 78.4038 |
| 0.0           | 1200.0 | 1200 | 4.4138          | 63.3803 |
| 0.0           | 1300.0 | 1300 | 4.5889          | 77.9343 |
| 0.0           | 1400.0 | 1400 | 4.7946          | 70.8920 |
| 0.0           | 1500.0 | 1500 | 4.9337          | 65.7277 |
| 0.0           | 1600.0 | 1600 | 5.0758          | 56.8075 |
| 0.0           | 1700.0 | 1700 | 5.2692          | 56.8075 |
| 0.0           | 1800.0 | 1800 | 5.4087          | 56.8075 |
| 0.0           | 1900.0 | 1900 | 5.5500          | 56.8075 |
| 0.0           | 2000.0 | 2000 | 5.6783          | 56.8075 |
| 0.0           | 2100.0 | 2100 | 5.6287          | 56.8075 |
| 0.0           | 2200.0 | 2200 | 5.6852          | 56.3380 |
| 0.0           | 2300.0 | 2300 | 5.7374          | 56.3380 |
| 0.0           | 2400.0 | 2400 | 5.8023          | 56.3380 |
| 0.0           | 2500.0 | 2500 | 5.8672          | 57.2770 |
| 0.0           | 2600.0 | 2600 | 5.9427          | 57.2770 |
| 0.0           | 2700.0 | 2700 | 5.9891          | 57.2770 |
| 0.0           | 2800.0 | 2800 | 6.0490          | 57.2770 |
| 0.0           | 2900.0 | 2900 | 6.0639          | 57.2770 |
| 0.0           | 3000.0 | 3000 | 6.1095          | 57.2770 |
| 0.0           | 3100.0 | 3100 | 6.1477          | 57.2770 |
| 0.0           | 3200.0 | 3200 | 6.2039          | 57.2770 |
| 0.0           | 3300.0 | 3300 | 6.2346          | 57.2770 |
| 0.0           | 3400.0 | 3400 | 6.2567          | 57.2770 |
| 0.0           | 3500.0 | 3500 | 6.2841          | 57.2770 |
| 0.0           | 3600.0 | 3600 | 6.3028          | 57.2770 |
| 0.0           | 3700.0 | 3700 | 6.3029          | 57.2770 |
| 0.0           | 3800.0 | 3800 | 6.3294          | 57.2770 |
| 0.0           | 3900.0 | 3900 | 6.3346          | 57.2770 |
| 0.0           | 4000.0 | 4000 | 6.3374          | 57.2770 |
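
The Wer column is word error rate in percent, the `wer` metric declared in the metadata. As a quick sketch of how that number is computed with the `evaluate` library:

```python
import evaluate

wer = evaluate.load("wer")

# WER = (substitutions + insertions + deletions) / reference word count.
predictions = ["the cat sat on the mat"]
references = ["a cat sat on the mat"]

# One substitution over six reference words -> 1/6, i.e. ~16.67 when
# expressed in percent like the table above.
print(100 * wer.compute(predictions=predictions, references=references))
```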

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1