Whisper Small ko

This model is a fine-tuned version of openai/whisper-small on the customdata dataset for Korean speech recognition. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 0.0268
  • CER: 6.5045
  • WER: 6.9310
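
The checkpoint can be loaded through the standard transformers ASR pipeline. A minimal sketch, assuming the Hub ID GGarri/241002_whisperfinetuned shown on this page and a local audio file you supply (`sample.wav` is a placeholder):

```python
# Minimal inference sketch for this checkpoint via the transformers ASR pipeline.
# Assumptions: GGarri/241002_whisperfinetuned is this model's Hub ID, and
# sample.wav is a local audio file (the pipeline resamples it as needed).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="GGarri/241002_whisperfinetuned",
)

# Force Korean transcription rather than letting Whisper auto-detect the language.
result = asr("sample.wav", generate_kwargs={"language": "korean", "task": "transcribe"})
print(result["text"])
```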

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 500
  • mixed_precision_training: Native AMP
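
For reproduction, these settings map onto transformers' Seq2SeqTrainingArguments roughly as follows. This is a sketch, not the author's exact script: `output_dir` is a placeholder, and `fp16=True` stands in for "Native AMP" (on CUDA hardware); the Adam betas and epsilon above are the optimizer defaults.

```python
# Sketch mapping the listed hyperparameters onto Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-ko",    # placeholder path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=500,
    fp16=True,                          # "Native AMP" mixed precision
)
```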

Training results

| Training Loss | Epoch | Step | Validation Loss | CER     | WER     |
|--------------:|------:|-----:|----------------:|--------:|--------:|
| 3.6391        |  0.54 |   25 | 3.3230          | 83.6552 | 35.4340 |
| 2.7648        |  1.09 |   50 | 2.3011          | 81.2725 | 31.6473 |
| 1.8272        |  1.63 |   75 | 1.4490          | 85.9460 | 43.8688 |
| 1.0827        |  2.17 |  100 | 0.8137          | 72.8033 | 59.1524 |
| 0.6201        |  2.72 |  125 | 0.4756          | 50.5476 | 49.9522 |
| 0.3539        |  3.26 |  150 | 0.3005          | 31.1094 | 31.5926 |
| 0.2358        |  3.80 |  175 | 0.1969          | 29.5962 | 31.3192 |
| 0.1501        |  4.35 |  200 | 0.1352          | 21.1688 | 21.7772 |
| 0.0967        |  4.89 |  225 | 0.0846          | 18.6941 | 19.0431 |
| 0.0471        |  5.43 |  250 | 0.0350          | 18.3931 | 18.9200 |
| 0.0162        |  5.98 |  275 | 0.0335          | 18.9616 | 19.5215 |
| 0.0121        |  6.52 |  300 | 0.0324          | 14.1293 | 15.5707 |
| 0.0110        |  7.07 |  325 | 0.0261          | 12.9755 | 14.3267 |
| 0.0078        |  7.61 |  350 | 0.0223          |  9.3220 | 10.5400 |
| 0.0075        |  8.15 |  375 | 0.0217          |  5.8106 |  6.5482 |
| 0.0052        |  8.70 |  400 | 0.0208          |  7.9926 |  8.6945 |
| 0.0048        |  9.24 |  425 | 0.0213          |  5.3424 |  5.7280 |
| 0.0053        |  9.78 |  450 | 0.0212          |  7.5328 |  7.9973 |
| 0.0040        | 10.33 |  475 | 0.0213          |  5.7186 |  5.9740 |
| 0.0054        | 10.87 |  500 | 0.0268          |  6.5045 |  6.9310 |
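
The CER and WER values above appear to be on a 0–100 scale. For reference, character and word error rates for Whisper fine-tuning are typically computed with the Hugging Face evaluate library; below is a sketch under that assumption (the prediction and reference lists are placeholders, not data from this run):

```python
# Sketch of computing CER/WER with the evaluate library (requires jiwer).
import evaluate

cer_metric = evaluate.load("cer")
wer_metric = evaluate.load("wer")

predictions = ["안녕하세요"]   # model transcripts (placeholder)
references  = ["안녕하세요"]   # ground-truth transcripts (placeholder)

# Scale to 0-100 to match the reporting convention used in the table above.
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"CER: {cer:.4f}, WER: {wer:.4f}")
```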

Framework versions

  • Transformers 4.39.2
  • PyTorch 2.0.1
  • Datasets 2.18.0
  • Tokenizers 0.15.2
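
A quick way to check that a local environment matches these versions (a sketch; installed version strings may carry build suffixes such as `+cu117`):

```python
# Print installed library versions for comparison with the list above.
import datasets
import tokenizers
import torch
import transformers

for name, module in [
    ("Transformers", transformers),
    ("PyTorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```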
