---
library_name: transformers
language:
  - ln
license: apache-2.0
base_model: openai/whisper-small
tags:
  - generated_from_trainer
datasets:
  - DigitalUmuganda/AfriVoice
metrics:
  - wer
model-index:
  - name: Whisper Small Hi - Sanchit Gandhi
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: AfriVoice
          type: DigitalUmuganda/AfriVoice
          args: 'config: sw, split: test'
        metrics:
          - name: Wer
            type: wer
            value: 0.34790486103080065
---

# Whisper Small Hi - Sanchit Gandhi

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the [DigitalUmuganda/AfriVoice](https://huggingface.co/datasets/DigitalUmuganda/AfriVoice) dataset. It achieves the following results on the evaluation set:

- Loss: 1.1668
- Wer: 0.3479
- Cer: 0.1439
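
These WER/CER values are fractions, not percentages: on the evaluation set, roughly one word in three and one character in seven differ from the reference. As a quick illustration of how word error rate is computed, here is a minimal sketch using the `evaluate` library (an assumption; the card only names the metric, and the transcripts below are made-up examples):

```python
import evaluate

# WER = (substitutions + deletions + insertions) / number of reference words.
wer = evaluate.load("wer")

references = ["mbote na yo"]     # hypothetical reference transcript
predictions = ["mbote na ngai"]  # hypothetical model output: one substituted word
print(wer.compute(references=references, predictions=predictions))  # 1/3 ≈ 0.3333
```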

## Model description

This checkpoint is openai/whisper-small fine-tuned for automatic speech recognition on the DigitalUmuganda/AfriVoice dataset. The card's metadata tags the language as `ln` (Lingala), while the model index reports results on the `sw` configuration's test split.

## Intended uses & limitations

The model is intended for automatic speech recognition of AfriVoice-style audio. With a test WER of roughly 0.35 and CER of roughly 0.14, transcripts should be expected to contain errors in about one word in three; performance on other domains, accents, or languages has not been evaluated here.
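
For a quick try-out, the checkpoint can be loaded with the `transformers` ASR pipeline. This is a minimal sketch: the model id placeholder and the audio file name are assumptions, not values from the card.

```python
from transformers import pipeline

# "<hub-repo-id>" stands for this checkpoint's Hugging Face Hub id and
# "sample.wav" for a local audio file; both are placeholders.
asr = pipeline(
    "automatic-speech-recognition",
    model="<hub-repo-id>",
    chunk_length_s=30,  # Whisper consumes audio in 30-second windows
)
print(asr("sample.wav")["text"])
```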

## Training and evaluation data

The model was fine-tuned on the DigitalUmuganda/AfriVoice dataset; per the model index, evaluation used the `test` split of the `sw` configuration. Preprocessing details are not documented in this card. A sketch of loading that split follows.
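
The config and split strings below come from the model index above; access requirements on the Hub, if any, are not covered here.

```python
from datasets import load_dataset

# "sw" and "test" are the config and split named in the model index.
eval_set = load_dataset("DigitalUmuganda/AfriVoice", "sw", split="test")
print(eval_set)
```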

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a code sketch of the equivalent training arguments follows the list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
- mixed_precision_training: Native AMP
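
The sketch below maps the listed values onto `transformers`' `Seq2SeqTrainingArguments`. `output_dir` and the evaluation/save cadence are assumptions, since the card does not state them.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-afrivoice",  # assumed name, not from the card
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```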

### Training results

The headline metrics above (loss 1.1668, WER 0.3479, CER 0.1439) match the epoch-79 checkpoint (step 9717) rather than the last row of the table, which suggests the best checkpoint, not the final one, was retained at the end of training.

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 1.6425        | 1.0   | 123   | 1.0239          | 0.5476 | 0.2091 |
| 0.8057        | 2.0   | 246   | 0.7465          | 0.4388 | 0.1797 |
| 0.563         | 3.0   | 369   | 0.6723          | 0.3866 | 0.1537 |
| 0.4016        | 4.0   | 492   | 0.6578          | 0.3876 | 0.1594 |
| 0.2677        | 5.0   | 615   | 0.6799          | 0.3976 | 0.1730 |
| 0.1708        | 6.0   | 738   | 0.7148          | 0.3929 | 0.1585 |
| 0.1146        | 7.0   | 861   | 0.7365          | 0.3840 | 0.1574 |
| 0.0786        | 8.0   | 984   | 0.7790          | 0.3839 | 0.1555 |
| 0.0646        | 9.0   | 1107  | 0.8027          | 0.3846 | 0.1584 |
| 0.0587        | 10.0  | 1230  | 0.8323          | 0.3857 | 0.1589 |
| 0.0527        | 11.0  | 1353  | 0.8354          | 0.3870 | 0.1548 |
| 0.0442        | 12.0  | 1476  | 0.8494          | 0.3800 | 0.1554 |
| 0.0352        | 13.0  | 1599  | 0.8863          | 0.3957 | 0.1577 |
| 0.0317        | 14.0  | 1722  | 0.8744          | 0.4027 | 0.1716 |
| 0.0289        | 15.0  | 1845  | 0.8952          | 0.3723 | 0.1566 |
| 0.0249        | 16.0  | 1968  | 0.9174          | 0.3728 | 0.1513 |
| 0.022         | 17.0  | 2091  | 0.9266          | 0.3764 | 0.1608 |
| 0.0197        | 18.0  | 2214  | 0.9473          | 0.3840 | 0.1608 |
| 0.0182        | 19.0  | 2337  | 0.9502          | 0.3780 | 0.1549 |
| 0.0161        | 20.0  | 2460  | 0.9454          | 0.3731 | 0.1536 |
| 0.0139        | 21.0  | 2583  | 0.9579          | 0.3716 | 0.1557 |
| 0.0128        | 22.0  | 2706  | 0.9722          | 0.3658 | 0.1510 |
| 0.0107        | 23.0  | 2829  | 0.9775          | 0.3738 | 0.1585 |
| 0.0116        | 24.0  | 2952  | 0.9750          | 0.3721 | 0.1587 |
| 0.0118        | 25.0  | 3075  | 0.9746          | 0.3706 | 0.1584 |
| 0.0108        | 26.0  | 3198  | 1.0020          | 0.3743 | 0.1617 |
| 0.0092        | 27.0  | 3321  | 0.9905          | 0.3639 | 0.1531 |
| 0.0086        | 28.0  | 3444  | 1.0039          | 0.3683 | 0.1546 |
| 0.0092        | 29.0  | 3567  | 1.0071          | 0.3697 | 0.1557 |
| 0.0071        | 30.0  | 3690  | 1.0074          | 0.3656 | 0.1502 |
| 0.0049        | 31.0  | 3813  | 1.0162          | 0.3621 | 0.1558 |
| 0.0074        | 32.0  | 3936  | 1.0396          | 0.3667 | 0.1592 |
| 0.0057        | 33.0  | 4059  | 1.0447          | 0.3684 | 0.1592 |
| 0.005         | 34.0  | 4182  | 1.0465          | 0.3658 | 0.1541 |
| 0.0051        | 35.0  | 4305  | 1.0388          | 0.3662 | 0.1539 |
| 0.0061        | 36.0  | 4428  | 1.0512          | 0.3581 | 0.1470 |
| 0.005         | 37.0  | 4551  | 1.0480          | 0.3668 | 0.1552 |
| 0.005         | 38.0  | 4674  | 1.0476          | 0.3641 | 0.1558 |
| 0.0046        | 39.0  | 4797  | 1.0402          | 0.3656 | 0.1499 |
| 0.0028        | 40.0  | 4920  | 1.0726          | 0.3613 | 0.1481 |
| 0.0028        | 41.0  | 5043  | 1.0624          | 0.3575 | 0.1463 |
| 0.0018        | 42.0  | 5166  | 1.0882          | 0.3599 | 0.1510 |
| 0.0033        | 43.0  | 5289  | 1.0653          | 0.3594 | 0.1506 |
| 0.0042        | 44.0  | 5412  | 1.0725          | 0.3586 | 0.1494 |
| 0.0039        | 45.0  | 5535  | 1.0754          | 0.3610 | 0.1489 |
| 0.003         | 46.0  | 5658  | 1.0761          | 0.3682 | 0.1551 |
| 0.0026        | 47.0  | 5781  | 1.0815          | 0.3667 | 0.1561 |
| 0.0026        | 48.0  | 5904  | 1.0740          | 0.3676 | 0.1527 |
| 0.0026        | 49.0  | 6027  | 1.0762          | 0.3604 | 0.1501 |
| 0.0019        | 50.0  | 6150  | 1.1023          | 0.3563 | 0.1485 |
| 0.0016        | 51.0  | 6273  | 1.0775          | 0.3564 | 0.1475 |
| 0.0017        | 52.0  | 6396  | 1.0821          | 0.3535 | 0.1479 |
| 0.001         | 53.0  | 6519  | 1.0904          | 0.3523 | 0.1448 |
| 0.0005        | 54.0  | 6642  | 1.0975          | 0.3523 | 0.1453 |
| 0.0004        | 55.0  | 6765  | 1.1008          | 0.3503 | 0.1463 |
| 0.0003        | 56.0  | 6888  | 1.1026          | 0.3547 | 0.1493 |
| 0.0001        | 57.0  | 7011  | 1.1079          | 0.3510 | 0.1454 |
| 0.0001        | 58.0  | 7134  | 1.1128          | 0.3503 | 0.1449 |
| 0.0001        | 59.0  | 7257  | 1.1175          | 0.3501 | 0.1454 |
| 0.0001        | 60.0  | 7380  | 1.1217          | 0.3503 | 0.1448 |
| 0.0001        | 61.0  | 7503  | 1.1255          | 0.3506 | 0.1449 |
| 0.0           | 62.0  | 7626  | 1.1289          | 0.3499 | 0.1446 |
| 0.0           | 63.0  | 7749  | 1.1320          | 0.3501 | 0.1447 |
| 0.0           | 64.0  | 7872  | 1.1349          | 0.3493 | 0.1443 |
| 0.0           | 65.0  | 7995  | 1.1377          | 0.3491 | 0.1442 |
| 0.0           | 66.0  | 8118  | 1.1403          | 0.3490 | 0.1441 |
| 0.0           | 67.0  | 8241  | 1.1427          | 0.3486 | 0.1442 |
| 0.0           | 68.0  | 8364  | 1.1452          | 0.3486 | 0.1441 |
| 0.0           | 69.0  | 8487  | 1.1474          | 0.3489 | 0.1441 |
| 0.0           | 70.0  | 8610  | 1.1495          | 0.3490 | 0.1441 |
| 0.0           | 71.0  | 8733  | 1.1517          | 0.3485 | 0.1439 |
| 0.0           | 72.0  | 8856  | 1.1538          | 0.3484 | 0.1437 |
| 0.0           | 73.0  | 8979  | 1.1557          | 0.3483 | 0.1437 |
| 0.0           | 74.0  | 9102  | 1.1577          | 0.3483 | 0.1437 |
| 0.0           | 75.0  | 9225  | 1.1597          | 0.3483 | 0.1438 |
| 0.0           | 76.0  | 9348  | 1.1615          | 0.3483 | 0.1438 |
| 0.0           | 77.0  | 9471  | 1.1633          | 0.3484 | 0.1441 |
| 0.0           | 78.0  | 9594  | 1.1651          | 0.3482 | 0.1439 |
| 0.0           | 79.0  | 9717  | 1.1668          | 0.3479 | 0.1439 |
| 0.0           | 80.0  | 9840  | 1.1685          | 0.3480 | 0.1439 |
| 0.0           | 81.0  | 9963  | 1.1701          | 0.3481 | 0.1439 |
| 0.0           | 82.0  | 10086 | 1.1717          | 0.3480 | 0.1439 |
| 0.0           | 83.0  | 10209 | 1.1732          | 0.3486 | 0.1441 |
| 0.0           | 84.0  | 10332 | 1.1747          | 0.3485 | 0.1437 |
| 0.0           | 85.0  | 10455 | 1.1761          | 0.3489 | 0.1438 |
| 0.0           | 86.0  | 10578 | 1.1774          | 0.3489 | 0.1439 |
| 0.0           | 87.0  | 10701 | 1.1788          | 0.3489 | 0.1437 |
| 0.0           | 88.0  | 10824 | 1.1800          | 0.3490 | 0.1436 |
| 0.0           | 89.0  | 10947 | 1.1812          | 0.3491 | 0.1438 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.0+cu118
- Datasets 3.0.0
- Tokenizers 0.19.1