
w2v-bert-2.0-ln-afrivoice-10hr-v3

This model is a fine-tuned version of facebook/w2v-bert-2.0 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5394
  • Model Preparation Time: 0.0157
  • Wer: 0.2805
  • Cer: 0.0704
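WER (word error rate) and CER (character error rate) are both Levenshtein edit distances normalized by the reference length, computed over words and characters respectively. A minimal pure-Python sketch of the metric definitions (not the evaluation code used for this model, which typically comes from the `evaluate` or `jiwer` packages):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (rolling-row DP)."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # deletion, insertion, or substitution/match
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edits / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edits / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)

print(wer("mbote na yo", "mbote na ye"))  # one substituted word out of three
```

So a WER of 0.2805 means roughly 28 word-level edits per 100 reference words.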

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.033
  • num_epochs: 100
  • mixed_precision_training: Native AMP

Training results

| Training Loss | Epoch   | Step | Validation Loss | Model Preparation Time | Wer    | Cer    |
|:-------------:|:-------:|:----:|:---------------:|:----------------------:|:------:|:------:|
| 4.8449        | 0.9919  | 61   | 2.7369          | 0.0157                 | 0.9995 | 0.9870 |
| 1.9412        | 2.0     | 123  | 0.9095          | 0.0157                 | 0.4841 | 0.1700 |
| 0.7831        | 2.9919  | 184  | 0.8914          | 0.0157                 | 0.4018 | 0.1424 |
| 0.67          | 4.0     | 246  | 0.7633          | 0.0157                 | 0.3919 | 0.1391 |
| 0.5982        | 4.9919  | 307  | 0.8712          | 0.0157                 | 0.3514 | 0.1349 |
| 0.5577        | 6.0     | 369  | 0.6596          | 0.0157                 | 0.4425 | 0.1566 |
| 0.4945        | 6.9919  | 430  | 0.7157          | 0.0157                 | 0.3838 | 0.1419 |
| 0.4363        | 8.0     | 492  | 0.7981          | 0.0157                 | 0.3582 | 0.1324 |
| 0.395         | 8.9919  | 553  | 0.7956          | 0.0157                 | 0.3483 | 0.1310 |
| 0.3416        | 10.0    | 615  | 0.7110          | 0.0157                 | 0.4082 | 0.1571 |
| 0.3181        | 10.9919 | 676  | 0.8728          | 0.0157                 | 0.3680 | 0.1334 |
| 0.2837        | 12.0    | 738  | 0.8389          | 0.0157                 | 0.3656 | 0.1361 |
| 0.2482        | 12.9919 | 799  | 0.9984          | 0.0157                 | 0.3582 | 0.1296 |
| 0.224         | 14.0    | 861  | 0.8696          | 0.0157                 | 0.3971 | 0.1515 |
| 0.204         | 14.9919 | 922  | 1.0671          | 0.0157                 | 0.3563 | 0.1312 |
| 0.1665        | 16.0    | 984  | 1.0956          | 0.0157                 | 0.3622 | 0.1329 |
| 0.1507        | 16.9919 | 1045 | 1.4699          | 0.0157                 | 0.3481 | 0.1297 |
| 0.1144        | 18.0    | 1107 | 1.4821          | 0.0157                 | 0.3566 | 0.1299 |
| 0.1327        | 18.9919 | 1168 | 1.2253          | 0.0157                 | 0.3699 | 0.1352 |
| 0.1085        | 20.0    | 1230 | 1.2042          | 0.0157                 | 0.3929 | 0.1452 |
| 0.0694        | 20.9919 | 1291 | 1.4515          | 0.0157                 | 0.3681 | 0.1317 |
| 0.0476        | 22.0    | 1353 | 1.5795          | 0.0157                 | 0.3551 | 0.1301 |
| 0.0357        | 22.9919 | 1414 | 1.5949          | 0.0157                 | 0.3527 | 0.1300 |
| 0.0241        | 24.0    | 1476 | 1.7094          | 0.0157                 | 0.3555 | 0.1304 |
| 0.017         | 24.9919 | 1537 | 1.7941          | 0.0157                 | 0.3577 | 0.1311 |
| 0.0128        | 26.0    | 1599 | 1.8157          | 0.0157                 | 0.3555 | 0.1300 |
| 0.0132        | 26.9919 | 1660 | 1.8541          | 0.0157                 | 0.3621 | 0.1324 |
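Validation loss rises steadily after roughly epoch 5 while training loss keeps falling, a classic overfitting pattern, although WER and CER plateau rather than degrade. A small sketch that picks the best checkpoint by WER from the log above (step numbers and values copied from the table):

```python
# (step, validation_loss, wer) triples copied from the training log above
log = [
    (61, 2.7369, 0.9995), (123, 0.9095, 0.4841), (184, 0.8914, 0.4018),
    (246, 0.7633, 0.3919), (307, 0.8712, 0.3514), (369, 0.6596, 0.4425),
    (430, 0.7157, 0.3838), (492, 0.7981, 0.3582), (553, 0.7956, 0.3483),
    (615, 0.7110, 0.4082), (676, 0.8728, 0.3680), (738, 0.8389, 0.3656),
    (799, 0.9984, 0.3582), (861, 0.8696, 0.3971), (922, 1.0671, 0.3563),
    (984, 1.0956, 0.3622), (1045, 1.4699, 0.3481), (1107, 1.4821, 0.3566),
    (1168, 1.2253, 0.3699), (1230, 1.2042, 0.3929), (1291, 1.4515, 0.3681),
    (1353, 1.5795, 0.3551), (1414, 1.5949, 0.3527), (1476, 1.7094, 0.3555),
    (1537, 1.7941, 0.3577), (1599, 1.8157, 0.3555), (1660, 1.8541, 0.3621),
]

best_step, best_loss, best_wer = min(log, key=lambda row: row[2])
print(f"best WER {best_wer} at step {best_step} (val loss {best_loss})")
# best WER 0.3481 at step 1045
```

The headline WER of 0.2805 reported at the top of this card is lower than any per-epoch value here, so it was presumably measured on a different split or with a different evaluation setup.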

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.20.0
  • Tokenizers 0.19.1
Model size: 606M params (F32, Safetensors)