# w2v-bert-2.0-ln-afrivoice-10hr-v4
This model is a fine-tuned version of facebook/w2v-bert-2.0 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4715
- Model Preparation Time: 0.0145
- WER: 0.2768
- CER: 0.0710
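For illustration, the model can be loaded for CTC speech recognition with the `transformers` Auto classes. The snippet below is a minimal sketch, not an official usage example: it assumes this repository id, a local 16 kHz mono audio file named `sample.wav`, and the `librosa` package for loading audio.

```python
# Minimal inference sketch (assumptions: repo id, 16 kHz mono "sample.wav", librosa installed).
import torch
import librosa
from transformers import AutoProcessor, AutoModelForCTC

model_id = "KasuleTrevor/w2v-bert-2.0-ln-afrivoice-10hr-v4"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

# Load audio at the 16 kHz sampling rate expected by w2v-bert-2.0.
speech, _ = librosa.load("sample.wav", sr=16000)
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```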
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
- mixed_precision_training: Native AMP
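Expressed as code, these settings correspond roughly to the `TrainingArguments` sketch below (argument names follow Transformers 4.44.0; the `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults, so they are not set explicitly).

```python
from transformers import TrainingArguments

# Hedged sketch of the configuration listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-ln-afrivoice-10hr-v4",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size of 32
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```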
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:----------------------:|:------:|:------:|
| 3.6215 | 0.9919 | 61 | 1.2506 | 0.0145 | 0.9280 | 0.3210 |
| 0.855 | 2.0 | 123 | 0.7710 | 0.0145 | 0.4019 | 0.1489 |
| 0.6812 | 2.9919 | 184 | 0.7971 | 0.0145 | 0.3731 | 0.1379 |
| 0.585 | 4.0 | 246 | 0.7619 | 0.0145 | 0.3567 | 0.1329 |
| 0.542 | 4.9919 | 307 | 0.8564 | 0.0145 | 0.3502 | 0.1345 |
| 0.4828 | 6.0 | 369 | 0.7153 | 0.0145 | 0.3899 | 0.1572 |
| 0.4398 | 6.9919 | 430 | 0.7300 | 0.0145 | 0.3568 | 0.1298 |
| 0.3804 | 8.0 | 492 | 0.8210 | 0.0145 | 0.3622 | 0.1358 |
| 0.35 | 8.9919 | 553 | 0.7800 | 0.0145 | 0.3536 | 0.1339 |
| 0.3053 | 10.0 | 615 | 0.7407 | 0.0145 | 0.3718 | 0.1387 |
| 0.2882 | 10.9919 | 676 | 0.8678 | 0.0145 | 0.3595 | 0.1370 |
| 0.2437 | 12.0 | 738 | 0.8548 | 0.0145 | 0.3744 | 0.1371 |
| 0.2283 | 12.9919 | 799 | 0.9142 | 0.0145 | 0.3768 | 0.1391 |
| 0.1932 | 14.0 | 861 | 1.1226 | 0.0145 | 0.3585 | 0.1348 |
| 0.1719 | 14.9919 | 922 | 1.2449 | 0.0145 | 0.3435 | 0.1293 |
| 0.1511 | 16.0 | 984 | 1.2415 | 0.0145 | 0.3693 | 0.1347 |
| 0.1496 | 16.9919 | 1045 | 1.0652 | 0.0145 | 0.3738 | 0.1422 |
| 0.1119 | 18.0 | 1107 | 1.1335 | 0.0145 | 0.3818 | 0.1416 |
| 0.0904 | 18.9919 | 1168 | 1.3077 | 0.0145 | 0.3608 | 0.1346 |
| 0.0583 | 20.0 | 1230 | 1.5964 | 0.0145 | 0.3537 | 0.1303 |
| 0.0454 | 20.9919 | 1291 | 1.4444 | 0.0145 | 0.3831 | 0.1393 |
| 0.0349 | 22.0 | 1353 | 1.6557 | 0.0145 | 0.3663 | 0.1334 |
| 0.0265 | 22.9919 | 1414 | 1.7123 | 0.0145 | 0.3540 | 0.1301 |
| 0.0171 | 24.0 | 1476 | 1.6974 | 0.0145 | 0.3680 | 0.1353 |
| 0.0148 | 24.9919 | 1537 | 1.9526 | 0.0145 | 0.3568 | 0.1309 |
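WER and CER in the table are the standard word and character error rates. A minimal sketch of how they can be computed with the `evaluate` library is shown below; the transcripts are placeholders, not data from this model.

```python
import evaluate

# Word and character error rate metrics from the `evaluate` library.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder transcripts; in practice these come from model decoding and the
# reference transcriptions of the evaluation set.
predictions = ["this is a predicted transcript"]
references = ["this is the reference transcript"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```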
### Framework versions

- Transformers 4.44.0
- PyTorch 2.1.0+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1