
wav2vec2-xls-r-300m-CV-Fleurs-lg-20hrs-v6

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. The dataset field of the auto-generated card was left unset ("None"); the model name suggests roughly 20 hours of Common Voice and FLEURS Luganda data. It achieves the following results on the evaluation set:

  • Loss: 1.0969
  • Wer: 0.5036
  • Cer: 0.1126
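
The WER and CER above are edit-distance-based error rates: word error rate counts word-level edits against the reference, character error rate counts character-level edits. As a rough illustration (a minimal pure-Python sketch; real evaluations typically use a library such as jiwer, and the helper names here are hypothetical):

```python
def levenshtein(a, b):
    # Dynamic-programming edit distance between two sequences
    # (substitutions, insertions, and deletions all cost 1).
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (x != y)))  # substitution
        prev = cur
    return prev[-1]

def wer(reference, hypothesis):
    # Word error rate: edits over word sequences, normalized by reference length.
    ref, hyp = reference.split(), hypothesis.split()
    return levenshtein(ref, hyp) / len(ref)

def cer(reference, hypothesis):
    # Character error rate: same idea at the character level.
    return levenshtein(reference, hypothesis) / len(reference)
```

For example, one substituted word in a three-word reference gives a WER of about 0.33, which is how a Wer of 0.5036 should be read: roughly half the reference words require an edit.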

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
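
The linear scheduler decays the learning rate from its peak toward zero over training. A minimal sketch of that schedule, assuming no warmup steps (none are listed above) and using the 205,800 total optimizer steps implied by the results table (2,058 steps/epoch × 100 epochs):

```python
PEAK_LR = 3e-4          # learning_rate from the hyperparameters above
TOTAL_STEPS = 205_800   # 2,058 steps/epoch x 100 epochs (from the results table)

def linear_lr(step, peak=PEAK_LR, total=TOTAL_STEPS, warmup=0):
    # Linear warmup (if any) followed by linear decay to zero, mirroring the
    # shape of transformers' linear schedule with warmup.
    if step < warmup:
        return peak * step / max(1, warmup)
    return peak * max(0.0, (total - step) / max(1, total - warmup))
```

At the halfway point (epoch 50) the learning rate is half the peak, which helps explain why validation loss keeps creeping up late in training while WER/CER still improve slowly: updates become very small.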

Training results

Training Loss Epoch Step Validation Loss Wer Cer
2.5591 1.0 2058 1.2786 0.9833 0.3692
1.2856 2.0 4116 0.9589 0.9331 0.2873
1.0792 3.0 6174 0.8599 0.9057 0.2580
0.9542 4.0 8232 0.7839 0.8875 0.2417
0.8534 5.0 10290 0.7093 0.8487 0.2188
0.7707 6.0 12348 0.6855 0.8030 0.2040
0.6966 7.0 14406 0.6720 0.7709 0.1911
0.6304 8.0 16464 0.6486 0.7499 0.1817
0.574 9.0 18522 0.6267 0.7318 0.1728
0.5269 10.0 20580 0.6314 0.7103 0.1673
0.4797 11.0 22638 0.6279 0.6922 0.1658
0.4442 12.0 24696 0.6025 0.6653 0.1584
0.4018 13.0 26754 0.6456 0.6677 0.1585
0.3727 14.0 28812 0.6448 0.6534 0.1540
0.3477 15.0 30870 0.6650 0.6516 0.1532
0.3192 16.0 32928 0.6561 0.6311 0.1491
0.3017 17.0 34986 0.6839 0.6424 0.1532
0.2817 18.0 37044 0.6746 0.6228 0.1466
0.2664 19.0 39102 0.7383 0.6303 0.1473
0.2533 20.0 41160 0.7165 0.6284 0.1478
0.2385 21.0 43218 0.7308 0.6176 0.1447
0.2249 22.0 45276 0.7467 0.6189 0.1431
0.2129 23.0 47334 0.7641 0.6078 0.1421
0.2097 24.0 49392 0.7705 0.6088 0.1420
0.1955 25.0 51450 0.8025 0.6091 0.1423
0.1887 26.0 53508 0.8143 0.6085 0.1406
0.1809 27.0 55566 0.8349 0.5978 0.1381
0.1727 28.0 57624 0.8418 0.5943 0.1377
0.1707 29.0 59682 0.8629 0.5998 0.1395
0.1622 30.0 61740 0.8416 0.5928 0.1378
0.1613 31.0 63798 0.8534 0.6018 0.1389
0.1532 32.0 65856 0.8314 0.5801 0.1345
0.1518 33.0 67914 0.8429 0.5840 0.1343
0.1463 34.0 69972 0.8393 0.5896 0.1362
0.1423 35.0 72030 0.8101 0.5766 0.1328
0.1359 36.0 74088 0.8567 0.5718 0.1309
0.135 37.0 76146 0.8705 0.5814 0.1334
0.131 38.0 78204 0.8780 0.5711 0.1327
0.1296 39.0 80262 0.8849 0.5748 0.1328
0.1244 40.0 82320 0.9377 0.5873 0.1334
0.1209 41.0 84378 0.9363 0.5661 0.1315
0.1191 42.0 86436 0.9145 0.5628 0.1302
0.1174 43.0 88494 0.9233 0.5655 0.1309
0.1165 44.0 90552 0.9279 0.5641 0.1301
0.1132 45.0 92610 0.8626 0.5589 0.1290
0.1113 46.0 94668 0.9245 0.5661 0.1297
0.1097 47.0 96726 0.9124 0.5620 0.1270
0.1059 48.0 98784 0.9121 0.5625 0.1291
0.1041 49.0 100842 0.9203 0.5623 0.1282
0.0998 50.0 102900 0.9612 0.5540 0.1278
0.099 51.0 104958 0.9638 0.5576 0.1290
0.0999 52.0 107016 0.9549 0.5500 0.1257
0.0957 53.0 109074 0.9682 0.5518 0.1264
0.0934 54.0 111132 0.9340 0.5430 0.1260
0.0895 55.0 113190 0.9747 0.5519 0.1242
0.0927 56.0 115248 0.9647 0.5443 0.1236
0.0894 57.0 117306 0.9140 0.5502 0.1245
0.0858 58.0 119364 0.9512 0.5385 0.1231
0.0879 59.0 121422 0.9763 0.5357 0.1227
0.0844 60.0 123480 0.9697 0.5475 0.1238
0.0828 61.0 125538 0.9820 0.5412 0.1228
0.0804 62.0 127596 0.9701 0.5379 0.1230
0.0811 63.0 129654 1.0090 0.5362 0.1227
0.0787 64.0 131712 0.9761 0.5388 0.1234
0.0757 65.0 133770 0.9881 0.5348 0.1227
0.0739 66.0 135828 0.9683 0.5420 0.1231
0.0735 67.0 137886 0.9996 0.5329 0.1203
0.073 68.0 139944 0.9929 0.5251 0.1195
0.0723 69.0 142002 1.0000 0.5313 0.1184
0.0691 70.0 144060 1.0202 0.5290 0.1189
0.0692 71.0 146118 1.0301 0.5409 0.1197
0.0683 72.0 148176 1.0201 0.5280 0.1188
0.0659 73.0 150234 1.0464 0.5247 0.1182
0.0663 74.0 152292 1.0014 0.5264 0.1190
0.0641 75.0 154350 0.9940 0.5210 0.1177
0.0631 76.0 156408 1.0384 0.5252 0.1188
0.0616 77.0 158466 1.0370 0.5249 0.1183
0.0608 78.0 160524 1.0293 0.5232 0.1176
0.0593 79.0 162582 1.0479 0.5279 0.1178
0.0585 80.0 164640 1.0432 0.5203 0.1171
0.0579 81.0 166698 1.0589 0.5231 0.1173
0.0567 82.0 168756 1.0262 0.5120 0.1148
0.0554 83.0 170814 1.0367 0.5124 0.1148
0.0553 84.0 172872 1.0286 0.5151 0.1164
0.052 85.0 174930 1.0567 0.5133 0.1151
0.0522 86.0 176988 1.0498 0.5137 0.1153
0.0517 87.0 179046 1.0574 0.5133 0.1152
0.0499 88.0 181104 1.0666 0.5137 0.1148
0.0473 89.0 183162 1.0774 0.5152 0.1145
0.0496 90.0 185220 1.0660 0.5118 0.1139
0.047 91.0 187278 1.0747 0.5117 0.1143
0.0463 92.0 189336 1.0818 0.5092 0.1137
0.0452 93.0 191394 1.0911 0.5137 0.1145
0.0455 94.0 193452 1.0794 0.5023 0.1135
0.0455 95.0 195510 1.0853 0.5047 0.1134
0.0452 96.0 197568 1.0856 0.5057 0.1134
0.0437 97.0 199626 1.0917 0.5020 0.1130
0.0442 98.0 201684 1.0903 0.5049 0.1130
0.044 99.0 203742 1.0979 0.5050 0.1128
0.0421 100.0 205800 1.0969 0.5036 0.1126
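
Wav2vec2 models fine-tuned this way are CTC models, so the transcripts behind the WER/CER above typically come from greedy (best-path) CTC decoding: take the argmax token per frame, collapse consecutive repeats, then drop blanks. A minimal sketch of that collapse step (blank_id=0 is an assumption for illustration; the real blank index comes from the model's tokenizer):

```python
def ctc_greedy_decode(frame_token_ids, blank_id=0):
    # Best-path CTC decode: collapse consecutive repeated tokens,
    # then remove blank tokens.
    out = []
    prev = None
    for t in frame_token_ids:
        if t != prev and t != blank_id:
            out.append(t)
        prev = t
    return out
```

Note that a repeated character in the output requires a blank between the two occurrences in the frame sequence, which is why the collapse happens before blank removal.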

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.1
Model size: 315M parameters (safetensors, F32)

Model tree: asr-africa/wav2vec2-xls-r-300m-CV-Fleurs-lg-20hrs-v6, fine-tuned from facebook/wav2vec2-xls-r-300m