
wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-10hrs-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m, trained (per the model name) on roughly 10 hours of Swahili speech drawn from Common Voice, FLEURS, AMMI, and ALFFA. It achieves the following results on the evaluation set:

  • Loss: 1.2971
  • WER (word error rate): 0.3556
  • CER (character error rate): 0.1044
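
The card itself includes no usage snippet; the following is a minimal inference sketch assuming the standard Transformers CTC API. The repo id is the one published on the Hub (asr-africa/wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-10hrs-v1); the audio path is a placeholder, and XLS-R checkpoints expect 16 kHz mono input:

```python
# Minimal inference sketch; "audio.wav" is a placeholder path.
import torch
import torchaudio
from transformers import AutoModelForCTC, AutoProcessor

model_id = "asr-africa/wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-10hrs-v1"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id).eval()

waveform, sample_rate = torchaudio.load("audio.wav")  # placeholder file
if sample_rate != 16_000:
    # resample to the 16 kHz rate the model was pretrained/fine-tuned at
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000,
                   return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# greedy CTC decoding: per-frame argmax, then collapse repeats and blanks
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```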

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
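
For reference, a minimal sketch of how these values map onto transformers.TrainingArguments. The output_dir is a placeholder, and the dataset/Trainer wiring is not shown; this is an illustration of the listed configuration, not the authors' actual script:

```python
# Sketch of the listed hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-sw-10hrs",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch",            # AdamW with default betas/epsilon
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed precision
)
```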

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    | CER    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 7.0482        | 0.9986  | 360   | 2.9509          | 1.0    | 1.0    |
| 5.5647        | 2.0     | 721   | 2.1843          | 1.0    | 0.6661 |
| 3.4187        | 2.9986  | 1081  | 1.3719          | 0.9273 | 0.3550 |
| 2.5943        | 4.0     | 1442  | 1.1633          | 0.8977 | 0.2827 |
| 2.1874        | 4.9986  | 1802  | 0.9978          | 0.7963 | 0.2396 |
| 1.8944        | 6.0     | 2163  | 0.8786          | 0.7200 | 0.2105 |
| 1.6523        | 6.9986  | 2523  | 0.8251          | 0.6626 | 0.1915 |
| 1.4683        | 8.0     | 2884  | 0.7958          | 0.6460 | 0.1890 |
| 1.3358        | 8.9986  | 3244  | 0.7706          | 0.6021 | 0.1754 |
| 1.1904        | 10.0    | 3605  | 0.7385          | 0.5992 | 0.1751 |
| 1.0935        | 10.9986 | 3965  | 0.7722          | 0.5620 | 0.1655 |
| 1.0084        | 12.0    | 4326  | 0.7509          | 0.5715 | 0.1697 |
| 0.9096        | 12.9986 | 4686  | 0.7622          | 0.5253 | 0.1503 |
| 0.8377        | 14.0    | 5047  | 0.7561          | 0.5150 | 0.1496 |
| 0.7773        | 14.9986 | 5407  | 0.7909          | 0.5108 | 0.1511 |
| 0.7337        | 16.0    | 5768  | 0.7987          | 0.5079 | 0.1481 |
| 0.6904        | 16.9986 | 6128  | 0.8391          | 0.4989 | 0.1426 |
| 0.6381        | 18.0    | 6489  | 0.8506          | 0.5009 | 0.1422 |
| 0.6163        | 18.9986 | 6849  | 0.8754          | 0.4815 | 0.1392 |
| 0.5792        | 20.0    | 7210  | 0.9556          | 0.4788 | 0.1373 |
| 0.5665        | 20.9986 | 7570  | 0.8177          | 0.4739 | 0.1371 |
| 0.5387        | 22.0    | 7931  | 0.8381          | 0.4704 | 0.1375 |
| 0.5125        | 22.9986 | 8291  | 0.8817          | 0.4589 | 0.1323 |
| 0.4893        | 24.0    | 8652  | 0.9227          | 0.4590 | 0.1344 |
| 0.4702        | 24.9986 | 9012  | 1.0008          | 0.4470 | 0.1294 |
| 0.4583        | 26.0    | 9373  | 0.9699          | 0.4488 | 0.1298 |
| 0.448         | 26.9986 | 9733  | 1.0021          | 0.4525 | 0.1320 |
| 0.4358        | 28.0    | 10094 | 0.9996          | 0.4459 | 0.1313 |
| 0.42          | 28.9986 | 10454 | 0.9673          | 0.4434 | 0.1273 |
| 0.3913        | 30.0    | 10815 | 1.0123          | 0.4410 | 0.1272 |
| 0.4023        | 30.9986 | 11175 | 1.0258          | 0.4308 | 0.1258 |
| 0.3793        | 32.0    | 11536 | 1.0015          | 0.4334 | 0.1269 |
| 0.3716        | 32.9986 | 11896 | 1.0227          | 0.4370 | 0.1273 |
| 0.3668        | 34.0    | 12257 | 1.0355          | 0.4283 | 0.1251 |
| 0.3442        | 34.9986 | 12617 | 1.0020          | 0.4264 | 0.1235 |
| 0.3392        | 36.0    | 12978 | 1.0422          | 0.4214 | 0.1238 |
| 0.3229        | 36.9986 | 13338 | 1.0545          | 0.4219 | 0.1236 |
| 0.311         | 38.0    | 13699 | 1.0842          | 0.4227 | 0.1242 |
| 0.3046        | 38.9986 | 14059 | 1.0755          | 0.4168 | 0.1208 |
| 0.2966        | 40.0    | 14420 | 1.0766          | 0.4206 | 0.1239 |
| 0.289         | 40.9986 | 14780 | 1.0696          | 0.4119 | 0.1209 |
| 0.2801        | 42.0    | 15141 | 1.0653          | 0.4127 | 0.1215 |
| 0.2845        | 42.9986 | 15501 | 1.1241          | 0.4138 | 0.1215 |
| 0.2795        | 44.0    | 15862 | 1.1176          | 0.4077 | 0.1195 |
| 0.2742        | 44.9986 | 16222 | 1.0823          | 0.4092 | 0.1198 |
| 0.2772        | 46.0    | 16583 | 1.0684          | 0.4111 | 0.1214 |
| 0.2607        | 46.9986 | 16943 | 1.1543          | 0.4055 | 0.1191 |
| 0.2565        | 48.0    | 17304 | 1.1207          | 0.4042 | 0.1186 |
| 0.2397        | 48.9986 | 17664 | 1.1324          | 0.4023 | 0.1184 |
| 0.2439        | 50.0    | 18025 | 1.0794          | 0.4058 | 0.1192 |
| 0.2393        | 50.9986 | 18385 | 1.1105          | 0.4002 | 0.1177 |
| 0.2283        | 52.0    | 18746 | 1.1724          | 0.3938 | 0.1162 |
| 0.226         | 52.9986 | 19106 | 1.1660          | 0.3975 | 0.1175 |
| 0.2272        | 54.0    | 19467 | 1.1530          | 0.3959 | 0.1162 |
| 0.2169        | 54.9986 | 19827 | 1.1406          | 0.3951 | 0.1150 |
| 0.2186        | 56.0    | 20188 | 1.1512          | 0.3952 | 0.1164 |
| 0.2221        | 56.9986 | 20548 | 1.1636          | 0.3945 | 0.1159 |
| 0.2113        | 58.0    | 20909 | 1.1598          | 0.3935 | 0.1151 |
| 0.2033        | 58.9986 | 21269 | 1.1667          | 0.3929 | 0.1154 |
| 0.201         | 60.0    | 21630 | 1.1330          | 0.3902 | 0.1142 |
| 0.2049        | 60.9986 | 21990 | 1.1746          | 0.3905 | 0.1138 |
| 0.2043        | 62.0    | 22351 | 1.1990          | 0.3869 | 0.1133 |
| 0.1945        | 62.9986 | 22711 | 1.1931          | 0.3900 | 0.1138 |
| 0.1876        | 64.0    | 23072 | 1.1713          | 0.3833 | 0.1131 |
| 0.1842        | 64.9986 | 23432 | 1.1645          | 0.3832 | 0.1127 |
| 0.1842        | 66.0    | 23793 | 1.1445          | 0.3844 | 0.1129 |
| 0.1853        | 66.9986 | 24153 | 1.1917          | 0.3853 | 0.1125 |
| 0.1745        | 68.0    | 24514 | 1.1785          | 0.3796 | 0.1121 |
| 0.1703        | 68.9986 | 24874 | 1.1752          | 0.3815 | 0.1114 |
| 0.1711        | 70.0    | 25235 | 1.1820          | 0.3784 | 0.1107 |
| 0.1687        | 70.9986 | 25595 | 1.1864          | 0.3788 | 0.1105 |
| 0.1632        | 72.0    | 25956 | 1.2335          | 0.3758 | 0.1097 |
| 0.1613        | 72.9986 | 26316 | 1.1948          | 0.3753 | 0.1099 |
| 0.1551        | 74.0    | 26677 | 1.2681          | 0.3767 | 0.1098 |
| 0.1606        | 74.9986 | 27037 | 1.2648          | 0.3738 | 0.1098 |
| 0.1445        | 76.0    | 27398 | 1.2243          | 0.3739 | 0.1089 |
| 0.1562        | 76.9986 | 27758 | 1.2221          | 0.3718 | 0.1089 |
| 0.1457        | 78.0    | 28119 | 1.2586          | 0.3709 | 0.1081 |
| 0.1431        | 78.9986 | 28479 | 1.2694          | 0.3731 | 0.1093 |
| 0.138         | 80.0    | 28840 | 1.2734          | 0.3707 | 0.1092 |
| 0.1382        | 80.9986 | 29200 | 1.2558          | 0.3668 | 0.1078 |
| 0.1389        | 82.0    | 29561 | 1.2251          | 0.3683 | 0.1088 |
| 0.1365        | 82.9986 | 29921 | 1.2232          | 0.3663 | 0.1081 |
| 0.13          | 84.0    | 30282 | 1.2646          | 0.3663 | 0.1079 |
| 0.1302        | 84.9986 | 30642 | 1.2837          | 0.3649 | 0.1073 |
| 0.1262        | 86.0    | 31003 | 1.2755          | 0.3636 | 0.1063 |
| 0.1276        | 86.9986 | 31363 | 1.2917          | 0.3637 | 0.1068 |
| 0.1217        | 88.0    | 31724 | 1.2952          | 0.3623 | 0.1064 |
| 0.1228        | 88.9986 | 32084 | 1.2751          | 0.3602 | 0.1057 |
| 0.1196        | 90.0    | 32445 | 1.2764          | 0.3606 | 0.1056 |
| 0.1214        | 90.9986 | 32805 | 1.2727          | 0.3603 | 0.1057 |
| 0.1212        | 92.0    | 33166 | 1.2687          | 0.3582 | 0.1051 |
| 0.1157        | 92.9986 | 33526 | 1.2731          | 0.3576 | 0.1050 |
| 0.1134        | 94.0    | 33887 | 1.2842          | 0.3580 | 0.1052 |
| 0.1119        | 94.9986 | 34247 | 1.3028          | 0.3572 | 0.1051 |
| 0.1132        | 96.0    | 34608 | 1.2819          | 0.3562 | 0.1047 |
| 0.1117        | 96.9986 | 34968 | 1.2993          | 0.3561 | 0.1047 |
| 0.1078        | 98.0    | 35329 | 1.3051          | 0.3547 | 0.1045 |
| 0.1088        | 98.9986 | 35689 | 1.2980          | 0.3554 | 0.1043 |
| 0.1066        | 99.8613 | 36000 | 1.2971          | 0.3556 | 0.1044 |
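
The WER and CER columns follow the usual edit-distance definitions over words and characters. A minimal sketch of computing them with the Hugging Face evaluate library (which wraps jiwer); the strings below are hypothetical stand-ins for the model's transcripts and the ground-truth references:

```python
# Sketch: computing WER/CER with the `evaluate` library (requires `jiwer`).
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

predictions = ["habari ya asubuhi"]  # hypothetical model output
references = ["habari za asubuhi"]   # hypothetical reference text

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```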

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.1