# wavlm-base-plus-ft-cv3
This model is a fine-tuned version of microsoft/wavlm-base-plus on the English subset of the mozilla-foundation/common_voice_3_0 dataset. The "train" and "validation" splits are used for training, while the "test" split is used for validation. It achieves the following results on the validation set:
- Loss: 0.4365
- Wer: 0.1801
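For orientation, here is a minimal inference sketch (not part of the original card). It assumes the checkpoint follows the standard transformers CTC layout with a bundled processor and expects 16 kHz mono audio; the audio file name is a placeholder:

```python
import torch
import torchaudio
from transformers import AutoModelForCTC, AutoProcessor

model_id = "danieleV9H/wavlm-base-plus-ft-cv3"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# Load a clip and resample to the 16 kHz rate the model expects.
waveform, sample_rate = torchaudio.load("sample.wav")  # placeholder path
speech = torchaudio.functional.resample(waveform, sample_rate, 16_000).squeeze(0)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, frames, vocab)

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```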
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
The model is trained on the English subset of mozilla-foundation/common_voice_3_0, combining the "train" and "validation" splits; the "test" split is held out and used as the validation set reported above. A loading sketch is shown below.
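The sketch below shows one way to load these splits with the datasets library. The "en" config name and the authentication requirement are assumptions about the gated Common Voice dataset, not details taken from this card:

```python
from datasets import load_dataset

# Common Voice datasets are gated on the Hub, so an access token is typically
# required; use_auth_token=True picks up a cached `huggingface-cli login`.
train_ds = load_dataset(
    "mozilla-foundation/common_voice_3_0", "en",
    split="train+validation", use_auth_token=True,
)
eval_ds = load_dataset(
    "mozilla-foundation/common_voice_3_0", "en",
    split="test", use_auth_token=True,
)
```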
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 11
- mixed_precision_training: Native AMP
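As a rough guide only, these settings map onto transformers `TrainingArguments` as sketched below. The output directory and the 500-step evaluation interval are assumptions (the interval is inferred from the results table), and the full Trainer wiring (data collator, processor, model) is not included in this card:

```python
from transformers import TrainingArguments

# Sketch mirroring the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="wavlm-base-plus-ft-cv3",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=11,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="steps",
    eval_steps=500,   # matches the 500-step interval in the results table
    logging_steps=500,
)
```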
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
5.3448 | 0.05 | 500 | 3.2621 | 1.0 |
2.9322 | 0.1 | 1000 | 2.8551 | 1.0 |
1.7692 | 0.16 | 1500 | 1.2653 | 0.7447 |
1.012 | 0.21 | 2000 | 0.9008 | 0.5601 |
0.7129 | 0.26 | 2500 | 0.7684 | 0.4762 |
0.6424 | 0.31 | 3000 | 0.6282 | 0.4276 |
0.6518 | 0.37 | 3500 | 0.5888 | 0.3916 |
0.5142 | 0.42 | 4000 | 0.5428 | 0.3727 |
0.48 | 0.47 | 4500 | 0.5614 | 0.3549 |
0.4523 | 0.52 | 5000 | 0.5334 | 0.3487 |
0.4315 | 0.58 | 5500 | 0.5376 | 0.3317 |
0.4292 | 0.63 | 6000 | 0.4939 | 0.3172 |
0.4229 | 0.68 | 6500 | 0.4977 | 0.3117 |
0.3837 | 0.73 | 7000 | 0.4899 | 0.3056 |
0.385 | 0.78 | 7500 | 0.4571 | 0.2864 |
0.4155 | 0.84 | 8000 | 0.4635 | 0.2866 |
0.3768 | 0.89 | 8500 | 0.4390 | 0.2843 |
0.3864 | 0.94 | 9000 | 0.4529 | 0.2764 |
0.387 | 0.99 | 9500 | 0.4870 | 0.2755 |
0.341 | 1.05 | 10000 | 0.4498 | 0.2696 |
0.3334 | 1.1 | 10500 | 0.4355 | 0.2600 |
0.3039 | 1.15 | 11000 | 0.4634 | 0.2716 |
0.3101 | 1.2 | 11500 | 0.4615 | 0.2582 |
0.4343 | 1.25 | 12000 | 0.4510 | 0.2574 |
0.3002 | 1.31 | 12500 | 0.4313 | 0.2590 |
0.3419 | 1.36 | 13000 | 0.4121 | 0.2493 |
0.3162 | 1.41 | 13500 | 0.4423 | 0.2498 |
0.3134 | 1.46 | 14000 | 0.4260 | 0.2506 |
0.2963 | 1.52 | 14500 | 0.4272 | 0.2556 |
0.3297 | 1.57 | 15000 | 0.4413 | 0.2487 |
0.3199 | 1.62 | 15500 | 0.4260 | 0.2432 |
0.3368 | 1.67 | 16000 | 0.4164 | 0.2464 |
0.2981 | 1.73 | 16500 | 0.4111 | 0.2402 |
0.2887 | 1.78 | 17000 | 0.4372 | 0.2460 |
0.3058 | 1.83 | 17500 | 0.4161 | 0.2397 |
0.2877 | 1.88 | 18000 | 0.4046 | 0.2386 |
0.2904 | 1.93 | 18500 | 0.4108 | 0.2399 |
0.2851 | 1.99 | 19000 | 0.4196 | 0.2385 |
0.2451 | 2.04 | 19500 | 0.4096 | 0.2406 |
0.259 | 2.09 | 20000 | 0.4437 | 0.2374 |
0.2681 | 2.14 | 20500 | 0.4226 | 0.2357 |
0.4371 | 2.2 | 21000 | 0.4301 | 0.2356 |
0.2468 | 2.25 | 21500 | 0.4431 | 0.2326 |
0.2687 | 2.3 | 22000 | 0.4218 | 0.2401 |
0.2571 | 2.35 | 22500 | 0.4131 | 0.2337 |
0.2541 | 2.41 | 23000 | 0.4105 | 0.2312 |
0.2663 | 2.46 | 23500 | 0.4228 | 0.2327 |
0.2777 | 2.51 | 24000 | 0.3960 | 0.2254 |
0.2659 | 2.56 | 24500 | 0.4074 | 0.2289 |
0.2519 | 2.61 | 25000 | 0.4220 | 0.2363 |
0.2607 | 2.67 | 25500 | 0.3912 | 0.2253 |
0.2749 | 2.72 | 26000 | 0.4017 | 0.2214 |
0.2431 | 2.77 | 26500 | 0.3879 | 0.2181 |
0.2557 | 2.82 | 27000 | 0.4011 | 0.2268 |
0.2662 | 2.88 | 27500 | 0.3884 | 0.2241 |
0.2649 | 2.93 | 28000 | 0.3987 | 0.2233 |
0.2382 | 2.98 | 28500 | 0.3777 | 0.2215 |
0.2198 | 3.03 | 29000 | 0.3952 | 0.2177 |
0.2281 | 3.09 | 29500 | 0.4067 | 0.2213 |
0.2178 | 3.14 | 30000 | 0.4178 | 0.2192 |
0.222 | 3.19 | 30500 | 0.4327 | 0.2208 |
0.2262 | 3.24 | 31000 | 0.4028 | 0.2212 |
0.2256 | 3.29 | 31500 | 0.4065 | 0.2181 |
0.2255 | 3.35 | 32000 | 0.3782 | 0.2139 |
0.2364 | 3.4 | 32500 | 0.4443 | 0.2119 |
0.2209 | 3.45 | 33000 | 0.4089 | 0.2177 |
0.2051 | 3.5 | 33500 | 0.3886 | 0.2154 |
0.2242 | 3.56 | 34000 | 0.3810 | 0.2133 |
0.2151 | 3.61 | 34500 | 0.4005 | 0.2127 |
0.2341 | 3.66 | 35000 | 0.3899 | 0.2165 |
0.202 | 3.71 | 35500 | 0.3846 | 0.2121 |
0.2107 | 3.76 | 36000 | 0.3859 | 0.2146 |
0.2237 | 3.82 | 36500 | 0.3993 | 0.2141 |
0.2189 | 3.87 | 37000 | 0.3842 | 0.2113 |
0.2124 | 3.92 | 37500 | 0.3919 | 0.2118 |
0.4017 | 3.97 | 38000 | 0.3882 | 0.2086 |
0.1946 | 4.03 | 38500 | 0.4008 | 0.2121 |
0.1919 | 4.08 | 39000 | 0.3939 | 0.2129 |
0.1797 | 4.13 | 39500 | 0.3958 | 0.2115 |
0.184 | 4.18 | 40000 | 0.3942 | 0.2086 |
0.1987 | 4.24 | 40500 | 0.3959 | 0.2092 |
0.1919 | 4.29 | 41000 | 0.4250 | 0.2093 |
0.2038 | 4.34 | 41500 | 0.3970 | 0.2060 |
0.1879 | 4.39 | 42000 | 0.3978 | 0.2109 |
0.1852 | 4.44 | 42500 | 0.4065 | 0.2091 |
0.2014 | 4.5 | 43000 | 0.4069 | 0.2054 |
0.2011 | 4.55 | 43500 | 0.4247 | 0.2099 |
0.1937 | 4.6 | 44000 | 0.3754 | 0.2091 |
0.1878 | 4.65 | 44500 | 0.3891 | 0.2070 |
0.2011 | 4.71 | 45000 | 0.3714 | 0.2030 |
0.1958 | 4.76 | 45500 | 0.3994 | 0.2066 |
0.1907 | 4.81 | 46000 | 0.4061 | 0.2080 |
0.1859 | 4.86 | 46500 | 0.3899 | 0.2056 |
0.1894 | 4.92 | 47000 | 0.3808 | 0.2055 |
0.3276 | 4.97 | 47500 | 0.3936 | 0.2051 |
0.3513 | 5.02 | 48000 | 0.4028 | 0.2041 |
0.1654 | 5.07 | 48500 | 0.3929 | 0.2032 |
0.1622 | 5.12 | 49000 | 0.4067 | 0.2029 |
0.1659 | 5.18 | 49500 | 0.4058 | 0.2007 |
0.1779 | 5.23 | 50000 | 0.4085 | 0.2031 |
0.1731 | 5.28 | 50500 | 0.3895 | 0.2009 |
0.1761 | 5.33 | 51000 | 0.3973 | 0.2022 |
0.1741 | 5.39 | 51500 | 0.4116 | 0.2021 |
0.1735 | 5.44 | 52000 | 0.4152 | 0.2038 |
0.1627 | 5.49 | 52500 | 0.4078 | 0.2003 |
0.1728 | 5.54 | 53000 | 0.4088 | 0.2022 |
0.179 | 5.6 | 53500 | 0.3828 | 0.1998 |
0.1692 | 5.65 | 54000 | 0.3903 | 0.1980 |
0.174 | 5.7 | 54500 | 0.4185 | 0.1993 |
0.1763 | 5.75 | 55000 | 0.3937 | 0.1976 |
0.1792 | 5.8 | 55500 | 0.3767 | 0.1966 |
0.1799 | 5.86 | 56000 | 0.3970 | 0.1994 |
0.1918 | 5.91 | 56500 | 0.3954 | 0.1981 |
0.1836 | 5.96 | 57000 | 0.3984 | 0.1969 |
0.1708 | 6.01 | 57500 | 0.3917 | 0.1956 |
0.1524 | 6.07 | 58000 | 0.3922 | 0.1977 |
0.1567 | 6.12 | 58500 | 0.4108 | 0.1955 |
0.1518 | 6.17 | 59000 | 0.4349 | 0.1968 |
0.1587 | 6.22 | 59500 | 0.3963 | 0.1988 |
0.1563 | 6.27 | 60000 | 0.4235 | 0.1997 |
0.154 | 6.33 | 60500 | 0.4026 | 0.1951 |
0.1636 | 6.38 | 61000 | 0.4359 | 0.2031 |
0.1641 | 6.43 | 61500 | 0.4115 | 0.1972 |
0.1604 | 6.48 | 62000 | 0.4166 | 0.1972 |
0.1579 | 6.54 | 62500 | 0.4264 | 0.1965 |
0.1552 | 6.59 | 63000 | 0.4047 | 0.2007 |
0.1461 | 6.64 | 63500 | 0.4263 | 0.2011 |
0.1522 | 6.69 | 64000 | 0.4222 | 0.1970 |
0.1624 | 6.75 | 64500 | 0.4318 | 0.1971 |
0.1474 | 6.8 | 65000 | 0.4265 | 0.1961 |
0.1495 | 6.85 | 65500 | 0.4316 | 0.1940 |
0.1509 | 6.9 | 66000 | 0.4297 | 0.1965 |
0.1479 | 6.95 | 66500 | 0.4232 | 0.1966 |
0.1462 | 7.01 | 67000 | 0.4090 | 0.1946 |
0.1498 | 7.06 | 67500 | 0.4197 | 0.1939 |
0.1436 | 7.11 | 68000 | 0.4215 | 0.1956 |
0.1378 | 7.16 | 68500 | 0.4345 | 0.1968 |
0.3082 | 7.22 | 69000 | 0.4364 | 0.1972 |
0.1386 | 7.27 | 69500 | 0.4284 | 0.1949 |
0.1441 | 7.32 | 70000 | 0.4019 | 0.1953 |
0.1624 | 7.37 | 70500 | 0.4175 | 0.1951 |
0.1454 | 7.43 | 71000 | 0.4224 | 0.1922 |
0.1408 | 7.48 | 71500 | 0.4128 | 0.1961 |
0.1525 | 7.53 | 72000 | 0.4200 | 0.1946 |
0.1459 | 7.58 | 72500 | 0.4166 | 0.1949 |
0.1485 | 7.63 | 73000 | 0.4102 | 0.1947 |
0.148 | 7.69 | 73500 | 0.4237 | 0.1948 |
0.1478 | 7.74 | 74000 | 0.4104 | 0.1928 |
0.14 | 7.79 | 74500 | 0.4027 | 0.1928 |
0.1473 | 7.84 | 75000 | 0.4034 | 0.1907 |
0.1394 | 7.9 | 75500 | 0.3823 | 0.1923 |
0.1324 | 7.95 | 76000 | 0.3987 | 0.1899 |
0.1459 | 8.0 | 76500 | 0.4003 | 0.1907 |
0.1373 | 8.05 | 77000 | 0.4204 | 0.1925 |
0.1303 | 8.1 | 77500 | 0.4218 | 0.1907 |
0.1346 | 8.16 | 78000 | 0.4091 | 0.1882 |
0.2947 | 8.21 | 78500 | 0.4156 | 0.1890 |
0.1324 | 8.26 | 79000 | 0.4280 | 0.1888 |
0.132 | 8.31 | 79500 | 0.4136 | 0.1873 |
0.1377 | 8.37 | 80000 | 0.4099 | 0.1915 |
0.3045 | 8.42 | 80500 | 0.4201 | 0.1900 |
0.1372 | 8.47 | 81000 | 0.4161 | 0.1876 |
0.1377 | 8.52 | 81500 | 0.4107 | 0.1869 |
0.1374 | 8.58 | 82000 | 0.4188 | 0.1875 |
0.1301 | 8.63 | 82500 | 0.4306 | 0.1860 |
0.1386 | 8.68 | 83000 | 0.4131 | 0.1862 |
0.1292 | 8.73 | 83500 | 0.3997 | 0.1871 |
0.1276 | 8.78 | 84000 | 0.4237 | 0.1873 |
0.1377 | 8.84 | 84500 | 0.4284 | 0.1889 |
0.1338 | 8.89 | 85000 | 0.4205 | 0.1861 |
0.1284 | 8.94 | 85500 | 0.4380 | 0.1875 |
0.1471 | 8.99 | 86000 | 0.4238 | 0.1895 |
0.1186 | 9.05 | 86500 | 0.4128 | 0.1875 |
0.1222 | 9.1 | 87000 | 0.4267 | 0.1864 |
0.1229 | 9.15 | 87500 | 0.4169 | 0.1842 |
0.1259 | 9.2 | 88000 | 0.4327 | 0.1861 |
0.1281 | 9.26 | 88500 | 0.4188 | 0.1877 |
0.1247 | 9.31 | 89000 | 0.4212 | 0.1852 |
0.1248 | 9.36 | 89500 | 0.4172 | 0.1863 |
0.1232 | 9.41 | 90000 | 0.4173 | 0.1858 |
0.3255 | 9.46 | 90500 | 0.4225 | 0.1851 |
0.1243 | 9.52 | 91000 | 0.4290 | 0.1849 |
0.1266 | 9.57 | 91500 | 0.4186 | 0.1842 |
0.1257 | 9.62 | 92000 | 0.4364 | 0.1860 |
0.1181 | 9.67 | 92500 | 0.4294 | 0.1852 |
0.1202 | 9.73 | 93000 | 0.4222 | 0.1836 |
0.1264 | 9.78 | 93500 | 0.4191 | 0.1856 |
0.1243 | 9.83 | 94000 | 0.4237 | 0.1856 |
0.1164 | 9.88 | 94500 | 0.4281 | 0.1848 |
0.1283 | 9.94 | 95000 | 0.4332 | 0.1845 |
0.123 | 9.99 | 95500 | 0.4316 | 0.1839 |
0.1232 | 10.04 | 96000 | 0.4313 | 0.1844 |
0.1206 | 10.09 | 96500 | 0.4303 | 0.1840 |
0.1145 | 10.14 | 97000 | 0.4299 | 0.1822 |
0.1265 | 10.2 | 97500 | 0.4266 | 0.1822 |
0.1147 | 10.25 | 98000 | 0.4322 | 0.1844 |
0.1122 | 10.3 | 98500 | 0.4251 | 0.1830 |
0.1101 | 10.35 | 99000 | 0.4297 | 0.1830 |
0.1225 | 10.41 | 99500 | 0.4244 | 0.1842 |
0.1177 | 10.46 | 100000 | 0.4343 | 0.1826 |
0.1157 | 10.51 | 100500 | 0.4228 | 0.1827 |
0.1215 | 10.56 | 101000 | 0.4285 | 0.1814 |
0.276 | 10.61 | 101500 | 0.4268 | 0.1820 |
0.111 | 10.67 | 102000 | 0.4288 | 0.1836 |
0.1164 | 10.72 | 102500 | 0.4283 | 0.1825 |
0.111 | 10.77 | 103000 | 0.4198 | 0.1819 |
0.1135 | 10.82 | 103500 | 0.4333 | 0.1818 |
0.1196 | 10.88 | 104000 | 0.4239 | 0.1817 |
0.1176 | 10.93 | 104500 | 0.4252 | 0.1819 |
0.117 | 10.98 | 105000 | 0.4317 | 0.1820 |
0.1166 | 11.03 | 105500 | 0.4307 | 0.1815 |
0.1118 | 11.09 | 106000 | 0.4379 | 0.1821 |
0.1116 | 11.14 | 106500 | 0.4363 | 0.1812 |
0.1098 | 11.19 | 107000 | 0.4328 | 0.1816 |
0.1134 | 11.24 | 107500 | 0.4284 | 0.1811 |
0.1104 | 11.29 | 108000 | 0.4365 | 0.1801 |
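The WER column above is presumably produced by a `compute_metrics` callback during evaluation. Below is a minimal sketch using the evaluate library; the `processor` object (as in the inference sketch above) and the -100 label-masking convention are assumptions based on the standard CTC fine-tuning recipe, not details from this card:

```python
import numpy as np
import evaluate

wer_metric = evaluate.load("wer")

def compute_metrics(pred):
    # Greedy decode of the model logits.
    pred_ids = np.argmax(pred.predictions, axis=-1)
    # Trainer masks padded label positions with -100; restore the pad token
    # so the tokenizer can decode the reference transcripts.
    pred.label_ids[pred.label_ids == -100] = processor.tokenizer.pad_token_id
    pred_str = processor.batch_decode(pred_ids)
    # Labels are plain character sequences, so don't apply CTC grouping.
    label_str = processor.batch_decode(pred.label_ids, group_tokens=False)
    return {"wer": wer_metric.compute(predictions=pred_str, references=label_str)}
```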
### Framework versions
- Transformers 4.19.2
- Pytorch 1.11.0+cu113
- Datasets 2.2.0
- Tokenizers 0.12.1
## Evaluation results
- Test WER on the LibriSpeech (clean) test set (self-reported): 8.06