---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-large-xls-r-300m-lg-cv-1hr-vr
  results: []
---

# wav2vec2-large-xls-r-300m-lg-cv-1hr-vr

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.4686
- Model Preparation Time: 0.0074
- Wer: 0.9247
- Cer: 0.2572
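
For quick experimentation, the checkpoint can be loaded through the standard `transformers` ASR pipeline. The sketch below is illustrative only: the repo id `KasuleTrevor/wav2vec2-large-xls-r-300m-lg-cv-1hr-vr` is assumed from the model name, and the audio path is a placeholder.

```python
# Minimal inference sketch; the repo id below is an assumption, not confirmed by this card.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="KasuleTrevor/wav2vec2-large-xls-r-300m-lg-cv-1hr-vr",  # assumed repo id
)

# wav2vec2-xls-r expects 16 kHz mono audio; "sample.wav" is a placeholder path.
print(asr("sample.wav")["text"])
```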

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 150
- mixed_precision_training: Native AMP
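
These settings map one-to-one onto `transformers.TrainingArguments`. The sketch below is a hedged reconstruction, not the original training script; the output directory name is an assumption, and the total train batch size of 32 follows from 16 x 2 gradient accumulation.

```python
# Hedged reconstruction of the hyperparameters above; not the author's actual script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-lg-cv-1hr-vr",  # assumed directory name
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # total train batch size: 16 * 2 = 32
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=150,
    fp16=True,                       # "Native AMP" mixed precision
)
```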

### Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:----------------------:|:------:|:------:|
| 15.4113 | 1.0 | 18 | 13.3832 | 0.0074 | 1.0 | 1.0 |
| 8.4071 | 2.0 | 36 | 5.0022 | 0.0074 | 1.0 | 1.0 |
| 4.3781 | 3.0 | 54 | 3.8902 | 0.0074 | 1.0 | 1.0 |
| 3.7341 | 4.0 | 72 | 3.5371 | 0.0074 | 1.0 | 1.0 |
| 3.4503 | 5.0 | 90 | 3.3256 | 0.0074 | 1.0 | 1.0 |
| 3.268 | 6.0 | 108 | 3.1846 | 0.0074 | 1.0 | 1.0 |
| 3.1464 | 7.0 | 126 | 3.0891 | 0.0074 | 1.0 | 1.0 |
| 3.0601 | 8.0 | 144 | 3.0319 | 0.0074 | 1.0 | 1.0 |
| 3.0119 | 9.0 | 162 | 2.9856 | 0.0074 | 1.0 | 1.0 |
| 2.9756 | 10.0 | 180 | 2.9687 | 0.0074 | 1.0 | 1.0 |
| 2.9535 | 11.0 | 198 | 2.9495 | 0.0074 | 1.0 | 1.0 |
| 2.9376 | 12.0 | 216 | 2.9651 | 0.0074 | 1.0 | 1.0 |
| 2.9256 | 13.0 | 234 | 2.9529 | 0.0074 | 1.0 | 1.0 |
| 2.9159 | 14.0 | 252 | 2.9260 | 0.0074 | 1.0 | 1.0 |
| 2.9046 | 15.0 | 270 | 2.9180 | 0.0074 | 1.0 | 1.0 |
| 2.8973 | 16.0 | 288 | 2.9079 | 0.0074 | 1.0 | 1.0 |
| 2.8891 | 17.0 | 306 | 2.9027 | 0.0074 | 1.0 | 0.9906 |
| 2.8793 | 18.0 | 324 | 2.8907 | 0.0074 | 1.0 | 0.9882 |
| 2.8616 | 19.0 | 342 | 2.8788 | 0.0074 | 1.0 | 0.9625 |
| 2.8416 | 20.0 | 360 | 2.8412 | 0.0074 | 1.0 | 0.9559 |
| 2.7586 | 21.0 | 378 | 2.7118 | 0.0074 | 1.0 | 0.8715 |
| 2.6084 | 22.0 | 396 | 2.5263 | 0.0074 | 1.0 | 0.8093 |
| 2.4068 | 23.0 | 414 | 2.3225 | 0.0074 | 1.0005 | 0.7330 |
| 2.1542 | 24.0 | 432 | 2.0424 | 0.0074 | 1.0025 | 0.5160 |
| 1.8099 | 25.0 | 450 | 1.7306 | 0.0074 | 1.0203 | 0.4231 |
| 1.464 | 26.0 | 468 | 1.4744 | 0.0074 | 0.9682 | 0.3198 |
| 1.1766 | 27.0 | 486 | 1.3177 | 0.0074 | 0.9476 | 0.2827 |
| 0.9688 | 28.0 | 504 | 1.2492 | 0.0074 | 0.9482 | 0.2704 |
| 0.8301 | 29.0 | 522 | 1.2105 | 0.0074 | 0.9837 | 0.2689 |
| 0.7473 | 30.0 | 540 | 1.1642 | 0.0074 | 0.9340 | 0.2571 |
| 0.6502 | 31.0 | 558 | 1.1645 | 0.0074 | 0.9358 | 0.2542 |
| 0.5898 | 32.0 | 576 | 1.1443 | 0.0074 | 0.9389 | 0.2534 |
| 0.5306 | 33.0 | 594 | 1.1486 | 0.0074 | 0.9363 | 0.2589 |
| 0.5048 | 34.0 | 612 | 1.1764 | 0.0074 | 0.9586 | 0.2548 |
| 0.4661 | 35.0 | 630 | 1.1654 | 0.0074 | 0.9414 | 0.2595 |
| 0.441 | 36.0 | 648 | 1.1358 | 0.0074 | 0.9401 | 0.2481 |
| 0.4155 | 37.0 | 666 | 1.1788 | 0.0074 | 0.9776 | 0.2563 |
| 0.3924 | 38.0 | 684 | 1.1449 | 0.0074 | 0.9477 | 0.2517 |
| 0.3733 | 39.0 | 702 | 1.1898 | 0.0074 | 0.9537 | 0.2516 |
| 0.3603 | 40.0 | 720 | 1.1532 | 0.0074 | 0.9327 | 0.2468 |
| 0.3459 | 41.0 | 738 | 1.1934 | 0.0074 | 0.9692 | 0.2614 |
| 0.3238 | 42.0 | 756 | 1.1646 | 0.0074 | 0.9384 | 0.2512 |
| 0.3156 | 43.0 | 774 | 1.1860 | 0.0074 | 0.9586 | 0.2570 |
| 0.3081 | 44.0 | 792 | 1.2123 | 0.0074 | 0.9340 | 0.2545 |
| 0.2857 | 45.0 | 810 | 1.2501 | 0.0074 | 0.9499 | 0.2566 |
| 0.2776 | 46.0 | 828 | 1.1802 | 0.0074 | 0.9281 | 0.2545 |
| 0.2702 | 47.0 | 846 | 1.2074 | 0.0074 | 0.9286 | 0.2567 |
| 0.2693 | 48.0 | 864 | 1.1991 | 0.0074 | 0.9628 | 0.2490 |
| 0.2467 | 49.0 | 882 | 1.2253 | 0.0074 | 0.9289 | 0.2538 |
| 0.2453 | 50.0 | 900 | 1.2126 | 0.0074 | 0.9335 | 0.2506 |
| 0.2455 | 51.0 | 918 | 1.1971 | 0.0074 | 0.9383 | 0.2502 |
| 0.2353 | 52.0 | 936 | 1.2120 | 0.0074 | 0.9175 | 0.2400 |
| 0.2242 | 53.0 | 954 | 1.1997 | 0.0074 | 0.9272 | 0.2405 |
| 0.2142 | 54.0 | 972 | 1.2027 | 0.0074 | 0.9284 | 0.2449 |
| 0.2166 | 55.0 | 990 | 1.2304 | 0.0074 | 0.9251 | 0.2489 |
| 0.209 | 56.0 | 1008 | 1.2486 | 0.0074 | 0.9472 | 0.2487 |
| 0.1989 | 57.0 | 1026 | 1.2205 | 0.0074 | 0.9301 | 0.2456 |
| 0.1988 | 58.0 | 1044 | 1.2305 | 0.0074 | 0.9208 | 0.2428 |
| 0.1951 | 59.0 | 1062 | 1.2508 | 0.0074 | 0.9402 | 0.2449 |
| 0.1906 | 60.0 | 1080 | 1.2638 | 0.0074 | 0.9333 | 0.2474 |
| 0.1869 | 61.0 | 1098 | 1.2625 | 0.0074 | 0.9249 | 0.2438 |
| 0.1812 | 62.0 | 1116 | 1.2536 | 0.0074 | 0.9170 | 0.2473 |
| 0.1822 | 63.0 | 1134 | 1.2459 | 0.0074 | 0.9315 | 0.2434 |
| 0.1732 | 64.0 | 1152 | 1.2313 | 0.0074 | 0.9128 | 0.2435 |
| 0.1723 | 65.0 | 1170 | 1.2817 | 0.0074 | 0.9130 | 0.2497 |
| 0.1748 | 66.0 | 1188 | 1.2766 | 0.0074 | 0.9093 | 0.2388 |
| 0.1667 | 67.0 | 1206 | 1.2803 | 0.0074 | 0.9271 | 0.2498 |
| 0.1652 | 68.0 | 1224 | 1.2886 | 0.0074 | 0.9312 | 0.2439 |
| 0.1566 | 69.0 | 1242 | 1.2814 | 0.0074 | 0.9104 | 0.2414 |
| 0.1512 | 70.0 | 1260 | 1.2557 | 0.0074 | 0.9226 | 0.2432 |
| 0.1455 | 71.0 | 1278 | 1.2875 | 0.0074 | 0.9178 | 0.2455 |
| 0.1548 | 72.0 | 1296 | 1.3325 | 0.0074 | 0.9661 | 0.2468 |
| 0.158 | 73.0 | 1314 | 1.2506 | 0.0074 | 0.9303 | 0.2474 |
| 0.1579 | 74.0 | 1332 | 1.2974 | 0.0074 | 0.9341 | 0.2465 |
| 0.1492 | 75.0 | 1350 | 1.2893 | 0.0074 | 0.9199 | 0.2490 |
| 0.1471 | 76.0 | 1368 | 1.3097 | 0.0074 | 0.9194 | 0.2466 |
| 0.1454 | 77.0 | 1386 | 1.3065 | 0.0074 | 0.9099 | 0.2428 |
| 0.1456 | 78.0 | 1404 | 1.2618 | 0.0074 | 0.9074 | 0.2406 |
| 0.1515 | 79.0 | 1422 | 1.3326 | 0.0074 | 0.9213 | 0.2503 |
| 0.1452 | 80.0 | 1440 | 1.3146 | 0.0074 | 0.9082 | 0.2384 |
| 0.1326 | 81.0 | 1458 | 1.3145 | 0.0074 | 0.9089 | 0.2412 |
| 0.1368 | 82.0 | 1476 | 1.2851 | 0.0074 | 0.9039 | 0.2433 |
| 0.1314 | 83.0 | 1494 | 1.3079 | 0.0074 | 0.9099 | 0.2410 |
| 0.1272 | 84.0 | 1512 | 1.3156 | 0.0074 | 0.9081 | 0.2399 |
| 0.1359 | 85.0 | 1530 | 1.3318 | 0.0074 | 0.9235 | 0.2435 |
| 0.1288 | 86.0 | 1548 | 1.3015 | 0.0074 | 0.9057 | 0.2390 |
| 0.1243 | 87.0 | 1566 | 1.3271 | 0.0074 | 0.9111 | 0.2405 |
| 0.1334 | 88.0 | 1584 | 1.3117 | 0.0074 | 0.9120 | 0.2410 |
| 0.1248 | 89.0 | 1602 | 1.3005 | 0.0074 | 0.9193 | 0.2436 |
| 0.1163 | 90.0 | 1620 | 1.3611 | 0.0074 | 0.9195 | 0.2416 |
| 0.1233 | 91.0 | 1638 | 1.3070 | 0.0074 | 0.9117 | 0.2416 |
| 0.1264 | 92.0 | 1656 | 1.3325 | 0.0074 | 0.9109 | 0.2384 |
| 0.1249 | 93.0 | 1674 | 1.3528 | 0.0074 | 0.9150 | 0.2411 |
| 0.1189 | 94.0 | 1692 | 1.3673 | 0.0074 | 0.9134 | 0.2434 |
| 0.1197 | 95.0 | 1710 | 1.3391 | 0.0074 | 0.9097 | 0.2449 |
| 0.1212 | 96.0 | 1728 | 1.3404 | 0.0074 | 0.9178 | 0.2400 |
| 0.1168 | 97.0 | 1746 | 1.3651 | 0.0074 | 0.9153 | 0.2408 |
| 0.1133 | 98.0 | 1764 | 1.3744 | 0.0074 | 0.9236 | 0.2430 |
| 0.113 | 99.0 | 1782 | 1.3383 | 0.0074 | 0.9000 | 0.2358 |
| 0.1244 | 100.0 | 1800 | 1.3553 | 0.0074 | 0.9028 | 0.2356 |
| 0.1116 | 101.0 | 1818 | 1.3768 | 0.0074 | 0.9136 | 0.2392 |
| 0.1078 | 102.0 | 1836 | 1.3629 | 0.0074 | 0.9116 | 0.2409 |
| 0.1117 | 103.0 | 1854 | 1.3564 | 0.0074 | 0.9276 | 0.2410 |
| 0.1063 | 104.0 | 1872 | 1.3593 | 0.0074 | 0.9146 | 0.2395 |
| 0.1083 | 105.0 | 1890 | 1.3516 | 0.0074 | 0.9053 | 0.2406 |
| 0.1096 | 106.0 | 1908 | 1.3819 | 0.0074 | 0.9338 | 0.2439 |
| 0.1013 | 107.0 | 1926 | 1.3928 | 0.0074 | 0.9266 | 0.2433 |
| 0.1123 | 108.0 | 1944 | 1.3754 | 0.0074 | 0.9200 | 0.2422 |
| 0.1107 | 109.0 | 1962 | 1.3677 | 0.0074 | 0.9108 | 0.2401 |
| 0.1061 | 110.0 | 1980 | 1.3938 | 0.0074 | 0.9139 | 0.2397 |
| 0.1034 | 111.0 | 1998 | 1.3845 | 0.0074 | 0.9349 | 0.2457 |
| 0.1042 | 112.0 | 2016 | 1.3915 | 0.0074 | 0.9184 | 0.2426 |
| 0.1077 | 113.0 | 2034 | 1.3786 | 0.0074 | 0.9124 | 0.2421 |
| 0.0987 | 114.0 | 2052 | 1.3875 | 0.0074 | 0.9183 | 0.2442 |
| 0.1038 | 115.0 | 2070 | 1.3943 | 0.0074 | 0.9266 | 0.2447 |
| 0.0981 | 116.0 | 2088 | 1.3904 | 0.0074 | 0.9224 | 0.2438 |
| 0.0964 | 117.0 | 2106 | 1.4113 | 0.0074 | 0.9144 | 0.2426 |
| 0.0978 | 118.0 | 2124 | 1.4044 | 0.0074 | 0.9184 | 0.2427 |
| 0.0999 | 119.0 | 2142 | 1.3869 | 0.0074 | 0.9257 | 0.2449 |
| 0.1005 | 120.0 | 2160 | 1.3821 | 0.0074 | 0.9192 | 0.2440 |
| 0.1026 | 121.0 | 2178 | 1.3913 | 0.0074 | 0.9188 | 0.2457 |
| 0.0948 | 122.0 | 2196 | 1.3892 | 0.0074 | 0.9229 | 0.2458 |
| 0.0968 | 123.0 | 2214 | 1.4042 | 0.0074 | 0.9192 | 0.2449 |
| 0.0897 | 124.0 | 2232 | 1.4130 | 0.0074 | 0.9172 | 0.2448 |
| 0.1031 | 125.0 | 2250 | 1.4104 | 0.0074 | 0.9189 | 0.2431 |
| 0.094 | 126.0 | 2268 | 1.4125 | 0.0074 | 0.9182 | 0.2434 |
| 0.0897 | 127.0 | 2286 | 1.4098 | 0.0074 | 0.9135 | 0.2436 |
| 0.1005 | 128.0 | 2304 | 1.4042 | 0.0074 | 0.9157 | 0.2432 |
| 0.0923 | 129.0 | 2322 | 1.4027 | 0.0074 | 0.9175 | 0.2429 |
| 0.0994 | 130.0 | 2340 | 1.4018 | 0.0074 | 0.9202 | 0.2442 |
| 0.1006 | 131.0 | 2358 | 1.4007 | 0.0074 | 0.9138 | 0.2429 |
| 0.0927 | 132.0 | 2376 | 1.4001 | 0.0074 | 0.9160 | 0.2424 |
| 0.0907 | 133.0 | 2394 | 1.4063 | 0.0074 | 0.9150 | 0.2416 |
| 0.0964 | 134.0 | 2412 | 1.4093 | 0.0074 | 0.9133 | 0.2421 |
| 0.0923 | 135.0 | 2430 | 1.4123 | 0.0074 | 0.9146 | 0.2423 |
| 0.0991 | 136.0 | 2448 | 1.4134 | 0.0074 | 0.9147 | 0.2432 |
| 0.093 | 137.0 | 2466 | 1.4116 | 0.0074 | 0.9133 | 0.2427 |
| 0.1007 | 138.0 | 2484 | 1.4096 | 0.0074 | 0.9141 | 0.2427 |
| 0.0921 | 139.0 | 2502 | 1.4086 | 0.0074 | 0.9145 | 0.2426 |
| 0.0929 | 140.0 | 2520 | 1.4094 | 0.0074 | 0.9142 | 0.2422 |
| 0.0996 | 141.0 | 2538 | 1.4108 | 0.0074 | 0.9162 | 0.2426 |
| 0.0925 | 142.0 | 2556 | 1.4120 | 0.0074 | 0.9164 | 0.2425 |
| 0.0928 | 143.0 | 2574 | 1.4127 | 0.0074 | 0.9166 | 0.2424 |
| 0.0899 | 144.0 | 2592 | 1.4129 | 0.0074 | 0.9168 | 0.2426 |
| 0.0921 | 145.0 | 2610 | 1.4128 | 0.0074 | 0.9161 | 0.2425 |
| 0.0942 | 146.0 | 2628 | 1.4129 | 0.0074 | 0.9169 | 0.2426 |
| 0.0915 | 147.0 | 2646 | 1.4128 | 0.0074 | 0.9169 | 0.2429 |
| 0.0876 | 148.0 | 2664 | 1.4127 | 0.0074 | 0.9170 | 0.2427 |
| 0.0942 | 149.0 | 2682 | 1.4127 | 0.0074 | 0.9172 | 0.2427 |
| 0.0864 | 150.0 | 2700 | 1.4127 | 0.0074 | 0.9169 | 0.2427 |
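
The Wer and Cer columns are word and character error rates (lower is better; WER can exceed 1.0 when hypotheses contain more errors than reference words, as in epochs 23 to 25 above). A minimal sketch of computing both metrics with the `evaluate` library, using made-up example strings rather than the actual evaluation data:

```python
# Illustrative WER/CER computation; the strings below are toy examples, not real eval data.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["hello word from the model"]  # hypothetical decoded transcript
references = ["hello world from the model"]  # hypothetical ground truth

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```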

### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.0+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1