---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - automatic-speech-recognition
  - DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-xlsr-53-ft-btb-ccv-cy
    results: []
---

wav2vec2-xlsr-53-ft-btb-ccv-cy

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the DewiBrynJones/banc-trawsgrifiadau-bangor-clean-with-ccv (default configuration) dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4511
  • Wer: 0.3591
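
As a usage sketch (not part of the original card), the checkpoint can be loaded with the transformers automatic-speech-recognition pipeline. The Hub model ID DewiBrynJones/wav2vec2-xlsr-53-ft-btb-ccv-cy and the audio file path below are assumptions for illustration.

```python
from transformers import pipeline

# Hypothetical usage sketch: load the fine-tuned checkpoint from the Hub and
# transcribe a local Welsh audio file (the file path is a placeholder).
asr = pipeline(
    "automatic-speech-recognition",
    model="DewiBrynJones/wav2vec2-xlsr-53-ft-btb-ccv-cy",
)
print(asr("sample.wav")["text"])
```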

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 30000
  • mixed_precision_training: Native AMP
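
A minimal sketch of how these values map onto transformers TrainingArguments, assuming the standard Trainer-based fine-tuning setup; the output directory is illustrative, and the model/data wiring is omitted.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the listed hyperparameters; output_dir is a placeholder.
# The Adam betas/epsilon listed above match the TrainingArguments defaults
# (adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8).
training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-53-ft-btb-ccv-cy",
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=30000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```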

Training results

Training Loss Epoch Step Validation Loss Wer
No log 0.0079 200 3.1893 1.0
No log 0.0157 400 2.7802 1.0
4.719 0.0236 600 1.4221 0.8877
4.719 0.0314 800 1.2274 0.8224
1.0441 0.0393 1000 1.1095 0.7887
1.0441 0.0472 1200 1.0914 0.7549
1.0441 0.0550 1400 1.0177 0.7355
0.8033 0.0629 1600 0.9907 0.7233
0.8033 0.0707 1800 0.9761 0.7145
0.7227 0.0786 2000 0.9555 0.6903
0.7227 0.0864 2200 0.8995 0.6748
0.7227 0.0943 2400 0.8897 0.6666
0.6794 0.1022 2600 0.8826 0.6560
0.6794 0.1100 2800 0.8745 0.6446
0.6513 0.1179 3000 0.8450 0.6437
0.6513 0.1257 3200 0.8596 0.6511
0.6513 0.1336 3400 0.8598 0.6376
0.6147 0.1415 3600 0.8516 0.6375
0.6147 0.1493 3800 0.8252 0.6100
0.6092 0.1572 4000 0.8580 0.6823
0.6092 0.1650 4200 0.8205 0.6136
0.6092 0.1729 4400 0.8033 0.6385
0.5928 0.1808 4600 0.7928 0.6005
0.5928 0.1886 4800 0.7911 0.5924
0.5681 0.1965 5000 0.7969 0.5944
0.5681 0.2043 5200 0.7933 0.5899
0.5681 0.2122 5400 0.7830 0.6013
0.5806 0.2200 5600 0.7703 0.5789
0.5806 0.2279 5800 0.7666 0.5898
0.5608 0.2358 6000 0.7580 0.5695
0.5608 0.2436 6200 0.7479 0.5651
0.5608 0.2515 6400 0.7639 0.5847
0.5333 0.2593 6600 0.7297 0.5676
0.5333 0.2672 6800 0.7441 0.5590
0.5406 0.2751 7000 0.7405 0.5491
0.5406 0.2829 7200 0.7238 0.5529
0.5406 0.2908 7400 0.7328 0.5544
0.535 0.2986 7600 0.7263 0.5599
0.535 0.3065 7800 0.7421 0.5594
0.5195 0.3144 8000 0.7435 0.5544
0.5195 0.3222 8200 0.7187 0.5424
0.5195 0.3301 8400 0.6977 0.5353
0.5023 0.3379 8600 0.6950 0.5386
0.5023 0.3458 8800 0.7155 0.5451
0.5106 0.3536 9000 0.6857 0.5379
0.5106 0.3615 9200 0.6848 0.5329
0.5106 0.3694 9400 0.6732 0.5202
0.4968 0.3772 9600 0.6839 0.5275
0.4968 0.3851 9800 0.6767 0.5198
0.4824 0.3929 10000 0.6718 0.5335
0.4824 0.4008 10200 0.6593 0.5175
0.4824 0.4087 10400 0.6799 0.5174
0.48 0.4165 10600 0.6662 0.5129
0.48 0.4244 10800 0.6619 0.5006
0.4693 0.4322 11000 0.6576 0.5199
0.4693 0.4401 11200 0.6406 0.5019
0.4693 0.4480 11400 0.6408 0.5066
0.4691 0.4558 11600 0.6476 0.5019
0.4691 0.4637 11800 0.6423 0.4946
0.4444 0.4715 12000 0.6374 0.4976
0.4444 0.4794 12200 0.6312 0.4961
0.4444 0.4872 12400 0.6170 0.4819
0.4474 0.4951 12600 0.6301 0.4933
0.4474 0.5030 12800 0.6253 0.4862
0.4471 0.5108 13000 0.6220 0.4849
0.4471 0.5187 13200 0.6201 0.4853
0.4471 0.5265 13400 0.6168 0.4848
0.4323 0.5344 13600 0.6173 0.4771
0.4323 0.5423 13800 0.6032 0.4656
0.4575 0.5501 14000 0.6097 0.4678
0.4575 0.5580 14200 0.5971 0.4674
0.4575 0.5658 14400 0.5977 0.4698
0.4395 0.5737 14600 0.6057 0.4734
0.4395 0.5816 14800 0.5827 0.4574
0.4119 0.5894 15000 0.5946 0.4640
0.4119 0.5973 15200 0.6023 0.4771
0.4119 0.6051 15400 0.6129 0.4727
0.4125 0.6130 15600 0.5902 0.4584
0.4125 0.6208 15800 0.5955 0.4654
0.4039 0.6287 16000 0.5955 0.4595
0.4039 0.6366 16200 0.5789 0.4497
0.4039 0.6444 16400 0.5779 0.4630
0.3969 0.6523 16600 0.5677 0.4551
0.3969 0.6601 16800 0.5869 0.4606
0.3923 0.6680 17000 0.5710 0.4502
0.3923 0.6759 17200 0.5640 0.4474
0.3923 0.6837 17400 0.5842 0.4498
0.386 0.6916 17600 0.5597 0.4440
0.386 0.6994 17800 0.5621 0.4381
0.3851 0.7073 18000 0.5665 0.4346
0.3851 0.7152 18200 0.5573 0.4356
0.3851 0.7230 18400 0.5548 0.4344
0.369 0.7309 18600 0.5617 0.4364
0.369 0.7387 18800 0.5596 0.4394
0.3738 0.7466 19000 0.5492 0.4292
0.3738 0.7545 19200 0.5478 0.4372
0.3738 0.7623 19400 0.5376 0.4287
0.368 0.7702 19600 0.5282 0.4193
0.368 0.7780 19800 0.5348 0.4251
0.3629 0.7859 20000 0.5368 0.4313
0.3629 0.7937 20200 0.5551 0.4412
0.3629 0.8016 20400 0.5252 0.4105
0.3638 0.8095 20600 0.5242 0.4117
0.3638 0.8173 20800 0.5233 0.4166
0.3512 0.8252 21000 0.5243 0.4161
0.3512 0.8330 21200 0.5150 0.4123
0.3512 0.8409 21400 0.5089 0.4080
0.3536 0.8488 21600 0.5154 0.4090
0.3536 0.8566 21800 0.5162 0.4092
0.3464 0.8645 22000 0.5098 0.4053
0.3464 0.8723 22200 0.5070 0.4023
0.3464 0.8802 22400 0.5070 0.4071
0.3377 0.8881 22600 0.5028 0.3967
0.3377 0.8959 22800 0.5036 0.3978
0.3272 0.9038 23000 0.5021 0.3954
0.3272 0.9116 23200 0.5033 0.3985
0.3272 0.9195 23400 0.4984 0.3972
0.319 0.9273 23600 0.4929 0.3924
0.319 0.9352 23800 0.4941 0.4013
0.3184 0.9431 24000 0.4856 0.3874
0.3184 0.9509 24200 0.4892 0.3914
0.3184 0.9588 24400 0.4860 0.3814
0.3091 0.9666 24600 0.4825 0.3834
0.3091 0.9745 24800 0.4784 0.3867
0.3154 0.9824 25000 0.4751 0.3808
0.3154 0.9902 25200 0.4779 0.3849
0.3154 0.9981 25400 0.4773 0.3808
0.312 1.0059 25600 0.4777 0.3758
0.312 1.0138 25800 0.4752 0.3821
0.2651 1.0217 26000 0.4701 0.3775
0.2651 1.0295 26200 0.4701 0.3761
0.2651 1.0374 26400 0.4718 0.3776
0.2627 1.0452 26600 0.4638 0.3730
0.2627 1.0531 26800 0.4677 0.3720
0.2427 1.0609 27000 0.4643 0.3699
0.2427 1.0688 27200 0.4602 0.3713
0.2427 1.0767 27400 0.4664 0.3703
0.2464 1.0845 27600 0.4609 0.3677
0.2464 1.0924 27800 0.4614 0.3687
0.2537 1.1002 28000 0.4555 0.3655
0.2537 1.1081 28200 0.4560 0.3645
0.2537 1.1160 28400 0.4543 0.3626
0.2313 1.1238 28600 0.4540 0.3631
0.2313 1.1317 28800 0.4536 0.3626
0.2451 1.1395 29000 0.4529 0.3617
0.2451 1.1474 29200 0.4530 0.3598
0.2451 1.1553 29400 0.4515 0.3592
0.2445 1.1631 29600 0.4514 0.3590
0.2445 1.1710 29800 0.4514 0.3589
0.2364 1.1788 30000 0.4511 0.3591
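
The WER values above are fractional word error rates (e.g. 0.3591 is roughly 35.9%). As a hedged illustration of how such a score can be computed with the evaluate library (the sentences below are made up, not drawn from the evaluation set):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Toy reference/hypothesis pair; the card's scores come from the held-out
# evaluation split of the dataset, not from this example.
score = wer_metric.compute(
    predictions=["mae hi yn braf heddiw"],
    references=["mae hi'n braf heddiw"],
)
print(f"WER: {score:.4f}")
```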

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1