---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-large-xlsr-demo
    results: []
---

# wav2vec2-large-xlsr-demo

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on ASCEND, a Mandarin-English code-switching speech corpus. It achieves the following results on the evaluation set:

- Loss: 1.6751
- WER: 0.7846
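
To transcribe audio with this checkpoint, something like the following should work. This is a minimal sketch: the repo id `katyayego/w2v2-xlsr-eng-zh-cs` is inferred from this repository's path, and `example.wav` is a placeholder; XLSR-53 expects 16 kHz mono audio.

```python
# Minimal inference sketch; the repo id and audio path are assumptions.
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "katyayego/w2v2-xlsr-eng-zh-cs"  # assumed from the repo path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio resampled to the 16 kHz rate the base model was trained on.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```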

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
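
For reference, these settings correspond roughly to the following `TrainingArguments`. This is a sketch, not the exact training script: `output_dir` is assumed, the 500-step evaluation cadence is read off the results table below, and the Adam betas and epsilon listed above are the library defaults.

```python
# Sketch mapping the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-demo",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    fp16=True,              # "Native AMP" mixed-precision training
    eval_strategy="steps",  # assumed from the 500-step eval cadence below
    eval_steps=500,
)
```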

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 10.5108       | 0.5701  | 500   | 15.4314         | 1.0    |
| 5.61          | 1.1403  | 1000  | 11.3143         | 1.0    |
| 5.546         | 1.7104  | 1500  | 9.6388          | 1.0    |
| 5.1105        | 2.2805  | 2000  | 5.9376          | 1.0    |
| 4.9007        | 2.8506  | 2500  | 5.7582          | 1.0    |
| 4.5876        | 3.4208  | 3000  | 6.7190          | 1.0    |
| 4.3145        | 3.9909  | 3500  | 4.6527          | 1.0    |
| 3.5332        | 4.5610  | 4000  | 3.3622          | 1.0011 |
| 2.9071        | 5.1311  | 4500  | 2.7400          | 1.0176 |
| 2.5077        | 5.7013  | 5000  | 2.8219          | 0.9460 |
| 2.4145        | 6.2714  | 5500  | 2.4336          | 0.9570 |
| 2.2432        | 6.8415  | 6000  | 2.1500          | 0.9169 |
| 2.1376        | 7.4116  | 6500  | 2.1445          | 0.8930 |
| 2.0841        | 7.9818  | 7000  | 2.1312          | 0.8864 |
| 1.8288        | 8.5519  | 7500  | 1.9040          | 0.8728 |
| 1.6863        | 9.1220  | 8000  | 1.8913          | 0.8434 |
| 1.7453        | 9.6921  | 8500  | 2.1214          | 0.8507 |
| 1.6896        | 10.2623 | 9000  | 1.8329          | 0.8548 |
| 1.6063        | 10.8324 | 9500  | 1.8248          | 0.8386 |
| 1.3838        | 11.4025 | 10000 | 1.7811          | 0.8379 |
| 1.5255        | 11.9726 | 10500 | 2.3148          | 0.8390 |
| 1.4269        | 12.5428 | 11000 | 2.1530          | 0.8184 |
| 1.3452        | 13.1129 | 11500 | 1.7208          | 0.8221 |
| 1.35          | 13.6830 | 12000 | 1.8269          | 0.8290 |
| 1.3656        | 14.2531 | 12500 | 1.6902          | 0.8313 |
| 1.2036        | 14.8233 | 13000 | 2.0816          | 0.8206 |
| 1.2144        | 15.3934 | 13500 | 1.7623          | 0.8103 |
| 1.1648        | 15.9635 | 14000 | 1.7197          | 0.8154 |
| 1.1341        | 16.5336 | 14500 | 1.7560          | 0.8110 |
| 1.0716        | 17.1038 | 15000 | 1.7750          | 0.8099 |
| 1.1187        | 17.6739 | 15500 | 1.7946          | 0.8180 |
| 1.0633        | 18.2440 | 16000 | 1.7877          | 0.7996 |
| 1.0069        | 18.8141 | 16500 | 1.8482          | 0.8243 |
| 0.9703        | 19.3843 | 17000 | 1.6073          | 0.7960 |
| 1.0122        | 19.9544 | 17500 | 1.7191          | 0.8099 |
| 0.9993        | 20.5245 | 18000 | 1.7208          | 0.7956 |
| 0.9861        | 21.0946 | 18500 | 1.6628          | 0.7949 |
| 0.9621        | 21.6648 | 19000 | 1.7685          | 0.7930 |
| 0.8936        | 22.2349 | 19500 | 1.7232          | 0.8026 |
| 0.888         | 22.8050 | 20000 | 1.7204          | 0.8015 |
| 0.9027        | 23.3751 | 20500 | 1.7844          | 0.7923 |
| 0.8808        | 23.9453 | 21000 | 1.7159          | 0.7945 |
| 0.8652        | 24.5154 | 21500 | 1.6887          | 0.7934 |
| 0.7545        | 25.0855 | 22000 | 1.6633          | 0.7937 |
| 0.7664        | 25.6556 | 22500 | 1.6745          | 0.7919 |
| 0.7518        | 26.2258 | 23000 | 1.7122          | 0.7930 |
| 0.8475        | 26.7959 | 23500 | 1.6901          | 0.7868 |
| 0.7527        | 27.3660 | 24000 | 1.6937          | 0.7835 |
| 0.7531        | 27.9361 | 24500 | 1.6835          | 0.7820 |
| 0.7686        | 28.5063 | 25000 | 1.6734          | 0.7901 |
| 0.7525        | 29.0764 | 25500 | 1.6766          | 0.7868 |
| 0.7765        | 29.6465 | 26000 | 1.6751          | 0.7846 |
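
The WER column is the standard word error rate, (substitutions + insertions + deletions) divided by the number of reference words, which is why values above 1.0 are possible in the early epochs. A small sketch with the `evaluate` library, using made-up strings; note that WER on code-switched text is sensitive to how the Mandarin side is tokenized, since the metric splits on whitespace.

```python
# Sketch of the WER computation; the strings below are illustrative only.
import evaluate

wer = evaluate.load("wer")
predictions = ["turn on the 灯 please"]
references = ["turn on the 灯光 please"]
# 1 substitution over 5 reference words -> 0.2
print(wer.compute(predictions=predictions, references=references))
```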

### Framework versions

- Transformers 4.41.2
- PyTorch 2.3.0
- Datasets 2.19.2
- Tokenizers 0.19.1