
wav2vec2-large-xls-r-300m

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.6299
  • CER: 69.5709
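
The checkpoint can be loaded with the Transformers library for transcription. The snippet below is a minimal sketch, assuming the repository ships a CTC head and a Wav2Vec2 processor expecting 16 kHz mono audio; "sample.wav" is a placeholder path, and this is not an official usage example from the author.

```python
# Minimal transcription sketch (assumptions: CTC fine-tuning, Wav2Vec2 processor,
# 16 kHz mono input; "sample.wav" is a placeholder path).
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "syp1229/wav2vec2-large-xls-r-300m"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load and resample the audio to the 16 kHz rate the XLS-R models expect.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```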

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 5
  • mixed_precision_training: Native AMP
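
For reference, the hyperparameters above map onto a Transformers TrainingArguments configuration roughly as sketched below. This is a hedged reconstruction, not the author's training script; the output_dir value is an assumption.

```python
# Reconstruction of the Trainer configuration implied by the list above
# (assumes transformers.Trainer was used; field names follow Transformers 4.35).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                 # Adam with betas=(0.9, 0.999), epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=5,
    fp16=True,                      # "Native AMP" mixed-precision training
)
```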

Training results

| Training Loss | Epoch | Step | Validation Loss | CER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 56.9189       | 0.13  | 200  | 17.3335         | 100.0   |
| 24.9477       | 0.26  | 400  | 5.2898          | 93.4002 |
| 22.4883       | 0.39  | 600  | 5.1855          | 91.7503 |
| 23.6577       | 0.52  | 800  | 5.5828          | 93.8097 |
| 12.0508       | 0.64  | 1000 | 5.5876          | 92.9788 |
| 6.9102        | 0.77  | 1200 | 5.2180          | 93.1153 |
| 5.2937        | 0.9   | 1400 | 4.6411          | 93.6613 |
| 4.6224        | 1.03  | 1600 | 4.7614          | 93.7088 |
| 4.6148        | 1.16  | 1800 | 4.8569          | 93.0916 |
| 4.6129        | 1.29  | 2000 | 4.6641          | 92.5693 |
| 4.5837        | 1.42  | 2200 | 4.7341          | 93.5664 |
| 4.5465        | 1.55  | 2400 | 4.7580          | 93.3646 |
| 4.5042        | 1.68  | 2600 | 4.8342          | 93.4180 |
| 4.4535        | 1.81  | 2800 | 4.8323          | 93.4952 |
| 4.3942        | 1.93  | 3000 | 4.7891          | 92.1538 |
| 4.347         | 2.06  | 3200 | 4.5461          | 92.4150 |
| 4.3064        | 2.19  | 3400 | 4.7049          | 92.0114 |
| 4.301         | 2.32  | 3600 | 4.6162          | 92.3438 |
| 4.2107        | 2.45  | 3800 | 4.5264          | 92.3497 |
| 4.1768        | 2.58  | 4000 | 4.4067          | 87.1090 |
| 4.1395        | 2.71  | 4200 | 4.5684          | 90.6048 |
| 4.0568        | 2.84  | 4400 | 4.4142          | 87.7856 |
| 4.0007        | 2.97  | 4600 | 4.2861          | 87.7619 |
| 3.9426        | 3.09  | 4800 | 4.2750          | 87.4236 |
| 3.8207        | 3.22  | 5000 | 4.2217          | 84.2424 |
| 3.7678        | 3.35  | 5200 | 4.1129          | 81.1324 |
| 3.7522        | 3.48  | 5400 | 4.0891          | 83.2512 |
| 3.6666        | 3.61  | 5600 | 4.0328          | 80.0107 |
| 3.587         | 3.74  | 5800 | 4.0241          | 80.2837 |
| 3.5184        | 3.87  | 6000 | 4.0116          | 81.8209 |
| 3.449         | 4.0   | 6200 | 3.9031          | 76.4437 |
| 3.3503        | 4.13  | 6400 | 3.8643          | 75.8680 |
| 3.2541        | 4.26  | 6600 | 3.8508          | 72.4850 |
| 3.1987        | 4.38  | 6800 | 3.8425          | 71.8203 |
| 3.1435        | 4.51  | 7000 | 3.7289          | 71.6719 |
| 3.0583        | 4.64  | 7200 | 3.6855          | 70.1703 |
| 3.0443        | 4.77  | 7400 | 3.6365          | 69.6540 |
| 3.0508        | 4.9   | 7600 | 3.6299          | 69.5709 |
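
The CER column above can be reproduced with the evaluate library's character error rate metric. The snippet below is a small sketch with made-up strings, not the evaluation script used for this card; it assumes CER is reported on a 0-100 scale, as the table suggests.

```python
# Character error rate (CER) as reported in the table above, computed with the
# `evaluate` library (requires `pip install evaluate jiwer`). Strings are illustrative only.
import evaluate

cer_metric = evaluate.load("cer")
cer = cer_metric.compute(
    predictions=["transcribed hypothesis"],
    references=["reference transcription"],
)
print(f"CER: {100 * cer:.4f}")  # scaled to 0-100 to match the card's reporting
```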

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.0
  • Tokenizers 0.15.0