---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
  - generated_from_trainer
model-index:
  - name: wav2vec2-E10_freq_speed_pause
    results: []
---

# wav2vec2-E10_freq_speed_pause

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 2.1732
- Cer (character error rate, %): 46.9807
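
The Cer value above is a character error rate reported as a percentage. As an illustration (not the exact metric code used by the training script), CER under its standard definition is the character-level Levenshtein distance between reference and hypothesis, divided by the reference length:

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: char-level Levenshtein distance / reference length.

    Returns a fraction; multiply by 100 for the percentage reported in this card.
    """
    m, n = len(reference), len(hypothesis)
    if m == 0:
        return float(n > 0)
    # prev[j] holds the edit distance between reference[:i-1] and hypothesis[:j]
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            curr[j] = min(prev[j] + 1,        # deletion
                          curr[j - 1] + 1,    # insertion
                          prev[j - 1] + cost) # substitution or match
        prev = curr
    return prev[n] / m
```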

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 3
- mixed_precision_training: Native AMP
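
These settings map onto `transformers.TrainingArguments` roughly as follows. This is a sketch, not the exact training script: `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the `TrainingArguments` defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed in this card; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="wav2vec2-E10_freq_speed_pause",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=3,
    fp16=True,  # "Native AMP" mixed-precision training
)
```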

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Cer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 34.0782       | 0.1289 | 200  | 5.1516          | 100.0   |
| 5.0781        | 0.2579 | 400  | 4.7666          | 100.0   |
| 4.9696        | 0.3868 | 600  | 4.6840          | 100.0   |
| 4.906         | 0.5158 | 800  | 4.6904          | 100.0   |
| 4.7963        | 0.6447 | 1000 | 4.6462          | 100.0   |
| 4.7565        | 0.7737 | 1200 | 4.6231          | 100.0   |
| 4.7372        | 0.9026 | 1400 | 4.5947          | 100.0   |
| 4.6765        | 1.0316 | 1600 | 4.5636          | 100.0   |
| 4.6162        | 1.1605 | 1800 | 4.5811          | 99.9354 |
| 4.4587        | 1.2895 | 2000 | 4.4613          | 96.3346 |
| 4.1506        | 1.4184 | 2200 | 4.2476          | 95.7766 |
| 3.8599        | 1.5474 | 2400 | 3.8368          | 73.1262 |
| 3.34          | 1.6763 | 2600 | 3.3885          | 64.8085 |
| 3.1038        | 1.8053 | 2800 | 3.1215          | 61.0491 |
| 2.9058        | 1.9342 | 3000 | 2.9318          | 58.3705 |
| 2.7231        | 2.0632 | 3200 | 2.7934          | 57.7244 |
| 2.5775        | 2.1921 | 3400 | 2.6023          | 53.1661 |
| 2.4413        | 2.3211 | 3600 | 2.5756          | 54.5759 |
| 2.3662        | 2.4500 | 3800 | 2.3468          | 50.9810 |
| 2.2503        | 2.5790 | 4000 | 2.2820          | 49.2892 |
| 2.231         | 2.7079 | 4200 | 2.2980          | 49.1483 |
| 2.1522        | 2.8369 | 4400 | 2.1792          | 47.4389 |
| 2.1532        | 2.9658 | 4600 | 2.1732          | 46.9807 |
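
The learning rate over these steps follows the linear schedule configured above: a ramp from 0 over the first 50 warmup steps, then linear decay to 0. A minimal sketch, assuming a total of roughly 4650 training steps (an estimate from the final logged step and epoch, not a figure stated in the card):

```python
def linear_lr(step, base_lr=1e-4, warmup_steps=50, total_steps=4650):
    """Linear warmup then linear decay, as used by lr_scheduler_type="linear"."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```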

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.19.1