
wav2vec2-1b-Elderly6

This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3776
  • CER: 10.3853
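
Since the card gives no usage instructions, here is a minimal, hedged inference sketch. It assumes the checkpoint is a standard CTC fine-tune that ships a Wav2Vec2-style processor, as Trainer-produced wav2vec2 checkpoints typically do; `sample.wav` is a placeholder path, not a file from this repository.

```python
import librosa
import torch
from transformers import AutoModelForCTC, AutoProcessor

# Assumption: the repo exposes a processor (feature extractor + tokenizer)
# alongside the CTC model, as standard wav2vec2 fine-tunes do.
model_id = "Gummybear05/wav2vec2-1b-Elderly6"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# XLS-R expects 16 kHz mono audio; "sample.wav" is a placeholder.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```

Greedy argmax decoding is shown for simplicity; a beam-search decoder with a language model could plausibly lower the CER further.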

Model description

This is a fine-tuned checkpoint of facebook/wav2vec2-xls-r-1b (964M parameters, float32 safetensors weights) for automatic speech recognition, evaluated with character error rate (CER). The fine-tuning data and target language are not documented.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 1
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 16
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 5
  • mixed_precision_training: Native AMP
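
The list above maps directly onto `transformers.TrainingArguments`. The sketch below is illustrative rather than the author's actual script: `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the library defaults.

```python
from transformers import TrainingArguments

# Illustrative mapping of the listed hyperparameters; not the original script.
training_args = TrainingArguments(
    output_dir="wav2vec2-1b-Elderly6",   # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=16,      # 1 x 16 = total train batch of 16
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=5,
    fp16=True,                           # "Native AMP" mixed precision
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults,
    # matching the optimizer settings listed above.
)
```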

Training results

| Training Loss | Epoch  | Step | Validation Loss | CER     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 9.195         | 0.2580 | 200  | 3.1758          | 50.8870 |
| 1.6332        | 0.5161 | 400  | 1.4825          | 34.3280 |
| 1.1542        | 0.7741 | 600  | 1.0327          | 24.9765 |
| 0.9689        | 1.0322 | 800  | 0.9404          | 22.8090 |
| 0.8334        | 1.2902 | 1000 | 0.7965          | 19.8250 |
| 0.7615        | 1.5483 | 1200 | 0.7948          | 19.2845 |
| 0.7499        | 1.8063 | 1400 | 0.7138          | 18.9203 |
| 0.6314        | 2.0643 | 1600 | 0.7851          | 19.9659 |
| 0.5657        | 2.3224 | 1800 | 0.6697          | 17.9335 |
| 0.5533        | 2.5804 | 2000 | 0.7101          | 18.7441 |
| 0.5326        | 2.8385 | 2200 | 0.6084          | 16.3240 |
| 0.4909        | 3.0965 | 2400 | 0.5517          | 15.5898 |
| 0.4061        | 3.3546 | 2600 | 0.5080          | 13.9803 |
| 0.3904        | 3.6126 | 2800 | 0.4536          | 12.5352 |
| 0.3628        | 3.8707 | 3000 | 0.4773          | 12.7232 |
| 0.3109        | 4.1287 | 3200 | 0.4460          | 12.0242 |
| 0.2835        | 4.3867 | 3400 | 0.4071          | 11.3311 |
| 0.2507        | 4.6448 | 3600 | 0.3907          | 10.6497 |
| 0.248         | 4.9028 | 3800 | 0.3776          | 10.3853 |
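
The CER column is reported in percent. To reproduce the metric, one common approach is the `evaluate` library's `cer` metric (an assumption about tooling; the card does not say how CER was computed):

```python
import evaluate

# Placeholder strings stand in for model transcriptions and ground truth.
cer_metric = evaluate.load("cer")
cer = cer_metric.compute(
    predictions=["hello world"],
    references=["hello word"],
)
print(f"CER: {100 * cer:.4f}%")  # compute() returns a fraction; the table scales to percent
```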

Framework versions

  • Transformers 4.45.2
  • PyTorch 2.3.1.post100
  • Datasets 2.19.1
  • Tokenizers 0.20.1
