---
base_model: facebook/w2v-bert-2.0
datasets:
  - common_voice_17_0
library_name: transformers
license: mit
metrics:
  - wer
tags:
  - generated_from_trainer
model-index:
  - name: w2v-bert-2_6_datasets
    results:
      - task:
          type: automatic-speech-recognition
          name: Automatic Speech Recognition
        dataset:
          name: common_voice_17_0
          type: common_voice_17_0
          config: ml
          split: validation
          args: ml
        metrics:
          - type: wer
            value: 0.43922053819981444
            name: Wer
---

# w2v-bert-2_6_datasets

This model is a fine-tuned version of facebook/w2v-bert-2.0 on the common_voice_17_0 dataset (Malayalam, `ml` config). It achieves the following results on the evaluation set:

- Loss: 0.5077
- Wer: 0.4392
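
A minimal inference sketch with Transformers is shown below; the repo id `Bajiyo/w2v-bert-2_6_datasets` and the audio file path are assumptions, so substitute the actual checkpoint location and a real 16 kHz recording:

```python
import librosa
import torch
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "Bajiyo/w2v-bert-2_6_datasets"  # assumed repo id; replace with the real one
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)

# Load a mono clip, resampled to the 16 kHz rate the model expects.
speech, _ = librosa.load("sample.wav", sr=16_000)  # placeholder file path

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```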

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
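
As a rough guide, the list above maps onto `TrainingArguments` as in the sketch below; the output directory and any logging or saving settings are assumptions not taken from this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2_6_datasets",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,       # 16 x 2 = total train batch size of 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,                           # "Native AMP" mixed-precision training
)
```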

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 1.1114        | 0.4038 | 600   | 0.6364          | 0.6514 |
| 0.1782        | 0.8075 | 1200  | 0.5620          | 0.6127 |
| 0.1374        | 1.2113 | 1800  | 0.4943          | 0.5654 |
| 0.1156        | 1.6151 | 2400  | 0.4415          | 0.5376 |
| 0.1068        | 2.0188 | 3000  | 0.4187          | 0.5249 |
| 0.0838        | 2.4226 | 3600  | 0.4778          | 0.5320 |
| 0.0834        | 2.8264 | 4200  | 0.4186          | 0.5091 |
| 0.0703        | 3.2301 | 4800  | 0.4538          | 0.5363 |
| 0.0636        | 3.6339 | 5400  | 0.4287          | 0.5314 |
| 0.0609        | 4.0377 | 6000  | 0.4013          | 0.4989 |
| 0.0462        | 4.4415 | 6600  | 0.4053          | 0.4964 |
| 0.047         | 4.8452 | 7200  | 0.4289          | 0.4766 |
| 0.0377        | 5.2490 | 7800  | 0.3875          | 0.4933 |
| 0.0352        | 5.6528 | 8400  | 0.3906          | 0.4881 |
| 0.033         | 6.0565 | 9000  | 0.4192          | 0.4667 |
| 0.0243        | 6.4603 | 9600  | 0.4113          | 0.4723 |
| 0.0244        | 6.8641 | 10200 | 0.4393          | 0.4708 |
| 0.0189        | 7.2678 | 10800 | 0.4255          | 0.4630 |
| 0.0167        | 7.6716 | 11400 | 0.4219          | 0.4646 |
| 0.0157        | 8.0754 | 12000 | 0.4398          | 0.4429 |
| 0.0107        | 8.4791 | 12600 | 0.4546          | 0.4507 |
| 0.0095        | 8.8829 | 13200 | 0.4949          | 0.4426 |
| 0.0072        | 9.2867 | 13800 | 0.4972          | 0.4473 |
| 0.0059        | 9.6904 | 14400 | 0.5077          | 0.4392 |
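
The Wer column is the word error rate on the validation split, reported as a fraction. A minimal sketch of computing it with the `evaluate` library (the example strings are placeholders; in practice they come from model predictions and reference transcripts):

```python
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the transcribed sentence"]  # placeholder model outputs
references = ["the reference sentence"]     # placeholder ground truth

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # the final checkpoint above reaches 0.4392
```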

### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1