---
license: apache-2.0
tags:
  - generated_from_trainer
model-index:
  - name: wav2vec2-multiple-medical-2-1
    results: []
---

# wav2vec2-multiple-medical-2-1

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.2340
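
A minimal inference sketch is shown below. It assumes this checkpoint is a CTC speech-recognition model hosted as `tgrhn/wav2vec2-multiple-medical-2-1` and that audio is resampled to 16 kHz; the repo id and the example file name are illustrative assumptions, not confirmed by this card.

```python
# Minimal inference sketch (assumptions: CTC speech-recognition checkpoint,
# hypothetical repo id "tgrhn/wav2vec2-multiple-medical-2-1", 16 kHz audio).
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "tgrhn/wav2vec2-multiple-medical-2-1"  # hypothetical repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio clip, downmix to mono, and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder path
waveform = torchaudio.functional.resample(waveform.mean(dim=0), sample_rate, 16_000)

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```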

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP
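
As a rough illustration only, these settings map onto the Hugging Face `TrainingArguments` as sketched below; the `output_dir` is a placeholder and the actual training script is not included in this card.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
# Assumptions: a standard Hugging Face Trainer setup; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-multiple-medical-2-1",  # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer defaults
    # (adam_beta1, adam_beta2, adam_epsilon), so no override is needed here.
)
```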

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 5.7848        | 0.56  | 1500  | 4.7981          |
| 3.4426        | 1.12  | 3000  | 3.3244          |
| 3.1079        | 1.68  | 4500  | 3.0999          |
| 3.0034        | 2.24  | 6000  | 2.9647          |
| 2.0403        | 2.8   | 7500  | 1.5829          |
| 1.2797        | 3.36  | 9000  | 0.9095          |
| 1.052         | 3.92  | 10500 | 0.6498          |
| 0.8326        | 4.48  | 12000 | 0.5418          |
| 0.7443        | 5.04  | 13500 | 0.4615          |
| 0.6949        | 5.6   | 15000 | 0.4191          |
| 0.6096        | 6.16  | 16500 | 0.3817          |
| 0.5699        | 6.72  | 18000 | 0.3545          |
| 0.5718        | 7.28  | 19500 | 0.3439          |
| 0.5159        | 7.84  | 21000 | 0.3243          |
| 0.4808        | 8.4   | 22500 | 0.3112          |
| 0.4979        | 8.96  | 24000 | 0.2975          |
| 0.4271        | 9.52  | 25500 | 0.2948          |
| 0.4364        | 10.08 | 27000 | 0.2818          |
| 0.4205        | 10.64 | 28500 | 0.2770          |
| 0.418         | 11.2  | 30000 | 0.2747          |
| 0.3915        | 11.76 | 31500 | 0.2695          |
| 0.4121        | 12.32 | 33000 | 0.2596          |
| 0.4057        | 12.88 | 34500 | 0.2627          |
| 0.363         | 13.44 | 36000 | 0.2617          |
| 0.3767        | 14.0  | 37500 | 0.2567          |
| 0.3804        | 14.56 | 39000 | 0.2512          |
| 0.3537        | 15.12 | 40500 | 0.2505          |
| 0.3195        | 15.68 | 42000 | 0.2508          |
| 0.311         | 16.24 | 43500 | 0.2523          |
| 0.3089        | 16.8  | 45000 | 0.2462          |
| 0.3121        | 17.36 | 46500 | 0.2463          |
| 0.3549        | 17.92 | 48000 | 0.2479          |
| 0.3111        | 18.48 | 49500 | 0.2422          |
| 0.3228        | 19.04 | 51000 | 0.2414          |
| 0.2936        | 19.6  | 52500 | 0.2415          |
| 0.28          | 20.16 | 54000 | 0.2411          |
| 0.3174        | 20.72 | 55500 | 0.2354          |
| 0.2735        | 21.28 | 57000 | 0.2335          |
| 0.3498        | 21.84 | 58500 | 0.2352          |
| 0.2958        | 22.4  | 60000 | 0.2341          |
| 0.3009        | 22.96 | 61500 | 0.2328          |
| 0.2869        | 23.52 | 63000 | 0.2352          |
| 0.2644        | 24.08 | 64500 | 0.2343          |
| 0.2692        | 24.64 | 66000 | 0.2346          |
| 0.3376        | 25.2  | 67500 | 0.2339          |
| 0.2522        | 25.76 | 69000 | 0.2340          |

### Framework versions

- Transformers 4.26.1
- Pytorch 2.4.1+cu121
- Datasets 2.20.0
- Tokenizers 0.13.3