---
library_name: transformers
license: cc-by-nc-4.0
base_model: mms-meta/mms-zeroshot-300m
tags:
- automatic-speech-recognition
- genbed
- mms
- generated_from_trainer
metrics:
- wer
model-index:
- name: mms-zeroshot-300m-genbed-m-model
  results: []
---

# mms-zeroshot-300m-genbed-m-model

This model is a fine-tuned version of [mms-meta/mms-zeroshot-300m](https://huggingface.co/mms-meta/mms-zeroshot-300m) on the GENBED - BEM dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3383
- Wer: 0.4761

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 30.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| No log        | 0.5510  | 200  | 2.3107          | 1.0    |
| No log        | 1.1019  | 400  | 0.4860          | 0.6380 |
| 2.8087        | 1.6529  | 600  | 0.4381          | 0.5988 |
| 2.8087        | 2.2039  | 800  | 0.4281          | 0.5855 |
| 0.6248        | 2.7548  | 1000 | 0.4056          | 0.5656 |
| 0.6248        | 3.3058  | 1200 | 0.4017          | 0.5513 |
| 0.6248        | 3.8567  | 1400 | 0.3904          | 0.5617 |
| 0.578         | 4.4077  | 1600 | 0.3776          | 0.5306 |
| 0.578         | 4.9587  | 1800 | 0.3722          | 0.5178 |
| 0.5343        | 5.5096  | 2000 | 0.3659          | 0.5152 |
| 0.5343        | 6.0606  | 2200 | 0.3614          | 0.5150 |
| 0.5343        | 6.6116  | 2400 | 0.3573          | 0.5016 |
| 0.5153        | 7.1625  | 2600 | 0.3625          | 0.5026 |
| 0.5153        | 7.7135  | 2800 | 0.3545          | 0.4867 |
| 0.4935        | 8.2645  | 3000 | 0.3506          | 0.4814 |
| 0.4935        | 8.8154  | 3200 | 0.3482          | 0.4922 |
| 0.4935        | 9.3664  | 3400 | 0.3383          | 0.4761 |
| 0.4731        | 9.9174  | 3600 | 0.3419          | 0.4645 |
| 0.4731        | 10.4683 | 3800 | 0.3391          | 0.4686 |
| 0.4619        | 11.0193 | 4000 | 0.3402          | 0.4576 |

### Framework versions

- Transformers 4.46.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
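The Wer column above is the word error rate: the word-level edit distance between the model's transcription and the reference, divided by the number of reference words. The card does not say which metric implementation was used during evaluation, so the function below is only a minimal self-contained sketch of the standard definition, not the exact code from the training run:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance over reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Single-row dynamic-programming edit distance over words.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[j] = min(d[j] + 1,      # deletion
                       d[j - 1] + 1,  # insertion
                       prev + cost)   # substitution or match
            prev = cur
    return d[-1] / len(ref)

# One substitution and one deletion against a 4-word reference -> 2/4.
print(wer("a b c d", "a x c"))  # 0.5
```

A reported Wer of 0.4761 therefore means roughly 48 word-level errors per 100 reference words on the evaluation set; note that insertions can push WER above 1.0.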