# Sunbird - MMS Finetuned Models
This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on Ugandan language speech data (Luganda, Acholi, Lugbara, Teso, and Nyankore).
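A minimal inference sketch, assuming the repository exposes per-language adapters selected with `set_target_lang`/`load_adapter` as the base MMS checkpoints do (the adapter code `lug` and the audio file name are assumptions -- check the repository files for the actual adapter names):

```python
import torch
import torchaudio
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "Sunbird/asr-mms-salt"

processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# MMS-style checkpoints switch languages via adapters; "lug" (Luganda)
# is an assumed adapter code here.
processor.tokenizer.set_target_lang("lug")
model.load_adapter("lug")

# Load a speech file and resample to the 16 kHz the model expects.
waveform, sr = torchaudio.load("speech.wav")  # placeholder file name
speech = torchaudio.functional.resample(waveform, sr, 16_000).mean(dim=0)

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred_ids = torch.argmax(logits, dim=-1)[0]
print(processor.decode(pred_ids))
```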
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

To be added.
### Results

| Language / Adapter | WER | CER | Additional Details |
|---|---|---|---|
| **Luganda (Lug)** | | | |
| Lug-Base | 0.25 | | |
| Lug+5Gram LM | | | |
| Lug+3Gram LM | | | |
| Lug+English Combined | 0.12 | | |
| **Acholi (Ach)** | | | |
| Ach-Base | 0.34 | | |
| Ach+3Gram LM | | | |
| Ach+5Gram LM | | | |
| Ach+English Combined | 0.18 | | |
| **Lugbara (Lgg)** | | | |
| Lgg-Base | | | |
| Lgg+3Gram LM | | | |
| Lgg+5Gram LM | | | |
| Lgg+English Combined | 0.25 | | |
| **Teso (Teo)** | | | |
| Teo-Base | 0.39 | | |
| Teo+3Gram LM | | | |
| Teo+5Gram LM | | | |
| Teo+English Combined | 0.29 | | |
| **Nyankore (Nyn)** | | | |
| Nyn-Base | 0.48 | | |
| Nyn+3Gram LM | | | |
| Nyn+5Gram LM | | | |
| Nyn+English Combined | 0.29 | | |
Note: LM stands for Language Model. The `+3Gram LM` and `+5Gram LM` suffixes indicate models enhanced with trigram and five-gram language models, respectively.
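As an illustration of how such an n-gram LM is typically combined with CTC output at decode time, here is a hedged sketch using `pyctcdecode` with a KenLM model. The ARPA file name is hypothetical, and `processor`/`logits` are assumed to come from the inference example above; this is not necessarily the decoding setup used to produce the table:

```python
import jiwer
from pyctcdecode import build_ctcdecoder

# Labels must be ordered by CTC label id to match the logit columns.
vocab = processor.tokenizer.get_vocab()
labels = [token for token, _ in sorted(vocab.items(), key=lambda kv: kv[1])]

# "lug_3gram.arpa" is a hypothetical KenLM file; train one on
# target-language text with KenLM's `lmplz -o 3` (or `-o 5` for five-gram).
decoder = build_ctcdecoder(labels, kenlm_model_path="lug_3gram.arpa")

# Beam-search decode the acoustic logits, rescoring hypotheses with the LM.
transcript = decoder.decode(logits[0].numpy())
print(transcript)

# WER/CER as reported in the table, computed with jiwer against a reference.
reference = "expected transcript goes here"  # placeholder reference text
print("WER:", jiwer.wer(reference, transcript))
print("CER:", jiwer.cer(reference, transcript))
```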
### Framework versions
- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu117
- Datasets 2.13.0
- Tokenizers 0.13.3