---
base_model: openai/whisper-small
datasets:
- fleurs
license: apache-2.0
metrics:
- wer
tags:
- generated_from_trainer
model-index:
- name: whisper-small-wolof
results:
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: fleurs
type: fleurs
config: wo_sn
split: test
args: wo_sn
metrics:
- type: wer
value: 0.9217902350813744
name: Wer
---
# whisper-small-wolof
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the FLEURS dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8726
- Wer: 0.9218
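
WER (word error rate) counts word substitutions, insertions, and deletions against the reference transcript, divided by the number of reference words; a WER of 0.9218 therefore means roughly 92 word errors per 100 reference words. A minimal sketch of how it can be computed with the 🤗 Evaluate library (the strings below are illustrative only, not actual model outputs):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Illustrative strings only -- not actual model outputs.
predictions = ["the cat sat on mat"]
references = ["the cat sat on the mat"]

# One deletion out of six reference words -> WER ~= 0.167.
print(wer_metric.compute(predictions=predictions, references=references))
```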
## Model description
whisper-small-wolof is [openai/whisper-small](https://huggingface.co/openai/whisper-small) fine-tuned for automatic speech recognition (ASR) of Wolof speech, using the `wo_sn` configuration of the FLEURS dataset.
## Intended uses & limitations
The model is intended for transcribing spoken Wolof. Given the test-set WER of 0.9218 (roughly 92 word errors per 100 reference words), it should be treated as an experimental checkpoint rather than a production-ready transcriber; more training data or longer training would likely be needed for practical use.
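
A minimal transcription sketch using the 🤗 Transformers `pipeline` API (the repo id below is a placeholder; substitute the actual Hub path of this checkpoint):

```python
from transformers import pipeline

# Placeholder repo id -- replace with the actual Hub path of this model.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-small-wolof",
)

# Transcribe a local audio file (any format ffmpeg can decode);
# Whisper resamples input to 16 kHz internally.
result = asr("wolof_sample.wav")
print(result["text"])
```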
## Training and evaluation data
The model was fine-tuned and evaluated on the `wo_sn` (Wolof, Senegal) configuration of the FLEURS dataset; the metrics above come from its test split.
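
Assuming the `google/fleurs` dataset on the Hub, the relevant split can be loaded like this (a sketch, not the exact training script):

```python
from datasets import load_dataset

# FLEURS Wolof (Senegal) configuration, as referenced in the metadata above.
# Depending on your `datasets` version, trust_remote_code=True may be required.
fleurs_wo = load_dataset("google/fleurs", "wo_sn")

sample = fleurs_wo["test"][0]
print(sample["transcription"])           # reference text
print(sample["audio"]["sampling_rate"])  # 16 kHz audio arrays
```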
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 3
- mixed_precision_training: Native AMP
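
For reference, a sketch of how these settings map onto `Seq2SeqTrainingArguments` in 🤗 Transformers (`output_dir` is a placeholder; the original training script is not available, and Adam betas/epsilon match the Trainer defaults):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-wolof",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 16 * 4 = 64
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=3,
    fp16=True,  # Native AMP mixed precision
)
```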
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 4.5232 | 0.9790 | 35 | 3.5807 | 1.3809 |
| 3.7127 | 1.9860 | 71 | 2.4567 | 1.1817 |
| 2.2111 | 2.9371 | 105 | 1.8726 | 0.9218 |
### Framework versions
- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1