---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: xlsr-aiish-nose
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# xlsr-aiish-nose

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Wer: 0.3068
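
Below is a minimal inference sketch for this checkpoint. The Hub repo ID, the audio file name, and the use of `librosa` for loading/resampling are assumptions for illustration; adjust them to your setup.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed repo ID; replace with the actual Hub path of this checkpoint.
model_id = "xlsr-aiish-nose"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# XLSR-53 expects 16 kHz mono audio, so resample on load.
speech, _ = librosa.load("sample.wav", sr=16_000)  # placeholder file name

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```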

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 100
- mixed_precision_training: Native AMP
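
As a rough illustration, the list above maps onto `transformers.TrainingArguments` as sketched below; the output directory name is an assumption, and the Adam betas/epsilon in the list are the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-aiish-nose",   # assumed output directory name
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=100,
    fp16=True,                      # Native AMP mixed-precision training
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```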

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 4.497         | 1.9704  | 200   | 2.6013          | 1.0    |
| 1.5579        | 3.9409  | 400   | 0.1774          | 0.5513 |
| 0.2962        | 5.9113  | 600   | 0.0348          | 0.3826 |
| 0.1312        | 7.8818  | 800   | 0.0160          | 0.3325 |
| 0.1006        | 9.8522  | 1000  | 0.0058          | 0.3166 |
| 0.0806        | 11.8227 | 1200  | 0.0047          | 0.3117 |
| 0.0746        | 13.7931 | 1400  | 0.0014          | 0.3105 |
| 0.0548        | 15.7635 | 1600  | 0.0014          | 0.3093 |
| 0.0468        | 17.7340 | 1800  | 0.0009          | 0.3093 |
| 0.053         | 19.7044 | 2000  | 0.0299          | 0.3117 |
| 0.0516        | 21.6749 | 2200  | 0.0034          | 0.3105 |
| 0.0286        | 23.6453 | 2400  | 0.0003          | 0.3068 |
| 0.0376        | 25.6158 | 2600  | 0.0004          | 0.3068 |
| 0.0293        | 27.5862 | 2800  | 0.0003          | 0.3068 |
| 0.0257        | 29.5567 | 3000  | 0.0002          | 0.3068 |
| 0.0212        | 31.5271 | 3200  | 0.0003          | 0.3093 |
| 0.0249        | 33.4975 | 3400  | 0.0007          | 0.3068 |
| 0.0192        | 35.4680 | 3600  | 0.0002          | 0.3081 |
| 0.0221        | 37.4384 | 3800  | 0.0008          | 0.3142 |
| 0.0181        | 39.4089 | 4000  | 0.0003          | 0.3105 |
| 0.0192        | 41.3793 | 4200  | 0.0010          | 0.3093 |
| 0.0219        | 43.3498 | 4400  | 0.0010          | 0.3117 |
| 0.0139        | 45.3202 | 4600  | 0.0010          | 0.3105 |
| 0.0125        | 47.2906 | 4800  | 0.0001          | 0.3068 |
| 0.0107        | 49.2611 | 5000  | 0.0002          | 0.3068 |
| 0.0119        | 51.2315 | 5200  | 0.0128          | 0.3240 |
| 0.0104        | 53.2020 | 5400  | 0.0001          | 0.3068 |
| 0.0093        | 55.1724 | 5600  | 0.0001          | 0.3142 |
| 0.0085        | 57.1429 | 5800  | 0.0001          | 0.3068 |
| 0.0087        | 59.1133 | 6000  | 0.0001          | 0.3068 |
| 0.0082        | 61.0837 | 6200  | 0.0001          | 0.3068 |
| 0.0079        | 63.0542 | 6400  | 0.0001          | 0.3068 |
| 0.0083        | 65.0246 | 6600  | 0.0001          | 0.3068 |
| 0.0056        | 66.9951 | 6800  | 0.0001          | 0.3068 |
| 0.0099        | 68.9655 | 7000  | 0.0001          | 0.3068 |
| 0.0052        | 70.9360 | 7200  | 0.0000          | 0.3068 |
| 0.0051        | 72.9064 | 7400  | 0.0000          | 0.3068 |
| 0.0061        | 74.8768 | 7600  | 0.0001          | 0.3068 |
| 0.0035        | 76.8473 | 7800  | 0.0000          | 0.3081 |
| 0.0051        | 78.8177 | 8000  | 0.0000          | 0.3068 |
| 0.0037        | 80.7882 | 8200  | 0.0000          | 0.3068 |
| 0.0036        | 82.7586 | 8400  | 0.0000          | 0.3068 |
| 0.0041        | 84.7291 | 8600  | 0.0002          | 0.3068 |
| 0.0031        | 86.6995 | 8800  | 0.0006          | 0.3068 |
| 0.0026        | 88.6700 | 9000  | 0.0000          | 0.3068 |
| 0.0017        | 90.6404 | 9200  | 0.0000          | 0.3068 |
| 0.0023        | 92.6108 | 9400  | 0.0000          | 0.3068 |
| 0.0018        | 94.5813 | 9600  | 0.0000          | 0.3068 |
| 0.002         | 96.5517 | 9800  | 0.0000          | 0.3068 |
| 0.0018        | 98.5222 | 10000 | 0.0000          | 0.3068 |
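
For reference, WER values like those reported above can be computed with the `evaluate` library; the strings below are placeholders, not data from this evaluation set (the final eval WER reported on this card is 0.3068).

```python
import evaluate

wer_metric = evaluate.load("wer")

# Placeholder hypothesis/reference pairs; in practice these come from
# decoding the evaluation split with the model above.
predictions = ["the transcribed hypothesis"]
references = ["the reference transcript"]

print(wer_metric.compute(predictions=predictions, references=references))
```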


### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1