---
library_name: transformers
license: mit
base_model: facebook/w2v-bert-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: w2v-bert-2.0-lg-cv-1hr-v2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# w2v-bert-2.0-lg-cv-1hr-v2

This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.8417
- Model Preparation Time: 0.0129
- Wer: 0.9997
- Cer: 0.9914
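
Below is a minimal transcription sketch, assuming 16 kHz mono input audio and that the processor (feature extractor and CTC tokenizer) was saved alongside this checkpoint; the repository id is shown without its namespace and `audio.wav` is a placeholder file.

```python
# Hedged usage sketch: checkpoint id (namespace omitted) and audio path are placeholders.
import torch
import librosa
from transformers import AutoModelForCTC, AutoProcessor

model_id = "w2v-bert-2.0-lg-cv-1hr-v2"  # replace with the full hub id
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# Load and resample the recording to the 16 kHz rate the feature extractor expects.
speech, _ = librosa.load("audio.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most probable token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```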

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
- mixed_precision_training: Native AMP
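
The list above corresponds to standard `transformers.TrainingArguments` fields; a sketch of that mapping is below. The `output_dir` value and the `fp16` flag are assumptions (the latter based on the "Native AMP" note), not values read from a training script.

```python
# Hedged sketch: how the reported hyperparameters map onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-lg-cv-1hr-v2",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # gives the total train batch size of 16
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```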

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Model Preparation Time | Wer    | Cer    |
|:-------------:|:-------:|:----:|:---------------:|:----------------------:|:------:|:------:|
| 15.3055       | 0.9859  | 35   | 12.2381         | 0.0129                 | 1.0    | 1.0    |
| 9.6208        | 2.0     | 71   | 8.3440          | 0.0129                 | 1.0    | 1.0    |
| 8.5028        | 2.9859  | 106  | 7.9784          | 0.0129                 | 1.0    | 1.0    |
| 7.9601        | 4.0     | 142  | 7.7040          | 0.0129                 | 1.0    | 1.0    |
| 7.9111        | 4.9859  | 177  | 7.4474          | 0.0129                 | 1.0    | 1.0    |
| 7.4259        | 6.0     | 213  | 7.1874          | 0.0129                 | 1.0    | 1.0    |
| 7.3711        | 6.9859  | 248  | 6.9404          | 0.0129                 | 1.0    | 1.0    |
| 6.9121        | 8.0     | 284  | 6.6929          | 0.0129                 | 1.0    | 1.0    |
| 6.8465        | 8.9859  | 319  | 6.4528          | 0.0129                 | 1.0    | 1.0    |
| 6.4091        | 10.0    | 355  | 6.2112          | 0.0129                 | 1.0    | 1.0    |
| 6.3427        | 10.9859 | 390  | 5.9794          | 0.0129                 | 1.0    | 1.0    |
| 5.9281        | 12.0    | 426  | 5.7489          | 0.0129                 | 1.0    | 1.0    |
| 5.861         | 12.9859 | 461  | 5.5291          | 0.0129                 | 1.0    | 1.0    |
| 5.4728        | 14.0    | 497  | 5.3136          | 0.0129                 | 1.0    | 1.0    |
| 5.4055        | 14.9859 | 532  | 5.1116          | 0.0129                 | 1.0    | 1.0    |
| 5.05          | 16.0    | 568  | 4.9106          | 0.0129                 | 1.0    | 1.0    |
| 4.9891        | 16.9859 | 603  | 4.7271          | 0.0129                 | 1.0    | 1.0    |
| 4.6647        | 18.0    | 639  | 4.5480          | 0.0129                 | 1.0    | 1.0    |
| 4.6156        | 18.9859 | 674  | 4.3846          | 0.0129                 | 1.0    | 1.0    |
| 4.3257        | 20.0    | 710  | 4.2293          | 0.0129                 | 1.0    | 1.0    |
| 4.2913        | 20.9859 | 745  | 4.0908          | 0.0129                 | 1.0    | 1.0    |
| 4.0311        | 22.0    | 781  | 3.9577          | 0.0129                 | 1.0    | 1.0    |
| 4.0132        | 22.9859 | 816  | 3.8405          | 0.0129                 | 1.0    | 1.0    |
| 3.7827        | 24.0    | 852  | 3.7315          | 0.0129                 | 1.0    | 1.0    |
| 3.7818        | 24.9859 | 887  | 3.6348          | 0.0129                 | 1.0    | 1.0    |
| 3.581         | 26.0    | 923  | 3.5459          | 0.0129                 | 1.0    | 1.0    |
| 3.5949        | 26.9859 | 958  | 3.4699          | 0.0129                 | 1.0    | 1.0    |
| 3.4195        | 28.0    | 994  | 3.3998          | 0.0129                 | 1.0    | 1.0    |
| 3.4464        | 28.9859 | 1029 | 3.3396          | 0.0129                 | 1.0    | 1.0    |
| 3.2914        | 30.0    | 1065 | 3.2848          | 0.0129                 | 1.0    | 1.0    |
| 3.3323        | 30.9859 | 1100 | 3.2404          | 0.0129                 | 1.0    | 1.0    |
| 3.1943        | 32.0    | 1136 | 3.1985          | 0.0129                 | 1.0    | 1.0    |
| 3.2449        | 32.9859 | 1171 | 3.1625          | 0.0129                 | 1.0    | 1.0    |
| 3.1197        | 34.0    | 1207 | 3.1302          | 0.0129                 | 1.0    | 1.0    |
| 3.1765        | 34.9859 | 1242 | 3.1066          | 0.0129                 | 1.0    | 1.0    |
| 3.0618        | 36.0    | 1278 | 3.0819          | 0.0129                 | 1.0    | 1.0    |
| 3.1256        | 36.9859 | 1313 | 3.0686          | 0.0129                 | 1.0    | 1.0    |
| 3.0218        | 38.0    | 1349 | 3.0477          | 0.0129                 | 1.0    | 1.0    |
| 3.09          | 38.9859 | 1384 | 3.0354          | 0.0129                 | 1.0    | 1.0    |
| 2.9895        | 40.0    | 1420 | 3.0255          | 0.0129                 | 1.0    | 1.0    |
| 3.0632        | 40.9859 | 1455 | 3.0127          | 0.0129                 | 1.0    | 1.0    |
| 2.9671        | 42.0    | 1491 | 3.0028          | 0.0129                 | 1.0    | 1.0    |
| 3.0415        | 42.9859 | 1526 | 2.9959          | 0.0129                 | 1.0    | 1.0    |
| 2.9499        | 44.0    | 1562 | 2.9881          | 0.0129                 | 1.0    | 1.0    |
| 3.0269        | 44.9859 | 1597 | 2.9858          | 0.0129                 | 1.0    | 1.0    |
| 2.9369        | 46.0    | 1633 | 2.9776          | 0.0129                 | 1.0    | 1.0    |
| 3.0154        | 46.9859 | 1668 | 2.9727          | 0.0129                 | 1.0    | 1.0    |
| 2.9269        | 48.0    | 1704 | 2.9696          | 0.0129                 | 1.0    | 1.0    |
| 3.0057        | 48.9859 | 1739 | 2.9655          | 0.0129                 | 1.0    | 1.0    |
| 2.9185        | 50.0    | 1775 | 2.9613          | 0.0129                 | 1.0    | 1.0    |
| 2.9982        | 50.9859 | 1810 | 2.9593          | 0.0129                 | 1.0    | 1.0    |
| 2.9112        | 52.0    | 1846 | 2.9555          | 0.0129                 | 1.0    | 1.0    |
| 2.9912        | 52.9859 | 1881 | 2.9532          | 0.0129                 | 1.0    | 1.0    |
| 2.9047        | 54.0    | 1917 | 2.9496          | 0.0129                 | 1.0    | 1.0    |
| 2.9844        | 54.9859 | 1952 | 2.9486          | 0.0129                 | 1.0    | 1.0    |
| 2.8984        | 56.0    | 1988 | 2.9454          | 0.0129                 | 1.0    | 1.0    |
| 2.9786        | 56.9859 | 2023 | 2.9435          | 0.0129                 | 1.0    | 1.0    |
| 2.8928        | 58.0    | 2059 | 2.9391          | 0.0129                 | 1.0    | 1.0    |
| 2.9716        | 58.9859 | 2094 | 2.9357          | 0.0129                 | 1.0    | 1.0    |
| 2.8834        | 60.0    | 2130 | 2.9296          | 0.0129                 | 1.0    | 1.0    |
| 2.9603        | 60.9859 | 2165 | 2.9241          | 0.0129                 | 1.0    | 1.0    |
| 2.87          | 62.0    | 2201 | 2.9152          | 0.0129                 | 1.0    | 1.0    |
| 2.9421        | 62.9859 | 2236 | 2.9050          | 0.0129                 | 1.0    | 1.0    |
| 2.8491        | 64.0    | 2272 | 2.8932          | 0.0129                 | 1.0    | 1.0    |
| 2.9179        | 64.9859 | 2307 | 2.8783          | 0.0129                 | 1.0    | 1.0    |
| 2.8239        | 66.0    | 2343 | 2.8657          | 0.0129                 | 1.0    | 0.9974 |
| 2.8902        | 66.9859 | 2378 | 2.8543          | 0.0129                 | 1.0    | 0.9963 |
| 2.7972        | 68.0    | 2414 | 2.8407          | 0.0129                 | 1.0    | 0.9955 |
| 2.8628        | 68.9859 | 2449 | 2.8276          | 0.0129                 | 1.0    | 0.9936 |
| 2.7694        | 70.0    | 2485 | 2.8108          | 0.0129                 | 1.0    | 0.9945 |
| 2.831         | 70.9859 | 2520 | 2.7947          | 0.0129                 | 0.9996 | 0.9919 |
| 2.735         | 72.0    | 2556 | 2.7773          | 0.0129                 | 0.9998 | 0.9888 |
| 2.7981        | 72.9859 | 2591 | 2.7636          | 0.0129                 | 0.9998 | 0.9870 |
| 2.7062        | 74.0    | 2627 | 2.7507          | 0.0129                 | 0.9998 | 0.9846 |
| 2.7699        | 74.9859 | 2662 | 2.7373          | 0.0129                 | 0.9998 | 0.9849 |
| 2.6797        | 76.0    | 2698 | 2.7237          | 0.0129                 | 0.9996 | 0.9818 |
| 2.7434        | 76.9859 | 2733 | 2.7133          | 0.0129                 | 1.0    | 0.9806 |
| 2.6558        | 78.0    | 2769 | 2.7024          | 0.0129                 | 0.9996 | 0.9779 |
| 2.7204        | 78.9859 | 2804 | 2.6910          | 0.0129                 | 0.9998 | 0.9763 |
| 2.6344        | 80.0    | 2840 | 2.6817          | 0.0129                 | 0.9998 | 0.9727 |
| 2.7002        | 80.9859 | 2875 | 2.6726          | 0.0129                 | 0.9998 | 0.9690 |
| 2.6166        | 82.0    | 2911 | 2.6645          | 0.0129                 | 0.9998 | 0.9655 |
| 2.6827        | 82.9859 | 2946 | 2.6571          | 0.0129                 | 1.0    | 0.9599 |
| 2.6014        | 84.0    | 2982 | 2.6503          | 0.0129                 | 1.0    | 0.9549 |
| 2.6693        | 84.9859 | 3017 | 2.6444          | 0.0129                 | 1.0    | 0.9497 |
| 2.5889        | 86.0    | 3053 | 2.6391          | 0.0129                 | 1.0    | 0.9434 |
| 2.6577        | 86.9859 | 3088 | 2.6350          | 0.0129                 | 1.0    | 0.9354 |
| 2.5795        | 88.0    | 3124 | 2.6305          | 0.0129                 | 1.0    | 0.9290 |
| 2.6494        | 88.9859 | 3159 | 2.6275          | 0.0129                 | 1.0    | 0.9249 |
| 2.5731        | 90.0    | 3195 | 2.6248          | 0.0129                 | 1.0    | 0.9217 |
| 2.6435        | 90.9859 | 3230 | 2.6222          | 0.0129                 | 1.0    | 0.9140 |
| 2.5678        | 92.0    | 3266 | 2.6206          | 0.0129                 | 1.0    | 0.9128 |
| 2.6399        | 92.9859 | 3301 | 2.6193          | 0.0129                 | 1.0    | 0.9088 |
| 2.5653        | 94.0    | 3337 | 2.6183          | 0.0129                 | 1.0    | 0.9070 |
| 2.6379        | 94.9859 | 3372 | 2.6177          | 0.0129                 | 1.0    | 0.9043 |
| 2.5642        | 96.0    | 3408 | 2.6175          | 0.0129                 | 1.0    | 0.9052 |
| 2.6369        | 96.9859 | 3443 | 2.6173          | 0.0129                 | 1.0    | 0.9040 |
| 2.5639        | 98.0    | 3479 | 2.6173          | 0.0129                 | 1.0    | 0.9043 |
| 2.5974        | 98.5915 | 3500 | 2.6173          | 0.0129                 | 1.0    | 0.9044 |
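
For reference, the Wer and Cer columns above are word and character error rates; a minimal sketch of how such values are typically computed with the Hugging Face `evaluate` library follows. The example strings are placeholders, and the exact metric code used for this run is not recorded in the card.

```python
# Hedged sketch: computing WER and CER with the `evaluate` library.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["the model transcribed this"]        # placeholder decoded outputs
references = ["the model transcribed this audio"]   # placeholder ground truth

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```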


### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.0+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1