# hubert_zeroth_gpu
This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on the zeroth_korean_asr dataset. It achieves the following results on the evaluation set:
- Loss: 4.8302
- Wer: 1.0
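The card does not include usage code. The sketch below shows one way to run greedy CTC decoding with this checkpoint, assuming it follows the standard HuBERT-for-CTC layout produced by the common CTC fine-tuning scripts (`HubertForCTC` plus a `Wav2Vec2Processor`); the repo id and audio path are placeholders.

```python
# Minimal inference sketch (assumptions: the checkpoint is a HubertForCTC model with a
# Wav2Vec2Processor, as produced by the standard CTC fine-tuning scripts; repo id and
# audio path below are placeholders).
import torch
import torchaudio
from transformers import HubertForCTC, Wav2Vec2Processor

model_id = "hubert_zeroth_gpu"  # hypothetical repo id; replace with the actual Hub path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = HubertForCTC.from_pretrained(model_id)
model.eval()

# HuBERT-base expects 16 kHz mono audio.
waveform, sample_rate = torchaudio.load("sample.wav")  # placeholder file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```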
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
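For reference, here is a sketch of how the hyperparameters above map onto `transformers.TrainingArguments`, assuming the model was trained with the `Trainer` API (the Trainer's default AdamW optimizer uses the Adam betas and epsilon listed above); the output directory is a placeholder.

```python
# Sketch of the hyperparameters above expressed as transformers.TrainingArguments
# (assumption: training used the Trainer API; output_dir is a placeholder).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert_zeroth_gpu",     # placeholder output directory
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,      # effective train batch size: 16 * 2 = 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                          # Native AMP mixed precision
)
```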
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
26.5222 | 0.14 | 100 | 10.9084 | 1.0 |
6.6076 | 0.29 | 200 | 4.8783 | 1.0 |
4.8383 | 0.43 | 300 | 4.8768 | 1.0 |
4.8372 | 0.57 | 400 | 4.8608 | 1.0 |
4.8298 | 0.72 | 500 | 4.8625 | 1.0 |
4.8377 | 0.86 | 600 | 4.8646 | 1.0 |
4.829 | 1.01 | 700 | 4.8472 | 1.0 |
4.8282 | 1.15 | 800 | 4.8435 | 1.0 |
4.8282 | 1.29 | 900 | 4.8438 | 1.0 |
4.8299 | 1.44 | 1000 | 4.8540 | 1.0 |
4.8276 | 1.58 | 1100 | 4.8408 | 1.0 |
4.8306 | 1.72 | 1200 | 4.8390 | 1.0 |
4.8315 | 1.87 | 1300 | 4.8426 | 1.0 |
4.8296 | 2.01 | 1400 | 4.8418 | 1.0 |
4.829 | 2.16 | 1500 | 4.8475 | 1.0 |
4.8324 | 2.3 | 1600 | 4.8409 | 1.0 |
4.8299 | 2.44 | 1700 | 4.8360 | 1.0 |
4.8285 | 2.59 | 1800 | 4.8419 | 1.0 |
4.8267 | 2.73 | 1900 | 4.8355 | 1.0 |
4.8232 | 2.87 | 2000 | 4.8445 | 1.0 |
4.8179 | 3.02 | 2100 | 4.8390 | 1.0 |
4.8248 | 3.16 | 2200 | 4.8506 | 1.0 |
4.8184 | 3.3 | 2300 | 4.8392 | 1.0 |
4.8268 | 3.45 | 2400 | 4.8509 | 1.0 |
4.8315 | 3.59 | 2500 | 4.8469 | 1.0 |
4.8249 | 3.74 | 2600 | 4.8457 | 1.0 |
4.8244 | 3.88 | 2700 | 4.8414 | 1.0 |
4.8226 | 4.02 | 2800 | 4.8333 | 1.0 |
4.8275 | 4.17 | 2900 | 4.8344 | 1.0 |
4.8218 | 4.31 | 3000 | 4.8351 | 1.0 |
4.8199 | 4.45 | 3100 | 4.8386 | 1.0 |
4.825 | 4.6 | 3200 | 4.8344 | 1.0 |
4.828 | 4.74 | 3300 | 4.8372 | 1.0 |
4.8228 | 4.89 | 3400 | 4.8349 | 1.0 |
4.8264 | 5.03 | 3500 | 4.8344 | 1.0 |
4.8237 | 5.17 | 3600 | 4.8332 | 1.0 |
4.8269 | 5.32 | 3700 | 4.8376 | 1.0 |
4.833 | 5.46 | 3800 | 4.8380 | 1.0 |
4.8188 | 5.6 | 3900 | 4.8352 | 1.0 |
4.8208 | 5.75 | 4000 | 4.8354 | 1.0 |
4.8177 | 5.89 | 4100 | 4.8291 | 1.0 |
4.8208 | 6.03 | 4200 | 4.8500 | 1.0 |
4.8242 | 6.18 | 4300 | 4.8369 | 1.0 |
4.8222 | 6.32 | 4400 | 4.8366 | 1.0 |
4.8259 | 6.47 | 4500 | 4.8369 | 1.0 |
4.8231 | 6.61 | 4600 | 4.8319 | 1.0 |
4.825 | 6.75 | 4700 | 4.8363 | 1.0 |
4.8245 | 6.9 | 4800 | 4.8420 | 1.0 |
4.8139 | 7.04 | 4900 | 4.8427 | 1.0 |
4.8202 | 7.18 | 5000 | 4.8393 | 1.0 |
4.8196 | 7.33 | 5100 | 4.8380 | 1.0 |
4.8199 | 7.47 | 5200 | 4.8364 | 1.0 |
4.8264 | 7.61 | 5300 | 4.8414 | 1.0 |
4.8259 | 7.76 | 5400 | 4.8397 | 1.0 |
4.8215 | 7.9 | 5500 | 4.8376 | 1.0 |
4.8198 | 8.05 | 5600 | 4.8344 | 1.0 |
4.828 | 8.19 | 5700 | 4.8314 | 1.0 |
4.8246 | 8.33 | 5800 | 4.8361 | 1.0 |
4.8167 | 8.48 | 5900 | 4.8336 | 1.0 |
4.8174 | 8.62 | 6000 | 4.8345 | 1.0 |
4.8283 | 8.76 | 6100 | 4.8363 | 1.0 |
4.8231 | 8.91 | 6200 | 4.8345 | 1.0 |
4.8191 | 9.05 | 6300 | 4.8327 | 1.0 |
4.8144 | 9.2 | 6400 | 4.8299 | 1.0 |
4.8206 | 9.34 | 6500 | 4.8281 | 1.0 |
4.822 | 9.48 | 6600 | 4.8329 | 1.0 |
4.8228 | 9.63 | 6700 | 4.8309 | 1.0 |
4.8239 | 9.77 | 6800 | 4.8348 | 1.0 |
4.8245 | 9.91 | 6900 | 4.8309 | 1.0 |
4.8173 | 10.06 | 7000 | 4.8303 | 1.0 |
4.8188 | 10.2 | 7100 | 4.8335 | 1.0 |
4.8208 | 10.34 | 7200 | 4.8290 | 1.0 |
4.8228 | 10.49 | 7300 | 4.8316 | 1.0 |
4.8226 | 10.63 | 7400 | 4.8272 | 1.0 |
4.824 | 10.78 | 7500 | 4.8309 | 1.0 |
4.8175 | 10.92 | 7600 | 4.8317 | 1.0 |
4.8234 | 11.06 | 7700 | 4.8271 | 1.0 |
4.8188 | 11.21 | 7800 | 4.8291 | 1.0 |
4.8182 | 11.35 | 7900 | 4.8340 | 1.0 |
4.8224 | 11.49 | 8000 | 4.8309 | 1.0 |
4.8207 | 11.64 | 8100 | 4.8308 | 1.0 |
4.8207 | 11.78 | 8200 | 4.8301 | 1.0 |
4.822 | 11.93 | 8300 | 4.8281 | 1.0 |
4.8199 | 12.07 | 8400 | 4.8301 | 1.0 |
4.8198 | 12.21 | 8500 | 4.8337 | 1.0 |
4.8212 | 12.36 | 8600 | 4.8310 | 1.0 |
4.8211 | 12.5 | 8700 | 4.8304 | 1.0 |
4.8226 | 12.64 | 8800 | 4.8303 | 1.0 |
4.8224 | 12.79 | 8900 | 4.8312 | 1.0 |
4.8146 | 12.93 | 9000 | 4.8362 | 1.0 |
4.8173 | 13.07 | 9100 | 4.8321 | 1.0 |
4.816 | 13.22 | 9200 | 4.8347 | 1.0 |
4.8219 | 13.36 | 9300 | 4.8377 | 1.0 |
4.8251 | 13.51 | 9400 | 4.8403 | 1.0 |
4.8173 | 13.65 | 9500 | 4.8387 | 1.0 |
4.8226 | 13.79 | 9600 | 4.8375 | 1.0 |
4.8137 | 13.94 | 9700 | 4.8364 | 1.0 |
4.819 | 14.08 | 9800 | 4.8323 | 1.0 |
4.8258 | 14.22 | 9900 | 4.8329 | 1.0 |
4.8097 | 14.37 | 10000 | 4.8293 | 1.0 |
4.8247 | 14.51 | 10100 | 4.8311 | 1.0 |
4.8197 | 14.66 | 10200 | 4.8306 | 1.0 |
4.8201 | 14.8 | 10300 | 4.8308 | 1.0 |
4.8158 | 14.94 | 10400 | 4.8319 | 1.0 |
4.818 | 15.09 | 10500 | 4.8306 | 1.0 |
4.8216 | 15.23 | 10600 | 4.8343 | 1.0 |
4.8096 | 15.37 | 10700 | 4.8326 | 1.0 |
4.8248 | 15.52 | 10800 | 4.8323 | 1.0 |
4.8178 | 15.66 | 10900 | 4.8358 | 1.0 |
4.8191 | 15.8 | 11000 | 4.8338 | 1.0 |
4.8248 | 15.95 | 11100 | 4.8359 | 1.0 |
4.8095 | 16.09 | 11200 | 4.8392 | 1.0 |
4.8196 | 16.24 | 11300 | 4.8374 | 1.0 |
4.827 | 16.38 | 11400 | 4.8346 | 1.0 |
4.8165 | 16.52 | 11500 | 4.8365 | 1.0 |
4.8206 | 16.67 | 11600 | 4.8344 | 1.0 |
4.8169 | 16.81 | 11700 | 4.8344 | 1.0 |
4.8164 | 16.95 | 11800 | 4.8390 | 1.0 |
4.8159 | 17.1 | 11900 | 4.8367 | 1.0 |
4.8202 | 17.24 | 12000 | 4.8375 | 1.0 |
4.8156 | 17.39 | 12100 | 4.8362 | 1.0 |
4.8174 | 17.53 | 12200 | 4.8410 | 1.0 |
4.8188 | 17.67 | 12300 | 4.8323 | 1.0 |
4.8167 | 17.82 | 12400 | 4.8319 | 1.0 |
4.8229 | 17.96 | 12500 | 4.8347 | 1.0 |
4.8179 | 18.1 | 12600 | 4.8320 | 1.0 |
4.8182 | 18.25 | 12700 | 4.8384 | 1.0 |
4.8151 | 18.39 | 12800 | 4.8374 | 1.0 |
4.8212 | 18.53 | 12900 | 4.8346 | 1.0 |
4.8241 | 18.68 | 13000 | 4.8344 | 1.0 |
4.8184 | 18.82 | 13100 | 4.8352 | 1.0 |
4.8174 | 18.97 | 13200 | 4.8357 | 1.0 |
4.8092 | 19.11 | 13300 | 4.8332 | 1.0 |
4.8149 | 19.25 | 13400 | 4.8347 | 1.0 |
4.813 | 19.4 | 13500 | 4.8376 | 1.0 |
4.8226 | 19.54 | 13600 | 4.8343 | 1.0 |
4.8175 | 19.68 | 13700 | 4.8320 | 1.0 |
4.8203 | 19.83 | 13800 | 4.8339 | 1.0 |
4.8227 | 19.97 | 13900 | 4.8324 | 1.0 |
4.8177 | 20.11 | 14000 | 4.8356 | 1.0 |
4.824 | 20.26 | 14100 | 4.8339 | 1.0 |
4.815 | 20.4 | 14200 | 4.8342 | 1.0 |
4.8189 | 20.55 | 14300 | 4.8340 | 1.0 |
4.8115 | 20.69 | 14400 | 4.8319 | 1.0 |
4.8162 | 20.83 | 14500 | 4.8288 | 1.0 |
4.8183 | 20.98 | 14600 | 4.8321 | 1.0 |
4.8189 | 21.12 | 14700 | 4.8315 | 1.0 |
4.8123 | 21.26 | 14800 | 4.8311 | 1.0 |
4.8165 | 21.41 | 14900 | 4.8321 | 1.0 |
4.8247 | 21.55 | 15000 | 4.8309 | 1.0 |
4.8165 | 21.7 | 15100 | 4.8313 | 1.0 |
4.815 | 21.84 | 15200 | 4.8354 | 1.0 |
4.8234 | 21.98 | 15300 | 4.8300 | 1.0 |
4.8134 | 22.13 | 15400 | 4.8284 | 1.0 |
4.8178 | 22.27 | 15500 | 4.8298 | 1.0 |
4.8128 | 22.41 | 15600 | 4.8309 | 1.0 |
4.8185 | 22.56 | 15700 | 4.8291 | 1.0 |
4.8177 | 22.7 | 15800 | 4.8288 | 1.0 |
4.8208 | 22.84 | 15900 | 4.8306 | 1.0 |
4.8183 | 22.99 | 16000 | 4.8277 | 1.0 |
4.8135 | 23.13 | 16100 | 4.8286 | 1.0 |
4.8116 | 23.28 | 16200 | 4.8275 | 1.0 |
4.816 | 23.42 | 16300 | 4.8290 | 1.0 |
4.8203 | 23.56 | 16400 | 4.8292 | 1.0 |
4.8198 | 23.71 | 16500 | 4.8299 | 1.0 |
4.8203 | 23.85 | 16600 | 4.8294 | 1.0 |
4.8177 | 23.99 | 16700 | 4.8286 | 1.0 |
4.8153 | 24.14 | 16800 | 4.8275 | 1.0 |
4.8201 | 24.28 | 16900 | 4.8259 | 1.0 |
4.8189 | 24.43 | 17000 | 4.8289 | 1.0 |
4.8219 | 24.57 | 17100 | 4.8280 | 1.0 |
4.8148 | 24.71 | 17200 | 4.8284 | 1.0 |
4.8113 | 24.86 | 17300 | 4.8286 | 1.0 |
4.8133 | 25.0 | 17400 | 4.8293 | 1.0 |
4.8164 | 25.14 | 17500 | 4.8302 | 1.0 |
4.8231 | 25.29 | 17600 | 4.8278 | 1.0 |
4.8136 | 25.43 | 17700 | 4.8296 | 1.0 |
4.8118 | 25.57 | 17800 | 4.8288 | 1.0 |
4.8139 | 25.72 | 17900 | 4.8280 | 1.0 |
4.8144 | 25.86 | 18000 | 4.8282 | 1.0 |
4.8206 | 26.01 | 18100 | 4.8279 | 1.0 |
4.8096 | 26.15 | 18200 | 4.8281 | 1.0 |
4.8177 | 26.29 | 18300 | 4.8271 | 1.0 |
4.8222 | 26.44 | 18400 | 4.8289 | 1.0 |
4.8148 | 26.58 | 18500 | 4.8282 | 1.0 |
4.8148 | 26.72 | 18600 | 4.8277 | 1.0 |
4.819 | 26.87 | 18700 | 4.8283 | 1.0 |
4.8138 | 27.01 | 18800 | 4.8290 | 1.0 |
4.8094 | 27.16 | 18900 | 4.8292 | 1.0 |
4.8236 | 27.3 | 19000 | 4.8282 | 1.0 |
4.8208 | 27.44 | 19100 | 4.8293 | 1.0 |
4.816 | 27.59 | 19200 | 4.8281 | 1.0 |
4.8103 | 27.73 | 19300 | 4.8294 | 1.0 |
4.8152 | 27.87 | 19400 | 4.8297 | 1.0 |
4.8158 | 28.02 | 19500 | 4.8305 | 1.0 |
4.8121 | 28.16 | 19600 | 4.8294 | 1.0 |
4.8199 | 28.3 | 19700 | 4.8292 | 1.0 |
4.8185 | 28.45 | 19800 | 4.8288 | 1.0 |
4.8199 | 28.59 | 19900 | 4.8288 | 1.0 |
4.8102 | 28.74 | 20000 | 4.8292 | 1.0 |
4.8168 | 28.88 | 20100 | 4.8291 | 1.0 |
4.8117 | 29.02 | 20200 | 4.8304 | 1.0 |
4.8156 | 29.17 | 20300 | 4.8295 | 1.0 |
4.8126 | 29.31 | 20400 | 4.8296 | 1.0 |
4.8193 | 29.45 | 20500 | 4.8302 | 1.0 |
4.8175 | 29.6 | 20600 | 4.8301 | 1.0 |
4.8167 | 29.74 | 20700 | 4.8301 | 1.0 |
4.8137 | 29.89 | 20800 | 4.8302 | 1.0 |
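The Wer column above stays at 1.0 throughout training, i.e. a 100% word error rate on the evaluation set. A minimal sketch of how such a WER figure can be computed with the `evaluate` library (which uses `jiwer` under the hood) is shown below; the prediction and reference lists are placeholders.

```python
# Minimal sketch of computing WER with the `evaluate` library (assumption: predictions
# and references are plain-text transcripts; the lists below are placeholders).
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["model transcript 1", "model transcript 2"]          # placeholder predictions
references = ["reference transcript 1", "reference transcript 2"]   # placeholder references

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```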
### Framework versions
- Transformers 4.24.0
- Pytorch 1.13.0+cu117
- Datasets 2.0.0
- Tokenizers 0.13.2