# hubert_zeroth_gpu_scratch
This model is a fine-tuned version of an unspecified base model (the base checkpoint is not named in this card) on the zeroth_korean_asr dataset. It achieves the following results on the evaluation set:
- Loss: 4.8280
- Wer: 1.0
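
A WER of 1.0 corresponds to a 100% word error rate, i.e. no reference words were recovered on the evaluation set (this can happen, for example, when a CTC-style model collapses to empty output). For orientation, the sketch below shows how WER is commonly computed with the `evaluate` library; note that `evaluate` is not among the framework versions listed in this card, and the strings are placeholders rather than outputs of this model.

```python
# Hedged WER sketch using the `evaluate` library (not listed in this card's
# framework versions). The strings are placeholders, not outputs of this model.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["안녕하세요 반갑습니다"]        # hypothetical model transcription
references = ["안녕하세요 만나서 반갑습니다"]  # hypothetical ground-truth transcript

# WER = (substitutions + deletions + insertions) / number of reference words
wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")  # 1 deletion over 3 reference words -> 0.33; 1.0 means every word is wrong
```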
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
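
Note that the total train batch size of 32 is train_batch_size 16 × gradient_accumulation_steps 2, and the Adam betas/epsilon above are the library defaults. A hedged sketch of how these values map onto `transformers.TrainingArguments` (the output directory is a placeholder; model- and data-specific arguments are omitted):

```python
# Sketch of a TrainingArguments configuration mirroring the hyperparameters above.
# output_dir is a placeholder; model/data specifics are intentionally omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert_zeroth_gpu_scratch",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed-precision training
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
```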
### Training results

Across training, the validation loss plateaus near 4.83 and the WER stays at 1.0 at every logged step:
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
10.6349 | 0.14 | 100 | 4.8579 | 1.0 |
4.7539 | 0.29 | 200 | 4.7308 | 1.0 |
4.7255 | 0.43 | 300 | 4.7278 | 1.0 |
4.7124 | 0.57 | 400 | 5.3295 | 1.0 |
4.7543 | 0.72 | 500 | 4.7487 | 1.0 |
4.8932 | 0.86 | 600 | 4.9136 | 1.0 |
4.8533 | 1.01 | 700 | 4.8799 | 1.0 |
4.8483 | 1.15 | 800 | 4.8665 | 1.0 |
4.8424 | 1.29 | 900 | 4.8622 | 1.0 |
4.8426 | 1.44 | 1000 | 4.8506 | 1.0 |
4.8373 | 1.58 | 1100 | 4.8603 | 1.0 |
4.8452 | 1.72 | 1200 | 4.8537 | 1.0 |
4.8391 | 1.87 | 1300 | 4.8520 | 1.0 |
4.8405 | 2.01 | 1400 | 4.8682 | 1.0 |
4.8375 | 2.16 | 1500 | 4.8637 | 1.0 |
4.8413 | 2.3 | 1600 | 4.8664 | 1.0 |
4.8388 | 2.44 | 1700 | 4.8473 | 1.0 |
4.8389 | 2.59 | 1800 | 4.8484 | 1.0 |
4.8343 | 2.73 | 1900 | 4.8629 | 1.0 |
4.8294 | 2.87 | 2000 | 4.8571 | 1.0 |
4.827 | 3.02 | 2100 | 4.8472 | 1.0 |
4.8316 | 3.16 | 2200 | 4.8576 | 1.0 |
4.8241 | 3.3 | 2300 | 4.8398 | 1.0 |
4.8333 | 3.45 | 2400 | 4.8603 | 1.0 |
4.8387 | 3.59 | 2500 | 4.8484 | 1.0 |
4.8312 | 3.74 | 2600 | 4.8420 | 1.0 |
4.8304 | 3.88 | 2700 | 4.8398 | 1.0 |
4.8291 | 4.02 | 2800 | 4.8355 | 1.0 |
4.8326 | 4.17 | 2900 | 4.8415 | 1.0 |
4.8274 | 4.31 | 3000 | 4.8338 | 1.0 |
4.8245 | 4.45 | 3100 | 4.8389 | 1.0 |
4.83 | 4.6 | 3200 | 4.8332 | 1.0 |
4.8335 | 4.74 | 3300 | 4.8393 | 1.0 |
4.829 | 4.89 | 3400 | 4.8352 | 1.0 |
4.832 | 5.03 | 3500 | 4.8329 | 1.0 |
4.8285 | 5.17 | 3600 | 4.8343 | 1.0 |
4.8302 | 5.32 | 3700 | 4.8381 | 1.0 |
4.8371 | 5.46 | 3800 | 4.8426 | 1.0 |
4.8226 | 5.6 | 3900 | 4.8383 | 1.0 |
4.8257 | 5.75 | 4000 | 4.8372 | 1.0 |
4.8222 | 5.89 | 4100 | 4.8332 | 1.0 |
4.8255 | 6.03 | 4200 | 4.8437 | 1.0 |
4.8277 | 6.18 | 4300 | 4.8351 | 1.0 |
4.8257 | 6.32 | 4400 | 4.8368 | 1.0 |
4.8301 | 6.47 | 4500 | 4.8345 | 1.0 |
4.8267 | 6.61 | 4600 | 4.8343 | 1.0 |
4.8296 | 6.75 | 4700 | 4.8388 | 1.0 |
4.828 | 6.9 | 4800 | 4.8374 | 1.0 |
4.8173 | 7.04 | 4900 | 4.8375 | 1.0 |
4.8234 | 7.18 | 5000 | 4.8348 | 1.0 |
4.8233 | 7.33 | 5100 | 4.8349 | 1.0 |
4.8232 | 7.47 | 5200 | 4.8339 | 1.0 |
4.8293 | 7.61 | 5300 | 4.8386 | 1.0 |
4.8305 | 7.76 | 5400 | 4.8385 | 1.0 |
4.8253 | 7.9 | 5500 | 4.8315 | 1.0 |
4.823 | 8.05 | 5600 | 4.8325 | 1.0 |
4.8313 | 8.19 | 5700 | 4.8311 | 1.0 |
4.8284 | 8.33 | 5800 | 4.8329 | 1.0 |
4.8199 | 8.48 | 5900 | 4.8329 | 1.0 |
4.8208 | 8.62 | 6000 | 4.8319 | 1.0 |
4.8315 | 8.76 | 6100 | 4.8334 | 1.0 |
4.8265 | 8.91 | 6200 | 4.8308 | 1.0 |
4.8218 | 9.05 | 6300 | 4.8313 | 1.0 |
4.8172 | 9.2 | 6400 | 4.8294 | 1.0 |
4.8231 | 9.34 | 6500 | 4.8299 | 1.0 |
4.825 | 9.48 | 6600 | 4.8311 | 1.0 |
4.826 | 9.63 | 6700 | 4.8299 | 1.0 |
4.8269 | 9.77 | 6800 | 4.8321 | 1.0 |
4.8275 | 9.91 | 6900 | 4.8306 | 1.0 |
4.8199 | 10.06 | 7000 | 4.8302 | 1.0 |
4.8217 | 10.2 | 7100 | 4.8316 | 1.0 |
4.8237 | 10.34 | 7200 | 4.8296 | 1.0 |
4.8253 | 10.49 | 7300 | 4.8318 | 1.0 |
4.8256 | 10.63 | 7400 | 4.8320 | 1.0 |
4.8265 | 10.78 | 7500 | 4.8297 | 1.0 |
4.8201 | 10.92 | 7600 | 4.8309 | 1.0 |
4.8259 | 11.06 | 7700 | 4.8302 | 1.0 |
4.8216 | 11.21 | 7800 | 4.8315 | 1.0 |
4.8206 | 11.35 | 7900 | 4.8328 | 1.0 |
4.8249 | 11.49 | 8000 | 4.8290 | 1.0 |
4.8231 | 11.64 | 8100 | 4.8297 | 1.0 |
4.8232 | 11.78 | 8200 | 4.8303 | 1.0 |
4.8245 | 11.93 | 8300 | 4.8283 | 1.0 |
4.8224 | 12.07 | 8400 | 4.8309 | 1.0 |
4.822 | 12.21 | 8500 | 4.8341 | 1.0 |
4.8234 | 12.36 | 8600 | 4.8300 | 1.0 |
4.8233 | 12.5 | 8700 | 4.8302 | 1.0 |
4.825 | 12.64 | 8800 | 4.8301 | 1.0 |
4.8246 | 12.79 | 8900 | 4.8310 | 1.0 |
4.8169 | 12.93 | 9000 | 4.8308 | 1.0 |
4.8194 | 13.07 | 9100 | 4.8319 | 1.0 |
4.8182 | 13.22 | 9200 | 4.8334 | 1.0 |
4.8245 | 13.36 | 9300 | 4.8334 | 1.0 |
4.8274 | 13.51 | 9400 | 4.8427 | 1.0 |
4.8194 | 13.65 | 9500 | 4.8393 | 1.0 |
4.825 | 13.79 | 9600 | 4.8368 | 1.0 |
4.8162 | 13.94 | 9700 | 4.8371 | 1.0 |
4.8213 | 14.08 | 9800 | 4.8359 | 1.0 |
4.8275 | 14.22 | 9900 | 4.8330 | 1.0 |
4.8119 | 14.37 | 10000 | 4.8328 | 1.0 |
4.8267 | 14.51 | 10100 | 4.8327 | 1.0 |
4.8218 | 14.66 | 10200 | 4.8328 | 1.0 |
4.8221 | 14.8 | 10300 | 4.8344 | 1.0 |
4.8181 | 14.94 | 10400 | 4.8330 | 1.0 |
4.8204 | 15.09 | 10500 | 4.8326 | 1.0 |
4.8235 | 15.23 | 10600 | 4.8340 | 1.0 |
4.8113 | 15.37 | 10700 | 4.8330 | 1.0 |
4.8268 | 15.52 | 10800 | 4.8330 | 1.0 |
4.8199 | 15.66 | 10900 | 4.8341 | 1.0 |
4.8213 | 15.8 | 11000 | 4.8320 | 1.0 |
4.8268 | 15.95 | 11100 | 4.8345 | 1.0 |
4.8113 | 16.09 | 11200 | 4.8367 | 1.0 |
4.8216 | 16.24 | 11300 | 4.8358 | 1.0 |
4.8287 | 16.38 | 11400 | 4.8343 | 1.0 |
4.8185 | 16.52 | 11500 | 4.8341 | 1.0 |
4.8226 | 16.67 | 11600 | 4.8321 | 1.0 |
4.8187 | 16.81 | 11700 | 4.8337 | 1.0 |
4.8183 | 16.95 | 11800 | 4.8324 | 1.0 |
4.8173 | 17.1 | 11900 | 4.8334 | 1.0 |
4.8217 | 17.24 | 12000 | 4.8338 | 1.0 |
4.8174 | 17.39 | 12100 | 4.8323 | 1.0 |
4.8193 | 17.53 | 12200 | 4.8358 | 1.0 |
4.8203 | 17.67 | 12300 | 4.8313 | 1.0 |
4.8182 | 17.82 | 12400 | 4.8311 | 1.0 |
4.8245 | 17.96 | 12500 | 4.8324 | 1.0 |
4.8195 | 18.1 | 12600 | 4.8301 | 1.0 |
4.8197 | 18.25 | 12700 | 4.8345 | 1.0 |
4.8163 | 18.39 | 12800 | 4.8326 | 1.0 |
4.8227 | 18.53 | 12900 | 4.8319 | 1.0 |
4.8254 | 18.68 | 13000 | 4.8321 | 1.0 |
4.8197 | 18.82 | 13100 | 4.8315 | 1.0 |
4.819 | 18.97 | 13200 | 4.8306 | 1.0 |
4.8106 | 19.11 | 13300 | 4.8297 | 1.0 |
4.8161 | 19.25 | 13400 | 4.8314 | 1.0 |
4.8147 | 19.4 | 13500 | 4.8340 | 1.0 |
4.8237 | 19.54 | 13600 | 4.8313 | 1.0 |
4.8186 | 19.68 | 13700 | 4.8298 | 1.0 |
4.8217 | 19.83 | 13800 | 4.8302 | 1.0 |
4.8239 | 19.97 | 13900 | 4.8297 | 1.0 |
4.8189 | 20.11 | 14000 | 4.8313 | 1.0 |
4.8254 | 20.26 | 14100 | 4.8299 | 1.0 |
4.8166 | 20.4 | 14200 | 4.8297 | 1.0 |
4.8199 | 20.55 | 14300 | 4.8294 | 1.0 |
4.8129 | 20.69 | 14400 | 4.8307 | 1.0 |
4.8175 | 20.83 | 14500 | 4.8285 | 1.0 |
4.8195 | 20.98 | 14600 | 4.8281 | 1.0 |
4.82 | 21.12 | 14700 | 4.8293 | 1.0 |
4.8136 | 21.26 | 14800 | 4.8293 | 1.0 |
4.8177 | 21.41 | 14900 | 4.8287 | 1.0 |
4.826 | 21.55 | 15000 | 4.8288 | 1.0 |
4.8177 | 21.7 | 15100 | 4.8296 | 1.0 |
4.8165 | 21.84 | 15200 | 4.8303 | 1.0 |
4.8246 | 21.98 | 15300 | 4.8282 | 1.0 |
4.8146 | 22.13 | 15400 | 4.8276 | 1.0 |
4.819 | 22.27 | 15500 | 4.8279 | 1.0 |
4.814 | 22.41 | 15600 | 4.8295 | 1.0 |
4.8195 | 22.56 | 15700 | 4.8274 | 1.0 |
4.8189 | 22.7 | 15800 | 4.8275 | 1.0 |
4.822 | 22.84 | 15900 | 4.8274 | 1.0 |
4.8195 | 22.99 | 16000 | 4.8274 | 1.0 |
4.8146 | 23.13 | 16100 | 4.8274 | 1.0 |
4.8126 | 23.28 | 16200 | 4.8271 | 1.0 |
4.8172 | 23.42 | 16300 | 4.8272 | 1.0 |
4.8214 | 23.56 | 16400 | 4.8277 | 1.0 |
4.821 | 23.71 | 16500 | 4.8278 | 1.0 |
4.8212 | 23.85 | 16600 | 4.8274 | 1.0 |
4.819 | 23.99 | 16700 | 4.8277 | 1.0 |
4.8165 | 24.14 | 16800 | 4.8274 | 1.0 |
4.8212 | 24.28 | 16900 | 4.8268 | 1.0 |
4.8198 | 24.43 | 17000 | 4.8272 | 1.0 |
4.8228 | 24.57 | 17100 | 4.8281 | 1.0 |
4.8159 | 24.71 | 17200 | 4.8272 | 1.0 |
4.8123 | 24.86 | 17300 | 4.8274 | 1.0 |
4.8143 | 25.0 | 17400 | 4.8284 | 1.0 |
4.8174 | 25.14 | 17500 | 4.8289 | 1.0 |
4.8243 | 25.29 | 17600 | 4.8276 | 1.0 |
4.8145 | 25.43 | 17700 | 4.8283 | 1.0 |
4.8129 | 25.57 | 17800 | 4.8277 | 1.0 |
4.815 | 25.72 | 17900 | 4.8272 | 1.0 |
4.8155 | 25.86 | 18000 | 4.8279 | 1.0 |
4.8217 | 26.01 | 18100 | 4.8269 | 1.0 |
4.8106 | 26.15 | 18200 | 4.8277 | 1.0 |
4.8188 | 26.29 | 18300 | 4.8270 | 1.0 |
4.8232 | 26.44 | 18400 | 4.8277 | 1.0 |
4.816 | 26.58 | 18500 | 4.8278 | 1.0 |
4.8159 | 26.72 | 18600 | 4.8275 | 1.0 |
4.8199 | 26.87 | 18700 | 4.8274 | 1.0 |
4.8149 | 27.01 | 18800 | 4.8278 | 1.0 |
4.8103 | 27.16 | 18900 | 4.8279 | 1.0 |
4.8244 | 27.3 | 19000 | 4.8275 | 1.0 |
4.8217 | 27.44 | 19100 | 4.8279 | 1.0 |
4.8168 | 27.59 | 19200 | 4.8277 | 1.0 |
4.8111 | 27.73 | 19300 | 4.8287 | 1.0 |
4.816 | 27.87 | 19400 | 4.8279 | 1.0 |
4.8166 | 28.02 | 19500 | 4.8282 | 1.0 |
4.8129 | 28.16 | 19600 | 4.8281 | 1.0 |
4.8207 | 28.3 | 19700 | 4.8275 | 1.0 |
4.8196 | 28.45 | 19800 | 4.8274 | 1.0 |
4.8208 | 28.59 | 19900 | 4.8277 | 1.0 |
4.811 | 28.74 | 20000 | 4.8280 | 1.0 |
4.8176 | 28.88 | 20100 | 4.8280 | 1.0 |
4.8126 | 29.02 | 20200 | 4.8283 | 1.0 |
4.8161 | 29.17 | 20300 | 4.8279 | 1.0 |
4.8134 | 29.31 | 20400 | 4.8278 | 1.0 |
4.8201 | 29.45 | 20500 | 4.8279 | 1.0 |
4.8185 | 29.6 | 20600 | 4.8283 | 1.0 |
4.8174 | 29.74 | 20700 | 4.8280 | 1.0 |
4.8145 | 29.89 | 20800 | 4.8280 | 1.0 |
### Framework versions
- Transformers 4.24.0
- Pytorch 1.13.0+cu117
- Datasets 2.0.0
- Tokenizers 0.13.2
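
For reproducibility, these versions can be checked at runtime; a minimal sketch, assuming all four packages are importable in the current environment:

```python
# Verify that the installed packages match the versions listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.24.0",
    "torch": "1.13.0+cu117",
    "datasets": "2.0.0",
    "tokenizers": "0.13.2",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}

for name, want in expected.items():
    status = "OK" if installed[name] == want else f"mismatch (found {installed[name]})"
    print(f"{name}: expected {want} -> {status}")
```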