
Whisper Large V2

This model is a fine-tuned version of openai/whisper-large-v2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2647
  • WER: 9.9296
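The reported WER of 9.9296 is a percentage: roughly 1 word error per 10 reference words. A minimal sketch of how word error rate is computed (Levenshtein distance over word tokens, divided by reference length); real evaluations typically use a library such as `jiwer` or Hugging Face `evaluate`, usually with text normalization first:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / #reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j] (dynamic programming)
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])  # substitution or match
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)  # vs. deletion/insertion
    return d[-1][-1] / len(ref)

# Hypothetical Dutch example: one substitution + one deletion over 6 reference words
print(wer("de kat zit op de mat", "de kat zat op mat"))  # 2/6 ≈ 0.3333, i.e. 33.33% WER
```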

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 12
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 20
  • num_epochs: 5
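With a `linear` scheduler and 20 warmup steps, the learning rate ramps from 0 to 3e-05 and then decays linearly back to 0 over the run. A minimal sketch of that schedule, assuming the ~1560 total optimizer steps shown in the results table (the Transformers `get_linear_schedule_with_warmup` helper implements the same shape):

```python
def linear_warmup_lr(step: int, base_lr: float = 3e-5,
                     warmup_steps: int = 20, total_steps: int = 1560) -> float:
    """Linear warmup to base_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # ramp up during warmup
    # decay linearly from base_lr (at end of warmup) to 0 (at total_steps)
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(10))    # halfway through warmup: 1.5e-05
print(linear_warmup_lr(20))    # warmup complete: 3e-05
print(linear_warmup_lr(1560))  # end of training: 0.0
```

With only 20 warmup steps against ~1560 total, the schedule is effectively a linear decay from 3e-05 for almost the entire run.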

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.6181        | 0.09  | 30   | 0.3443          | 17.6297 |
| 0.3296        | 0.19  | 60   | 0.2921          | 13.0722 |
| 0.3165        | 0.28  | 90   | 0.2711          | 11.8459 |
| 0.2775        | 0.38  | 120  | 0.2677          | 11.4110 |
| 0.2696        | 0.47  | 150  | 0.2570          | 12.1474 |
| 0.2558        | 0.57  | 180  | 0.2544          | 13.9623 |
| 0.2720        | 0.66  | 210  | 0.2448          | 19.9809 |
| 0.2696        | 0.76  | 240  | 0.2415          | 12.5359 |
| 0.2668        | 0.85  | 270  | 0.2392          | 11.7154 |
| 0.2558        | 0.95  | 300  | 0.2318          | 12.3097 |
| 0.2108        | 1.04  | 330  | 0.2418          | 15.0639 |
| 0.1339        | 1.14  | 360  | 0.2409          | 13.8318 |
| 0.1326        | 1.23  | 390  | 0.2394          | 16.9165 |
| 0.1357        | 1.33  | 420  | 0.2362          | 11.4371 |
| 0.1278        | 1.42  | 450  | 0.2377          | 12.3503 |
| 0.1485        | 1.52  | 480  | 0.2291          | 12.3097 |
| 0.1263        | 1.61  | 510  | 0.2381          | 12.2547 |
| 0.1269        | 1.71  | 540  | 0.2328          | 10.3528 |
| 0.1437        | 1.80  | 570  | 0.2284          | 11.7415 |
| 0.1314        | 1.90  | 600  | 0.2270          | 11.4661 |
| 0.1282        | 1.99  | 630  | 0.2334          | 11.8807 |
| 0.0646        | 2.09  | 660  | 0.2414          | 11.3617 |
| 0.0655        | 2.18  | 690  | 0.2439          | 11.9503 |
| 0.0658        | 2.28  | 720  | 0.2406          | 11.4197 |
| 0.0573        | 2.37  | 750  | 0.2375          | 11.0747 |
| 0.0584        | 2.47  | 780  | 0.2364          | 10.7848 |
| 0.0639        | 2.56  | 810  | 0.2351          | 11.2197 |
| 0.0537        | 2.66  | 840  | 0.2380          | 10.0803 |
| 0.0538        | 2.75  | 870  | 0.2359          | 10.1180 |
| 0.0551        | 2.85  | 900  | 0.2347          | 10.1847 |
| 0.0613        | 2.94  | 930  | 0.2354          | 10.7616 |
| 0.0453        | 3.04  | 960  | 0.2399          | 13.9130 |
| 0.0248        | 3.13  | 990  | 0.2456          | 11.4139 |
| 0.0246        | 3.23  | 1020 | 0.2553          | 11.3936 |
| 0.0229        | 3.32  | 1050 | 0.2477          | 11.3878 |
| 0.0198        | 3.42  | 1080 | 0.2486          | 10.2137 |
| 0.0229        | 3.51  | 1110 | 0.2491          | 10.2514 |
| 0.0210        | 3.61  | 1140 | 0.2478          | 10.4311 |
| 0.0212        | 3.70  | 1170 | 0.2482          | 10.4398 |
| 0.0218        | 3.80  | 1200 | 0.2474          | 10.8892 |
| 0.0225        | 3.89  | 1230 | 0.2442          | 10.3731 |
| 0.0209        | 3.99  | 1260 | 0.2439          | 10.5326 |
| 0.0105        | 4.08  | 1290 | 0.2544          | 10.2948 |
| 0.0089        | 4.18  | 1320 | 0.2614          | 10.3238 |
| 0.0080        | 4.27  | 1350 | 0.2627          | 10.0223 |
| 0.0079        | 4.37  | 1380 | 0.2652          | 10.2456 |
| 0.0083        | 4.46  | 1410 | 0.2646          | 10.1267 |
| 0.0073        | 4.56  | 1440 | 0.2619          | 10.0136 |
| 0.0075        | 4.65  | 1470 | 0.2633          | 9.7266  |
| 0.0068        | 4.75  | 1500 | 0.2648          | 9.8281  |
| 0.0074        | 4.84  | 1530 | 0.2645          | 9.8194  |
| 0.0079        | 4.94  | 1560 | 0.2647          | 9.9296  |
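Note that the final checkpoint (step 1560, WER 9.9296) is not the best one in the log: step 1470 reaches 9.7266. A sketch of picking the lowest-WER checkpoint from a subset of the rows above; in Transformers, setting `load_best_model_at_end=True` with `metric_for_best_model="wer"` in the training arguments does this automatically:

```python
# (step, validation_loss, wer) — a subset of the evaluation log above
rows = [
    (600,  0.2270, 11.4661),  # lowest validation loss in the full log
    (1470, 0.2633, 9.7266),
    (1500, 0.2648, 9.8281),
    (1530, 0.2645, 9.8194),
    (1560, 0.2647, 9.9296),   # final checkpoint
]
best_step, _, best_wer = min(rows, key=lambda r: r[2])
print(best_step, best_wer)  # 1470 9.7266 — better than the final checkpoint's 9.9296
```

The divergence between validation loss (lowest at step 600) and WER (lowest at step 1470) is common with seq2seq models, which is why selecting on the task metric rather than the loss is usually preferable.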

Framework versions

  • Transformers 4.38.0.dev0
  • PyTorch 2.1.0+cu121
  • Datasets 2.17.0
  • Tokenizers 0.15.0
Model size: 1.54B parameters (F32, safetensors)