---
library_name: transformers
language:
  - nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: Whisper Large V2
    results: []
---

# Whisper Large V2

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unspecified dataset (the model metadata lists Dutch, `nl`, as the target language). It achieves the following results on the evaluation set:

- Loss: 0.2718
- WER: 12.0890
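
The card ships without a usage snippet, so below is a minimal inference sketch using the Transformers `pipeline` API. The repo id `golesheed/whisper-large-v2-nl` and the audio file name are hypothetical placeholders, not values taken from this card.

```python
# Minimal inference sketch, not taken from the card itself. The repo id
# "golesheed/whisper-large-v2-nl" is a hypothetical placeholder; substitute
# the actual checkpoint id for this model.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="golesheed/whisper-large-v2-nl",  # hypothetical placeholder id
    device=0 if torch.cuda.is_available() else -1,
)

# Whisper checkpoints accept language/task hints through generate_kwargs;
# "dutch" matches the `nl` language tag in the card metadata.
result = asr(
    "sample.wav",  # placeholder audio file
    generate_kwargs={"language": "dutch", "task": "transcribe"},
)
print(result["text"])
```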

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
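
For readers who want to reproduce the setup, here is a sketch of how the values above would map onto `Seq2SeqTrainingArguments`, the API that `generated_from_trainer` cards are produced from. Only the listed values are grounded in the card; `output_dir` and the evaluation cadence are assumptions.

```python
# Sketch mapping the hyperparameters above onto Seq2SeqTrainingArguments.
# Only values listed in the card are grounded; output_dir and the
# eval cadence (every 15 steps, inferred from the results table) are guesses.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-large-v2-nl",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,          # Adam betas/epsilon from the card (also the defaults)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
    eval_strategy="steps",   # assumed: the results table evaluates every 15 steps
    eval_steps=15,
)
```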

### Training results

| Training Loss | Epoch  | Step | Validation Loss | WER     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.6207        | 0.0363 | 15   | 0.4288          | 32.3752 |
| 0.3918        | 0.0726 | 30   | 0.3455          | 20.7368 |
| 0.3211        | 0.1090 | 45   | 0.3369          | 20.8217 |
| 0.334         | 0.1453 | 60   | 0.3248          | 25.7734 |
| 0.2569        | 0.1816 | 75   | 0.3246          | 20.9820 |
| 0.3159        | 0.2179 | 90   | 0.3134          | 25.2413 |
| 0.3103        | 0.2542 | 105  | 0.3077          | 22.0887 |
| 0.2935        | 0.2906 | 120  | 0.3057          | 38.6207 |
| 0.2732        | 0.3269 | 135  | 0.2989          | 18.7783 |
| 0.2991        | 0.3632 | 150  | 0.2998          | 17.9793 |
| 0.2969        | 0.3995 | 165  | 0.2960          | 53.2697 |
| 0.2613        | 0.4358 | 180  | 0.2945          | 17.8562 |
| 0.2805        | 0.4722 | 195  | 0.2835          | 20.5019 |
| 0.2799        | 0.5085 | 210  | 0.2936          | 52.1934 |
| 0.2683        | 0.5448 | 225  | 0.2765          | 55.7750 |
| 0.2678        | 0.5811 | 240  | 0.2794          | 30.6948 |
| 0.2499        | 0.6174 | 255  | 0.2770          | 16.5277 |
| 0.2575        | 0.6538 | 270  | 0.2740          | 31.0770 |
| 0.2667        | 0.6901 | 285  | 0.2694          | 24.9926 |
| 0.2807        | 0.7264 | 300  | 0.2657          | 27.1496 |
| 0.2671        | 0.7627 | 315  | 0.2672          | 22.6823 |
| 0.2783        | 0.7990 | 330  | 0.2604          | 15.4904 |
| 0.225         | 0.8354 | 345  | 0.2594          | 14.7876 |
| 0.243         | 0.8717 | 360  | 0.2613          | 15.3180 |
| 0.2514        | 0.9080 | 375  | 0.2521          | 13.9288 |
| 0.25          | 0.9443 | 390  | 0.2523          | 24.4441 |
| 0.2498        | 0.9806 | 405  | 0.2496          | 13.9895 |
| 0.2133        | 1.0169 | 420  | 0.2483          | 21.0375 |
| 0.1285        | 1.0533 | 435  | 0.2591          | 15.6542 |
| 0.1388        | 1.0896 | 450  | 0.2513          | 17.0113 |
| 0.1318        | 1.1259 | 465  | 0.2523          | 14.6238 |
| 0.1187        | 1.1622 | 480  | 0.2500          | 15.2218 |
| 0.1357        | 1.1985 | 495  | 0.2490          | 15.0198 |
| 0.1225        | 1.2349 | 510  | 0.2461          | 14.6593 |
| 0.1258        | 1.2712 | 525  | 0.2466          | 16.0043 |
| 0.1089        | 1.3075 | 540  | 0.2505          | 13.6654 |
| 0.1375        | 1.3438 | 555  | 0.2467          | 14.4479 |
| 0.1251        | 1.3801 | 570  | 0.2450          | 16.3813 |
| 0.1413        | 1.4165 | 585  | 0.2465          | 14.1948 |
| 0.1286        | 1.4528 | 600  | 0.2512          | 15.9974 |
| 0.1345        | 1.4891 | 615  | 0.2416          | 16.1057 |
| 0.133         | 1.5254 | 630  | 0.2384          | 13.7381 |
| 0.132         | 1.5617 | 645  | 0.2389          | 13.6697 |
| 0.1314        | 1.5981 | 660  | 0.2382          | 13.4331 |
| 0.1509        | 1.6344 | 675  | 0.2355          | 15.0337 |
| 0.1427        | 1.6707 | 690  | 0.2399          | 19.7359 |
| 0.1105        | 1.7070 | 705  | 0.2350          | 12.7000 |
| 0.112         | 1.7433 | 720  | 0.2402          | 13.1818 |
| 0.1401        | 1.7797 | 735  | 0.2327          | 12.9339 |
| 0.1396        | 1.8160 | 750  | 0.2304          | 12.3828 |
| 0.136         | 1.8523 | 765  | 0.2287          | 13.1263 |
| 0.1231        | 1.8886 | 780  | 0.2333          | 14.8708 |
| 0.1216        | 1.9249 | 795  | 0.2297          | 16.6464 |
| 0.1174        | 1.9613 | 810  | 0.2276          | 14.5008 |
| 0.1181        | 1.9976 | 825  | 0.2332          | 13.7295 |
| 0.0624        | 2.0339 | 840  | 0.2484          | 12.7234 |
| 0.0706        | 2.0702 | 855  | 0.2373          | 19.1578 |
| 0.0642        | 2.1065 | 870  | 0.2418          | 12.6627 |
| 0.0716        | 2.1429 | 885  | 0.2425          | 13.5371 |
| 0.0525        | 2.1792 | 900  | 0.2389          | 14.8656 |
| 0.0777        | 2.2155 | 915  | 0.2339          | 14.8517 |
| 0.0608        | 2.2518 | 930  | 0.2383          | 13.0015 |
| 0.0604        | 2.2881 | 945  | 0.2356          | 13.4054 |
| 0.0662        | 2.3245 | 960  | 0.2356          | 13.6983 |
| 0.0608        | 2.3608 | 975  | 0.2393          | 17.8094 |
| 0.0653        | 2.3971 | 990  | 0.2327          | 16.9290 |
| 0.0627        | 2.4334 | 1005 | 0.2357          | 13.6038 |
| 0.062         | 2.4697 | 1020 | 0.2312          | 12.3230 |
| 0.0576        | 2.5061 | 1035 | 0.2341          | 13.1861 |
| 0.0689        | 2.5424 | 1050 | 0.2311          | 13.4201 |
| 0.055         | 2.5787 | 1065 | 0.2359          | 13.2728 |
| 0.0549        | 2.6150 | 1080 | 0.2317          | 14.2668 |
| 0.0548        | 2.6513 | 1095 | 0.2319          | 12.5076 |
| 0.0516        | 2.6877 | 1110 | 0.2363          | 13.6420 |
| 0.0528        | 2.7240 | 1125 | 0.2336          | 12.1982 |
| 0.0614        | 2.7603 | 1140 | 0.2311          | 13.2737 |
| 0.0569        | 2.7966 | 1155 | 0.2342          | 12.6601 |
| 0.0478        | 2.8329 | 1170 | 0.2297          | 13.1307 |
| 0.065         | 2.8692 | 1185 | 0.2276          | 13.2182 |
| 0.0492        | 2.9056 | 1200 | 0.2351          | 12.6402 |
| 0.0596        | 2.9419 | 1215 | 0.2274          | 11.7580 |
| 0.0647        | 2.9782 | 1230 | 0.2289          | 12.5284 |
| 0.048         | 3.0145 | 1245 | 0.2341          | 12.0916 |
| 0.0196        | 3.0508 | 1260 | 0.2496          | 13.0735 |
| 0.0274        | 3.0872 | 1275 | 0.2452          | 12.2493 |
| 0.0219        | 3.1235 | 1290 | 0.2398          | 12.6055 |
| 0.0237        | 3.1598 | 1305 | 0.2413          | 12.8872 |
| 0.027         | 3.1961 | 1320 | 0.2414          | 12.0492 |
| 0.0203        | 3.2324 | 1335 | 0.2509          | 12.3065 |
| 0.0233        | 3.2688 | 1350 | 0.2421          | 11.7536 |
| 0.0243        | 3.3051 | 1365 | 0.2425          | 11.7623 |
| 0.0178        | 3.3414 | 1380 | 0.2442          | 11.3715 |
| 0.0229        | 3.3777 | 1395 | 0.2444          | 11.8464 |
| 0.0218        | 3.4140 | 1410 | 0.2485          | 11.0933 |
| 0.0177        | 3.4504 | 1425 | 0.2452          | 11.3585 |
| 0.0211        | 3.4867 | 1440 | 0.2440          | 12.4669 |
| 0.0212        | 3.5230 | 1455 | 0.2447          | 12.4140 |
| 0.0226        | 3.5593 | 1470 | 0.2399          | 12.2875 |
| 0.0212        | 3.5956 | 1485 | 0.2436          | 12.4140 |
| 0.0221        | 3.6320 | 1500 | 0.2506          | 11.4304 |
| 0.0222        | 3.6683 | 1515 | 0.2434          | 11.1462 |
| 0.0261        | 3.7046 | 1530 | 0.2385          | 11.7268 |
| 0.0208        | 3.7409 | 1545 | 0.2447          | 12.7416 |
| 0.018         | 3.7772 | 1560 | 0.2488          | 12.2883 |
| 0.0245        | 3.8136 | 1575 | 0.2389          | 11.5231 |
| 0.0182        | 3.8499 | 1590 | 0.2415          | 14.8587 |
| 0.0245        | 3.8862 | 1605 | 0.2416          | 12.1410 |
| 0.0216        | 3.9225 | 1620 | 0.2389          | 10.9174 |
| 0.0173        | 3.9588 | 1635 | 0.2418          | 10.9044 |
| 0.0238        | 3.9952 | 1650 | 0.2427          | 11.7458 |
| 0.0109        | 4.0315 | 1665 | 0.2480          | 12.4651 |
| 0.0066        | 4.0678 | 1680 | 0.2601          | 11.1817 |
| 0.0063        | 4.1041 | 1695 | 0.2645          | 11.0508 |
| 0.007         | 4.1404 | 1710 | 0.2670          | 11.4815 |
| 0.0075        | 4.1768 | 1725 | 0.2678          | 11.7996 |
| 0.0062        | 4.2131 | 1740 | 0.2653          | 12.3273 |
| 0.0068        | 4.2494 | 1755 | 0.2656          | 13.6402 |
| 0.007         | 4.2857 | 1770 | 0.2650          | 13.8161 |
| 0.0078        | 4.3220 | 1785 | 0.2660          | 12.8785 |
| 0.007         | 4.3584 | 1800 | 0.2674          | 12.9296 |
| 0.0072        | 4.3947 | 1815 | 0.2667          | 11.5335 |
| 0.0058        | 4.4310 | 1830 | 0.2673          | 11.4235 |
| 0.0051        | 4.4673 | 1845 | 0.2673          | 11.5630 |
| 0.0067        | 4.5036 | 1860 | 0.2699          | 11.2588 |
| 0.0085        | 4.5400 | 1875 | 0.2672          | 11.1618 |
| 0.0054        | 4.5763 | 1890 | 0.2656          | 12.3143 |
| 0.0061        | 4.6126 | 1905 | 0.2667          | 11.3862 |
| 0.0052        | 4.6489 | 1920 | 0.2673          | 11.3793 |
| 0.0084        | 4.6852 | 1935 | 0.2683          | 11.2865 |
| 0.005         | 4.7215 | 1950 | 0.2693          | 11.3229 |
| 0.0053        | 4.7579 | 1965 | 0.2726          | 11.5266 |
| 0.0052        | 4.7942 | 1980 | 0.2740          | 11.6679 |
| 0.0051        | 4.8305 | 1995 | 0.2729          | 11.4573 |
| 0.0049        | 4.8668 | 2010 | 0.2724          | 11.4980 |
| 0.0058        | 4.9031 | 2025 | 0.2720          | 11.7450 |
| 0.0047        | 4.9395 | 2040 | 0.2717          | 11.9235 |
| 0.0064        | 4.9758 | 2055 | 0.2718          | 12.0890 |
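
The WER column is a word error rate reported in percent. For reference, here is a minimal sketch of how such a score is typically computed with the `evaluate` library; the Dutch example strings are illustrative only, not drawn from the evaluation set.

```python
# Minimal WER computation sketch using the `evaluate` library; the example
# sentences are illustrative placeholders, not evaluation data.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["de kat zit op de mat"]
references = ["de kat zat op de mat"]

# evaluate returns a fraction; the table above reports percentages.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # one substituted word over six -> 16.6667
```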

### Framework versions

- Transformers 4.45.0.dev0
- PyTorch 2.1.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1