---
license: apache-2.0
base_model: jonatasgrosman/wav2vec2-large-xlsr-53-english
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: wav2vec2-large-xlsr-53-english-ser-cosine
    results: []
---

# wav2vec2-large-xlsr-53-english-ser-cosine

This model is a fine-tuned version of [jonatasgrosman/wav2vec2-large-xlsr-53-english](https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-english) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.4215
- Accuracy: 0.8611
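
Given the "ser" suffix in the model name and the accuracy metric, this checkpoint appears to be an audio-classification model for speech emotion recognition. A minimal inference sketch, assuming the standard audio-classification pipeline applies (the label set is not documented in this card, so inspect the returned labels):

```python
# Minimal inference sketch. The audio-classification task is inferred from
# the "-ser-" model name; it is not stated explicitly in this card.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="FarhadMadadzade/wav2vec2-large-xlsr-53-english-ser-cosine",
)

# "speech.wav" is a placeholder path; wav2vec2 XLSR expects 16 kHz mono audio.
predictions = classifier("speech.wav")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```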

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001076429938136877
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 18
- num_epochs: 2.0
- mixed_precision_training: Native AMP
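
The hyperparameters above map onto `transformers.TrainingArguments` roughly as below. This is a sketch, not the author's actual training script; `output_dir`, `evaluation_strategy`, and `eval_steps` are assumptions (the results table logs an evaluation every 10 steps):

```python
# Sketch of TrainingArguments matching the hyperparameter list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-53-english-ser-cosine",  # hypothetical
    learning_rate=0.0001076429938136877,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,   # total train batch size: 4 * 2 = 8
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=18,
    num_train_epochs=2.0,
    fp16=True,                       # Native AMP mixed precision
    evaluation_strategy="steps",     # assumed from the eval log below
    eval_steps=10,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the
    # Trainer's default optimizer settings.
)
```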

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.781 | 0.01 | 10 | 1.8028 | 0.1545 |
| 1.7854 | 0.02 | 20 | 1.7883 | 0.1964 |
| 1.8096 | 0.02 | 30 | 1.7266 | 0.2555 |
| 1.7726 | 0.03 | 40 | 1.7654 | 0.2219 |
| 1.7558 | 0.04 | 50 | 1.6892 | 0.3180 |
| 1.7778 | 0.05 | 60 | 1.6563 | 0.3336 |
| 1.6491 | 0.06 | 70 | 1.6236 | 0.3665 |
| 1.5512 | 0.07 | 80 | 1.5289 | 0.3804 |
| 1.6337 | 0.07 | 90 | 1.4650 | 0.3977 |
| 1.4708 | 0.08 | 100 | 1.3707 | 0.4700 |
| 1.4622 | 0.09 | 110 | 1.4187 | 0.4412 |
| 1.409 | 0.1 | 120 | 1.2115 | 0.5793 |
| 1.3799 | 0.11 | 130 | 1.4589 | 0.3681 |
| 1.1948 | 0.12 | 140 | 1.2008 | 0.5563 |
| 1.1255 | 0.12 | 150 | 1.3140 | 0.5004 |
| 1.3201 | 0.13 | 160 | 1.1924 | 0.5546 |
| 1.137 | 0.14 | 170 | 0.9202 | 0.6820 |
| 0.9879 | 0.15 | 180 | 0.8952 | 0.6713 |
| 1.0591 | 0.16 | 190 | 1.1175 | 0.6261 |
| 1.0489 | 0.16 | 200 | 1.0495 | 0.6228 |
| 1.145 | 0.17 | 210 | 1.0476 | 0.6048 |
| 1.0471 | 0.18 | 220 | 1.0145 | 0.6360 |
| 1.071 | 0.19 | 230 | 0.8197 | 0.7206 |
| 1.0695 | 0.2 | 240 | 0.8922 | 0.6820 |
| 0.9588 | 0.21 | 250 | 0.9974 | 0.6270 |
| 0.9946 | 0.21 | 260 | 0.8327 | 0.7083 |
| 0.8376 | 0.22 | 270 | 0.7972 | 0.7157 |
| 0.9653 | 0.23 | 280 | 1.1024 | 0.6442 |
| 0.9783 | 0.24 | 290 | 0.9703 | 0.6746 |
| 1.1273 | 0.25 | 300 | 0.8766 | 0.6960 |
| 1.0978 | 0.25 | 310 | 0.8021 | 0.7124 |
| 0.7481 | 0.26 | 320 | 0.8639 | 0.6878 |
| 0.9392 | 0.27 | 330 | 0.7483 | 0.7346 |
| 0.8972 | 0.28 | 340 | 0.8086 | 0.7083 |
| 0.812 | 0.29 | 350 | 0.8079 | 0.7206 |
| 0.9077 | 0.3 | 360 | 1.0001 | 0.6598 |
| 0.7214 | 0.3 | 370 | 0.8035 | 0.7338 |
| 0.9227 | 0.31 | 380 | 0.9332 | 0.6910 |
| 0.7574 | 0.32 | 390 | 0.7768 | 0.7206 |
| 1.0059 | 0.33 | 400 | 0.7643 | 0.7280 |
| 0.9047 | 0.34 | 410 | 0.8035 | 0.7141 |
| 0.9737 | 0.35 | 420 | 0.7310 | 0.7395 |
| 0.732 | 0.35 | 430 | 0.8227 | 0.7165 |
| 0.9809 | 0.36 | 440 | 0.7379 | 0.7502 |
| 0.9453 | 0.37 | 450 | 0.7537 | 0.7264 |
| 0.7107 | 0.38 | 460 | 0.7420 | 0.7272 |
| 0.7221 | 0.39 | 470 | 0.8797 | 0.7075 |
| 0.7188 | 0.39 | 480 | 0.7679 | 0.7379 |
| 0.8938 | 0.4 | 490 | 0.6450 | 0.7617 |
| 0.7478 | 0.41 | 500 | 0.7466 | 0.7453 |
| 0.685 | 0.42 | 510 | 0.8612 | 0.7058 |
| 0.8602 | 0.43 | 520 | 0.6979 | 0.7568 |
| 0.6247 | 0.44 | 530 | 0.6357 | 0.7740 |
| 0.8188 | 0.44 | 540 | 0.7325 | 0.7379 |
| 0.7733 | 0.45 | 550 | 0.6679 | 0.7568 |
| 0.6555 | 0.46 | 560 | 0.6318 | 0.7707 |
| 0.7855 | 0.47 | 570 | 0.6164 | 0.7675 |
| 0.8602 | 0.48 | 580 | 0.7241 | 0.7535 |
| 0.7176 | 0.49 | 590 | 0.6710 | 0.7642 |
| 0.639 | 0.49 | 600 | 0.6418 | 0.7806 |
| 0.6366 | 0.5 | 610 | 0.7135 | 0.7601 |
| 0.559 | 0.51 | 620 | 0.7705 | 0.7329 |
| 0.8654 | 0.52 | 630 | 0.8205 | 0.7313 |
| 0.7747 | 0.53 | 640 | 0.7320 | 0.7592 |
| 0.8225 | 0.53 | 650 | 0.6535 | 0.7642 |
| 0.7234 | 0.54 | 660 | 0.6321 | 0.7666 |
| 0.7549 | 0.55 | 670 | 0.6618 | 0.7592 |
| 0.7577 | 0.56 | 680 | 0.7642 | 0.7436 |
| 0.8584 | 0.57 | 690 | 0.6483 | 0.7847 |
| 0.8096 | 0.58 | 700 | 0.7530 | 0.7576 |
| 0.5032 | 0.58 | 710 | 0.8088 | 0.7264 |
| 0.9413 | 0.59 | 720 | 0.6103 | 0.7806 |
| 0.6731 | 0.6 | 730 | 0.6943 | 0.7609 |
| 0.7774 | 0.61 | 740 | 0.5902 | 0.7938 |
| 0.556 | 0.62 | 750 | 0.5710 | 0.7929 |
| 0.609 | 0.62 | 760 | 0.6431 | 0.7757 |
| 0.7012 | 0.63 | 770 | 0.6323 | 0.7798 |
| 0.6209 | 0.64 | 780 | 0.6324 | 0.7847 |
| 0.6434 | 0.65 | 790 | 0.7455 | 0.7560 |
| 0.6942 | 0.66 | 800 | 0.7024 | 0.7650 |
| 0.6962 | 0.67 | 810 | 0.5922 | 0.7995 |
| 0.5535 | 0.67 | 820 | 0.6350 | 0.7839 |
| 0.664 | 0.68 | 830 | 0.6658 | 0.7740 |
| 0.959 | 0.69 | 840 | 0.6383 | 0.7716 |
| 0.9127 | 0.7 | 850 | 0.5963 | 0.7806 |
| 0.5255 | 0.71 | 860 | 0.6133 | 0.7724 |
| 0.7458 | 0.72 | 870 | 0.7485 | 0.7338 |
| 0.8642 | 0.72 | 880 | 0.6233 | 0.7806 |
| 0.4943 | 0.73 | 890 | 0.7533 | 0.7403 |
| 0.5681 | 0.74 | 900 | 0.6017 | 0.7929 |
| 0.5809 | 0.75 | 910 | 0.6061 | 0.7773 |
| 0.5191 | 0.76 | 920 | 0.8872 | 0.7264 |
| 0.8137 | 0.76 | 930 | 0.6620 | 0.7798 |
| 0.9125 | 0.77 | 940 | 0.5708 | 0.7987 |
| 0.6507 | 0.78 | 950 | 0.5563 | 0.7954 |
| 0.5128 | 0.79 | 960 | 0.6318 | 0.7724 |
| 0.7114 | 0.8 | 970 | 0.6168 | 0.7740 |
| 0.583 | 0.81 | 980 | 0.7461 | 0.7617 |
| 0.7679 | 0.81 | 990 | 0.6579 | 0.7913 |
| 0.8284 | 0.82 | 1000 | 0.7556 | 0.7354 |
| 0.5583 | 0.83 | 1010 | 0.6527 | 0.7691 |
| 0.5624 | 0.84 | 1020 | 0.5929 | 0.8069 |
| 0.6102 | 0.85 | 1030 | 0.6791 | 0.7847 |
| 0.5968 | 0.86 | 1040 | 0.6253 | 0.7888 |
| 0.7403 | 0.86 | 1050 | 0.6318 | 0.7888 |
| 0.486 | 0.87 | 1060 | 0.6332 | 0.7847 |
| 0.5785 | 0.88 | 1070 | 0.6594 | 0.7707 |
| 0.7037 | 0.89 | 1080 | 0.6323 | 0.7740 |
| 0.5022 | 0.9 | 1090 | 0.6067 | 0.7896 |
| 0.5631 | 0.9 | 1100 | 0.7094 | 0.7486 |
| 0.7833 | 0.91 | 1110 | 0.5938 | 0.7921 |
| 0.7214 | 0.92 | 1120 | 0.5511 | 0.7962 |
| 0.7912 | 0.93 | 1130 | 0.5588 | 0.7938 |
| 0.684 | 0.94 | 1140 | 0.5046 | 0.8102 |
| 0.7606 | 0.95 | 1150 | 0.5403 | 0.7970 |
| 0.4331 | 0.95 | 1160 | 0.5822 | 0.7872 |
| 0.4767 | 0.96 | 1170 | 0.5382 | 0.8012 |
| 0.4303 | 0.97 | 1180 | 0.4929 | 0.8258 |
| 0.6541 | 0.98 | 1190 | 0.5382 | 0.8217 |
| 0.6647 | 0.99 | 1200 | 0.5436 | 0.8143 |
| 0.6649 | 1.0 | 1210 | 0.5499 | 0.7970 |
| 0.4763 | 1.0 | 1220 | 0.5227 | 0.8151 |
| 0.4252 | 1.01 | 1230 | 0.5697 | 0.8020 |
| 0.4634 | 1.02 | 1240 | 0.5495 | 0.8127 |
| 0.4511 | 1.03 | 1250 | 0.5456 | 0.8176 |
| 0.3716 | 1.04 | 1260 | 0.5608 | 0.8192 |
| 0.5631 | 1.04 | 1270 | 0.5308 | 0.8266 |
| 0.6632 | 1.05 | 1280 | 0.5098 | 0.8332 |
| 0.3734 | 1.06 | 1290 | 0.5800 | 0.8028 |
| 0.4876 | 1.07 | 1300 | 0.5907 | 0.8028 |
| 0.4039 | 1.08 | 1310 | 0.5270 | 0.8307 |
| 0.4644 | 1.09 | 1320 | 0.5837 | 0.8176 |
| 0.5113 | 1.09 | 1330 | 0.5672 | 0.8110 |
| 0.3963 | 1.1 | 1340 | 0.5400 | 0.8110 |
| 0.2716 | 1.11 | 1350 | 0.4932 | 0.8299 |
| 0.4686 | 1.12 | 1360 | 0.5798 | 0.8184 |
| 0.5707 | 1.13 | 1370 | 0.5204 | 0.8316 |
| 0.3846 | 1.13 | 1380 | 0.5716 | 0.7995 |
| 0.4161 | 1.14 | 1390 | 0.5645 | 0.8127 |
| 0.496 | 1.15 | 1400 | 0.5593 | 0.8209 |
| 0.7214 | 1.16 | 1410 | 0.5093 | 0.8250 |
| 0.2983 | 1.17 | 1420 | 0.5079 | 0.8332 |
| 0.6799 | 1.18 | 1430 | 0.5147 | 0.8225 |
| 0.3985 | 1.18 | 1440 | 0.4981 | 0.8233 |
| 0.4708 | 1.19 | 1450 | 0.5347 | 0.8151 |
| 0.3934 | 1.2 | 1460 | 0.5083 | 0.8291 |
| 0.3551 | 1.21 | 1470 | 0.4628 | 0.8389 |
| 0.4603 | 1.22 | 1480 | 0.4809 | 0.8291 |
| 0.5154 | 1.23 | 1490 | 0.4974 | 0.8324 |
| 0.2846 | 1.23 | 1500 | 0.4795 | 0.8340 |
| 0.3706 | 1.24 | 1510 | 0.5729 | 0.8118 |
| 0.5806 | 1.25 | 1520 | 0.5514 | 0.8135 |
| 0.498 | 1.26 | 1530 | 0.4836 | 0.8258 |
| 0.5659 | 1.27 | 1540 | 0.4983 | 0.8324 |
| 0.4405 | 1.27 | 1550 | 0.4946 | 0.8299 |
| 0.2045 | 1.28 | 1560 | 0.5115 | 0.8274 |
| 0.2957 | 1.29 | 1570 | 0.5368 | 0.8324 |
| 0.4804 | 1.3 | 1580 | 0.5001 | 0.8414 |
| 0.4293 | 1.31 | 1590 | 0.5220 | 0.8283 |
| 0.5119 | 1.32 | 1600 | 0.5667 | 0.8176 |
| 0.3852 | 1.32 | 1610 | 0.5936 | 0.8135 |
| 0.5869 | 1.33 | 1620 | 0.5186 | 0.8266 |
| 0.6688 | 1.34 | 1630 | 0.5559 | 0.8168 |
| 0.3449 | 1.35 | 1640 | 0.5264 | 0.8307 |
| 0.387 | 1.36 | 1650 | 0.4626 | 0.8406 |
| 0.2209 | 1.37 | 1660 | 0.4919 | 0.8283 |
| 0.4323 | 1.37 | 1670 | 0.4556 | 0.8463 |
| 0.4921 | 1.38 | 1680 | 0.4789 | 0.8414 |
| 0.3967 | 1.39 | 1690 | 0.4715 | 0.8381 |
| 0.3463 | 1.4 | 1700 | 0.5216 | 0.8209 |
| 0.3331 | 1.41 | 1710 | 0.5219 | 0.8168 |
| 0.3396 | 1.41 | 1720 | 0.5132 | 0.8316 |
| 0.1933 | 1.42 | 1730 | 0.4971 | 0.8496 |
| 0.664 | 1.43 | 1740 | 0.4971 | 0.8439 |
| 0.199 | 1.44 | 1750 | 0.5231 | 0.8291 |
| 0.5669 | 1.45 | 1760 | 0.4746 | 0.8414 |
| 0.3281 | 1.46 | 1770 | 0.4511 | 0.8447 |
| 0.4047 | 1.46 | 1780 | 0.4461 | 0.8505 |
| 0.5053 | 1.47 | 1790 | 0.4577 | 0.8463 |
| 0.3584 | 1.48 | 1800 | 0.4407 | 0.8505 |
| 0.4758 | 1.49 | 1810 | 0.4477 | 0.8488 |
| 0.3109 | 1.5 | 1820 | 0.4337 | 0.8422 |
| 0.3695 | 1.5 | 1830 | 0.4365 | 0.8406 |
| 0.6448 | 1.51 | 1840 | 0.4425 | 0.8373 |
| 0.5054 | 1.52 | 1850 | 0.4501 | 0.8291 |
| 0.3598 | 1.53 | 1860 | 0.4353 | 0.8365 |
| 0.3951 | 1.54 | 1870 | 0.4303 | 0.8455 |
| 0.4661 | 1.55 | 1880 | 0.4463 | 0.8431 |
| 0.425 | 1.55 | 1890 | 0.4427 | 0.8463 |
| 0.3758 | 1.56 | 1900 | 0.4388 | 0.8463 |
| 0.2416 | 1.57 | 1910 | 0.4330 | 0.8447 |
| 0.5203 | 1.58 | 1920 | 0.4369 | 0.8472 |
| 0.3258 | 1.59 | 1930 | 0.4340 | 0.8472 |
| 0.2536 | 1.6 | 1940 | 0.4303 | 0.8513 |
| 0.4079 | 1.6 | 1950 | 0.4336 | 0.8488 |
| 0.5333 | 1.61 | 1960 | 0.4404 | 0.8496 |
| 0.3799 | 1.62 | 1970 | 0.4395 | 0.8496 |
| 0.3737 | 1.63 | 1980 | 0.4426 | 0.8472 |
| 0.2297 | 1.64 | 1990 | 0.4382 | 0.8546 |
| 0.534 | 1.64 | 2000 | 0.4312 | 0.8554 |
| 0.4614 | 1.65 | 2010 | 0.4293 | 0.8521 |
| 0.3417 | 1.66 | 2020 | 0.4249 | 0.8513 |
| 0.3067 | 1.67 | 2030 | 0.4265 | 0.8537 |
| 0.2893 | 1.68 | 2040 | 0.4259 | 0.8570 |
| 0.4271 | 1.69 | 2050 | 0.4257 | 0.8611 |
| 0.4641 | 1.69 | 2060 | 0.4192 | 0.8587 |
| 0.3295 | 1.7 | 2070 | 0.4180 | 0.8578 |
| 0.4281 | 1.71 | 2080 | 0.4157 | 0.8595 |
| 0.3737 | 1.72 | 2090 | 0.4149 | 0.8554 |
| 0.3418 | 1.73 | 2100 | 0.4207 | 0.8537 |
| 0.4394 | 1.74 | 2110 | 0.4257 | 0.8554 |
| 0.2029 | 1.74 | 2120 | 0.4287 | 0.8554 |
| 0.3565 | 1.75 | 2130 | 0.4320 | 0.8537 |
| 0.1855 | 1.76 | 2140 | 0.4277 | 0.8570 |
| 0.3926 | 1.77 | 2150 | 0.4260 | 0.8546 |
| 0.3638 | 1.78 | 2160 | 0.4241 | 0.8587 |
| 0.2738 | 1.78 | 2170 | 0.4229 | 0.8587 |
| 0.3652 | 1.79 | 2180 | 0.4212 | 0.8587 |
| 0.3871 | 1.8 | 2190 | 0.4215 | 0.8595 |
| 0.2194 | 1.81 | 2200 | 0.4229 | 0.8587 |
| 0.2666 | 1.82 | 2210 | 0.4246 | 0.8570 |
| 0.2705 | 1.83 | 2220 | 0.4251 | 0.8595 |
| 0.3726 | 1.83 | 2230 | 0.4257 | 0.8603 |
| 0.4248 | 1.84 | 2240 | 0.4253 | 0.8611 |
| 0.4365 | 1.85 | 2250 | 0.4239 | 0.8595 |
| 0.2785 | 1.86 | 2260 | 0.4230 | 0.8603 |
| 0.3871 | 1.87 | 2270 | 0.4226 | 0.8611 |
| 0.351 | 1.88 | 2280 | 0.4228 | 0.8603 |
| 0.4279 | 1.88 | 2290 | 0.4226 | 0.8603 |
| 0.4634 | 1.89 | 2300 | 0.4219 | 0.8595 |
| 0.3642 | 1.9 | 2310 | 0.4212 | 0.8603 |
| 0.4088 | 1.91 | 2320 | 0.4211 | 0.8595 |
| 0.5017 | 1.92 | 2330 | 0.4210 | 0.8603 |
| 0.3176 | 1.92 | 2340 | 0.4212 | 0.8603 |
| 0.2854 | 1.93 | 2350 | 0.4214 | 0.8603 |
| 0.3411 | 1.94 | 2360 | 0.4215 | 0.8603 |
| 0.418 | 1.95 | 2370 | 0.4216 | 0.8603 |
| 0.2308 | 1.96 | 2380 | 0.4216 | 0.8611 |
| 0.4071 | 1.97 | 2390 | 0.4216 | 0.8611 |
| 0.3283 | 1.97 | 2400 | 0.4215 | 0.8611 |
| 0.3909 | 1.98 | 2410 | 0.4215 | 0.8611 |
| 0.3095 | 1.99 | 2420 | 0.4215 | 0.8611 |
| 0.3992 | 2.0 | 2430 | 0.4215 | 0.8611 |
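
The accuracy column was presumably produced by a `compute_metrics` callback passed to the `Trainer`. A minimal sketch, assuming the `evaluate` library's accuracy metric (the card does not document how accuracy was actually computed):

```python
# Hypothetical compute_metrics callback that could yield the accuracy
# column above; not confirmed by this model card.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # predicted class per clip
    return accuracy.compute(predictions=predictions, references=labels)
```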

### Framework versions

- Transformers 4.40.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.18.1.dev0
- Tokenizers 0.15.2