
wav2vec2-5Class-Validation-Mobil

This model is a fine-tuned version of anderloh/Hugginhface-master-wav2vec-pretreined-5-class-train-test on the anderloh/ValidateRes dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2514
  • Accuracy: 0.5836

Model description

More information needed

Intended uses & limitations

More information needed
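Since the card does not document usage, the snippet below is a minimal inference sketch. It assumes the checkpoint exposes an audio-classification head (as the 5-class fine-tuning suggests) and expects 16 kHz mono audio, the standard wav2vec2 input; the file path is a placeholder.

```python
# Minimal sketch, assuming the checkpoint carries an audio-classification head
# and expects 16 kHz mono audio (standard for wav2vec2 models).
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="anderloh/wav2vec2-5Class-Validation-Mobil",
)

# "sample.wav" is a placeholder; pass any 16 kHz mono audio file.
predictions = classifier("sample.wav")
print(predictions)  # list of {"label": ..., "score": ...} entries, one per class
```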

Training and evaluation data

More information needed
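The training data is likewise undocumented; a quick way to inspect the named dataset is sketched below. The available splits and column layout of anderloh/ValidateRes are assumptions to verify, not facts stated in this card.

```python
# Sketch for inspecting the dataset named above; its split names and columns
# are not documented in this card.
from datasets import load_dataset

ds = load_dataset("anderloh/ValidateRes")
print(ds)  # shows the splits actually available and their feature columns
```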

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after the list):

  • learning_rate: 3e-05
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 0
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 512
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 300.0
  • mixed_precision_training: Native AMP
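As a rough reconstruction (the original training script is not published here), these values map onto a `transformers` `TrainingArguments` as sketched below; note that the total train batch size of 512 is simply 128 × 4 gradient-accumulation steps on a single device.

```python
# Hedged reconstruction of the hyperparameters above; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-5Class-Validation-Mobil",
    learning_rate=3e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=0,
    gradient_accumulation_steps=4,  # 128 * 4 = 512 total train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=300.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```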

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| No log        | 0.92   | 3    | 1.6024          | 0.3203   |
| No log        | 1.85   | 6    | 1.6022          | 0.3167   |
| No log        | 2.77   | 9    | 1.6020          | 0.3167   |
| No log        | 4.0    | 13   | 1.6015          | 0.3167   |
| No log        | 4.92   | 16   | 1.6009          | 0.3167   |
| No log        | 5.85   | 19   | 1.6003          | 0.3132   |
| No log        | 6.77   | 22   | 1.5996          | 0.3060   |
| No log        | 8.0    | 26   | 1.5984          | 0.2989   |
| No log        | 8.92   | 29   | 1.5974          | 0.2918   |
| No log        | 9.85   | 32   | 1.5964          | 0.2740   |
| No log        | 10.77  | 35   | 1.5952          | 0.2598   |
| No log        | 12.0   | 39   | 1.5934          | 0.2633   |
| No log        | 12.92  | 42   | 1.5920          | 0.2740   |
| No log        | 13.85  | 45   | 1.5905          | 0.2989   |
| No log        | 14.77  | 48   | 1.5889          | 0.2989   |
| No log        | 16.0   | 52   | 1.5868          | 0.2847   |
| No log        | 16.92  | 55   | 1.5851          | 0.2847   |
| No log        | 17.85  | 58   | 1.5833          | 0.2847   |
| No log        | 18.77  | 61   | 1.5816          | 0.2633   |
| No log        | 20.0   | 65   | 1.5790          | 0.2456   |
| No log        | 20.92  | 68   | 1.5770          | 0.2420   |
| No log        | 21.85  | 71   | 1.5748          | 0.2349   |
| No log        | 22.77  | 74   | 1.5728          | 0.2313   |
| No log        | 24.0   | 78   | 1.5699          | 0.2278   |
| No log        | 24.92  | 81   | 1.5678          | 0.2313   |
| No log        | 25.85  | 84   | 1.5657          | 0.2313   |
| No log        | 26.77  | 87   | 1.5638          | 0.2313   |
| No log        | 28.0   | 91   | 1.5613          | 0.2313   |
| No log        | 28.92  | 94   | 1.5597          | 0.2313   |
| No log        | 29.85  | 97   | 1.5588          | 0.2313   |
| 1.561         | 30.77  | 100  | 1.5586          | 0.2313   |
| 1.561         | 32.0   | 104  | 1.5597          | 0.2313   |
| 1.561         | 32.92  | 107  | 1.5619          | 0.2313   |
| 1.561         | 33.85  | 110  | 1.5661          | 0.2313   |
| 1.561         | 34.77  | 113  | 1.5720          | 0.2313   |
| 1.561         | 36.0   | 117  | 1.5833          | 0.2313   |
| 1.561         | 36.92  | 120  | 1.5957          | 0.2313   |
| 1.561         | 37.85  | 123  | 1.6120          | 0.2313   |
| 1.561         | 38.77  | 126  | 1.6318          | 0.2313   |
| 1.561         | 40.0   | 130  | 1.6638          | 0.2313   |
| 1.561         | 40.92  | 133  | 1.6905          | 0.2313   |
| 1.561         | 41.85  | 136  | 1.7197          | 0.2313   |
| 1.561         | 42.77  | 139  | 1.7503          | 0.2313   |
| 1.561         | 44.0   | 143  | 1.7803          | 0.2313   |
| 1.561         | 44.92  | 146  | 1.7917          | 0.2313   |
| 1.561         | 45.85  | 149  | 1.7920          | 0.2313   |
| 1.561         | 46.77  | 152  | 1.7869          | 0.2313   |
| 1.561         | 48.0   | 156  | 1.7700          | 0.2598   |
| 1.561         | 48.92  | 159  | 1.7525          | 0.2740   |
| 1.561         | 49.85  | 162  | 1.7407          | 0.2776   |
| 1.561         | 50.77  | 165  | 1.7307          | 0.2918   |
| 1.561         | 52.0   | 169  | 1.7241          | 0.3096   |
| 1.561         | 52.92  | 172  | 1.7243          | 0.3167   |
| 1.561         | 53.85  | 175  | 1.7254          | 0.3167   |
| 1.561         | 54.77  | 178  | 1.7233          | 0.3238   |
| 1.561         | 56.0   | 182  | 1.7225          | 0.3238   |
| 1.561         | 56.92  | 185  | 1.7187          | 0.3274   |
| 1.561         | 57.85  | 188  | 1.7172          | 0.3274   |
| 1.561         | 58.77  | 191  | 1.7146          | 0.3345   |
| 1.561         | 60.0   | 195  | 1.7120          | 0.3488   |
| 1.561         | 60.92  | 198  | 1.7049          | 0.3559   |
| 1.3094        | 61.85  | 201  | 1.7022          | 0.3594   |
| 1.3094        | 62.77  | 204  | 1.6912          | 0.3737   |
| 1.3094        | 64.0   | 208  | 1.6798          | 0.3772   |
| 1.3094        | 64.92  | 211  | 1.6687          | 0.3808   |
| 1.3094        | 65.85  | 214  | 1.6569          | 0.3843   |
| 1.3094        | 66.77  | 217  | 1.6427          | 0.3915   |
| 1.3094        | 68.0   | 221  | 1.6301          | 0.3915   |
| 1.3094        | 68.92  | 224  | 1.6217          | 0.3950   |
| 1.3094        | 69.85  | 227  | 1.6203          | 0.3950   |
| 1.3094        | 70.77  | 230  | 1.6257          | 0.3950   |
| 1.3094        | 72.0   | 234  | 1.6192          | 0.4021   |
| 1.3094        | 72.92  | 237  | 1.6044          | 0.4093   |
| 1.3094        | 73.85  | 240  | 1.5868          | 0.4306   |
| 1.3094        | 74.77  | 243  | 1.5787          | 0.4377   |
| 1.3094        | 76.0   | 247  | 1.5762          | 0.4342   |
| 1.3094        | 76.92  | 250  | 1.5717          | 0.4377   |
| 1.3094        | 77.85  | 253  | 1.5674          | 0.4342   |
| 1.3094        | 78.77  | 256  | 1.5684          | 0.4270   |
| 1.3094        | 80.0   | 260  | 1.5619          | 0.4270   |
| 1.3094        | 80.92  | 263  | 1.5555          | 0.4306   |
| 1.3094        | 81.85  | 266  | 1.5505          | 0.4342   |
| 1.3094        | 82.77  | 269  | 1.5386          | 0.4413   |
| 1.3094        | 84.0   | 273  | 1.5362          | 0.4377   |
| 1.3094        | 84.92  | 276  | 1.5411          | 0.4342   |
| 1.3094        | 85.85  | 279  | 1.5453          | 0.4342   |
| 1.3094        | 86.77  | 282  | 1.5611          | 0.4270   |
| 1.3094        | 88.0   | 286  | 1.5766          | 0.4199   |
| 1.3094        | 88.92  | 289  | 1.5781          | 0.4199   |
| 1.3094        | 89.85  | 292  | 1.5675          | 0.4235   |
| 1.3094        | 90.77  | 295  | 1.5588          | 0.4270   |
| 1.3094        | 92.0   | 299  | 1.5496          | 0.4270   |
| 1.0538        | 92.92  | 302  | 1.5493          | 0.4270   |
| 1.0538        | 93.85  | 305  | 1.5540          | 0.4235   |
| 1.0538        | 94.77  | 308  | 1.5620          | 0.4164   |
| 1.0538        | 96.0   | 312  | 1.5648          | 0.4164   |
| 1.0538        | 96.92  | 315  | 1.5617          | 0.4164   |
| 1.0538        | 97.85  | 318  | 1.5461          | 0.4235   |
| 1.0538        | 98.77  | 321  | 1.5348          | 0.4306   |
| 1.0538        | 100.0  | 325  | 1.5346          | 0.4306   |
| 1.0538        | 100.92 | 328  | 1.5466          | 0.4164   |
| 1.0538        | 101.85 | 331  | 1.5547          | 0.4128   |
| 1.0538        | 102.77 | 334  | 1.5560          | 0.4128   |
| 1.0538        | 104.0  | 338  | 1.5315          | 0.4306   |
| 1.0538        | 104.92 | 341  | 1.5124          | 0.4448   |
| 1.0538        | 105.85 | 344  | 1.5044          | 0.4448   |
| 1.0538        | 106.77 | 347  | 1.5010          | 0.4484   |
| 1.0538        | 108.0  | 351  | 1.5005          | 0.4448   |
| 1.0538        | 108.92 | 354  | 1.4992          | 0.4448   |
| 1.0538        | 109.85 | 357  | 1.4994          | 0.4484   |
| 1.0538        | 110.77 | 360  | 1.4988          | 0.4520   |
| 1.0538        | 112.0  | 364  | 1.5005          | 0.4662   |
| 1.0538        | 112.92 | 367  | 1.5010          | 0.4733   |
| 1.0538        | 113.85 | 370  | 1.4969          | 0.4698   |
| 1.0538        | 114.77 | 373  | 1.4776          | 0.4733   |
| 1.0538        | 116.0  | 377  | 1.4528          | 0.4769   |
| 1.0538        | 116.92 | 380  | 1.4395          | 0.4947   |
| 1.0538        | 117.85 | 383  | 1.4310          | 0.4982   |
| 1.0538        | 118.77 | 386  | 1.4315          | 0.4947   |
| 1.0538        | 120.0  | 390  | 1.4389          | 0.4947   |
| 1.0538        | 120.92 | 393  | 1.4375          | 0.4982   |
| 1.0538        | 121.85 | 396  | 1.4381          | 0.4982   |
| 1.0538        | 122.77 | 399  | 1.4247          | 0.4982   |
| 0.8509        | 124.0  | 403  | 1.4196          | 0.4982   |
| 0.8509        | 124.92 | 406  | 1.4179          | 0.5053   |
| 0.8509        | 125.85 | 409  | 1.4091          | 0.5053   |
| 0.8509        | 126.77 | 412  | 1.3958          | 0.5053   |
| 0.8509        | 128.0  | 416  | 1.3736          | 0.5089   |
| 0.8509        | 128.92 | 419  | 1.3661          | 0.5089   |
| 0.8509        | 129.85 | 422  | 1.3694          | 0.5125   |
| 0.8509        | 130.77 | 425  | 1.3808          | 0.5125   |
| 0.8509        | 132.0  | 429  | 1.3819          | 0.5125   |
| 0.8509        | 132.92 | 432  | 1.3859          | 0.5125   |
| 0.8509        | 133.85 | 435  | 1.3780          | 0.5231   |
| 0.8509        | 134.77 | 438  | 1.3696          | 0.5231   |
| 0.8509        | 136.0  | 442  | 1.3564          | 0.5302   |
| 0.8509        | 136.92 | 445  | 1.3421          | 0.5338   |
| 0.8509        | 137.85 | 448  | 1.3256          | 0.5374   |
| 0.8509        | 138.77 | 451  | 1.3274          | 0.5374   |
| 0.8509        | 140.0  | 455  | 1.3402          | 0.5409   |
| 0.8509        | 140.92 | 458  | 1.3517          | 0.5409   |
| 0.8509        | 141.85 | 461  | 1.3585          | 0.5409   |
| 0.8509        | 142.77 | 464  | 1.3592          | 0.5374   |
| 0.8509        | 144.0  | 468  | 1.3329          | 0.5480   |
| 0.8509        | 144.92 | 471  | 1.3126          | 0.5480   |
| 0.8509        | 145.85 | 474  | 1.3076          | 0.5445   |
| 0.8509        | 146.77 | 477  | 1.3146          | 0.5480   |
| 0.8509        | 148.0  | 481  | 1.3345          | 0.5445   |
| 0.8509        | 148.92 | 484  | 1.3409          | 0.5445   |
| 0.8509        | 149.85 | 487  | 1.3374          | 0.5445   |
| 0.8509        | 150.77 | 490  | 1.3227          | 0.5480   |
| 0.8509        | 152.0  | 494  | 1.3201          | 0.5445   |
| 0.8509        | 152.92 | 497  | 1.3174          | 0.5445   |
| 0.7118        | 153.85 | 500  | 1.3073          | 0.5445   |
| 0.7118        | 154.77 | 503  | 1.2984          | 0.5552   |
| 0.7118        | 156.0  | 507  | 1.2975          | 0.5516   |
| 0.7118        | 156.92 | 510  | 1.3027          | 0.5516   |
| 0.7118        | 157.85 | 513  | 1.3089          | 0.5480   |
| 0.7118        | 158.77 | 516  | 1.3139          | 0.5480   |
| 0.7118        | 160.0  | 520  | 1.3068          | 0.5552   |
| 0.7118        | 160.92 | 523  | 1.3011          | 0.5552   |
| 0.7118        | 161.85 | 526  | 1.2957          | 0.5552   |
| 0.7118        | 162.77 | 529  | 1.2960          | 0.5552   |
| 0.7118        | 164.0  | 533  | 1.3159          | 0.5516   |
| 0.7118        | 164.92 | 536  | 1.3257          | 0.5516   |
| 0.7118        | 165.85 | 539  | 1.3312          | 0.5516   |
| 0.7118        | 166.77 | 542  | 1.3222          | 0.5516   |
| 0.7118        | 168.0  | 546  | 1.2986          | 0.5552   |
| 0.7118        | 168.92 | 549  | 1.2898          | 0.5587   |
| 0.7118        | 169.85 | 552  | 1.2938          | 0.5552   |
| 0.7118        | 170.77 | 555  | 1.2902          | 0.5552   |
| 0.7118        | 172.0  | 559  | 1.2879          | 0.5658   |
| 0.7118        | 172.92 | 562  | 1.2838          | 0.5658   |
| 0.7118        | 173.85 | 565  | 1.2812          | 0.5658   |
| 0.7118        | 174.77 | 568  | 1.2864          | 0.5658   |
| 0.7118        | 176.0  | 572  | 1.2934          | 0.5552   |
| 0.7118        | 176.92 | 575  | 1.2940          | 0.5587   |
| 0.7118        | 177.85 | 578  | 1.2988          | 0.5587   |
| 0.7118        | 178.77 | 581  | 1.2953          | 0.5623   |
| 0.7118        | 180.0  | 585  | 1.2972          | 0.5587   |
| 0.7118        | 180.92 | 588  | 1.2936          | 0.5658   |
| 0.7118        | 181.85 | 591  | 1.2928          | 0.5658   |
| 0.7118        | 182.77 | 594  | 1.2913          | 0.5658   |
| 0.7118        | 184.0  | 598  | 1.2825          | 0.5658   |
| 0.6473        | 184.92 | 601  | 1.2736          | 0.5694   |
| 0.6473        | 185.85 | 604  | 1.2715          | 0.5694   |
| 0.6473        | 186.77 | 607  | 1.2704          | 0.5694   |
| 0.6473        | 188.0  | 611  | 1.2717          | 0.5694   |
| 0.6473        | 188.92 | 614  | 1.2724          | 0.5658   |
| 0.6473        | 189.85 | 617  | 1.2763          | 0.5658   |
| 0.6473        | 190.77 | 620  | 1.2812          | 0.5658   |
| 0.6473        | 192.0  | 624  | 1.2791          | 0.5658   |
| 0.6473        | 192.92 | 627  | 1.2698          | 0.5694   |
| 0.6473        | 193.85 | 630  | 1.2695          | 0.5694   |
| 0.6473        | 194.77 | 633  | 1.2704          | 0.5694   |
| 0.6473        | 196.0  | 637  | 1.2737          | 0.5658   |
| 0.6473        | 196.92 | 640  | 1.2782          | 0.5658   |
| 0.6473        | 197.85 | 643  | 1.2814          | 0.5623   |
| 0.6473        | 198.77 | 646  | 1.2819          | 0.5623   |
| 0.6473        | 200.0  | 650  | 1.2746          | 0.5658   |
| 0.6473        | 200.92 | 653  | 1.2694          | 0.5658   |
| 0.6473        | 201.85 | 656  | 1.2625          | 0.5765   |
| 0.6473        | 202.77 | 659  | 1.2575          | 0.5801   |
| 0.6473        | 204.0  | 663  | 1.2549          | 0.5801   |
| 0.6473        | 204.92 | 666  | 1.2623          | 0.5730   |
| 0.6473        | 205.85 | 669  | 1.2665          | 0.5658   |
| 0.6473        | 206.77 | 672  | 1.2684          | 0.5658   |
| 0.6473        | 208.0  | 676  | 1.2770          | 0.5623   |
| 0.6473        | 208.92 | 679  | 1.2808          | 0.5623   |
| 0.6473        | 209.85 | 682  | 1.2762          | 0.5730   |
| 0.6473        | 210.77 | 685  | 1.2759          | 0.5730   |
| 0.6473        | 212.0  | 689  | 1.2752          | 0.5730   |
| 0.6473        | 212.92 | 692  | 1.2754          | 0.5730   |
| 0.6473        | 213.85 | 695  | 1.2722          | 0.5765   |
| 0.6473        | 214.77 | 698  | 1.2739          | 0.5765   |
| 0.613         | 216.0  | 702  | 1.2783          | 0.5765   |
| 0.613         | 216.92 | 705  | 1.2775          | 0.5765   |
| 0.613         | 217.85 | 708  | 1.2741          | 0.5765   |
| 0.613         | 218.77 | 711  | 1.2706          | 0.5765   |
| 0.613         | 220.0  | 715  | 1.2628          | 0.5765   |
| 0.613         | 220.92 | 718  | 1.2581          | 0.5801   |
| 0.613         | 221.85 | 721  | 1.2568          | 0.5765   |
| 0.613         | 222.77 | 724  | 1.2559          | 0.5730   |
| 0.613         | 224.0  | 728  | 1.2503          | 0.5765   |
| 0.613         | 224.92 | 731  | 1.2498          | 0.5765   |
| 0.613         | 225.85 | 734  | 1.2500          | 0.5765   |
| 0.613         | 226.77 | 737  | 1.2490          | 0.5765   |
| 0.613         | 228.0  | 741  | 1.2532          | 0.5765   |
| 0.613         | 228.92 | 744  | 1.2572          | 0.5765   |
| 0.613         | 229.85 | 747  | 1.2599          | 0.5765   |
| 0.613         | 230.77 | 750  | 1.2601          | 0.5730   |
| 0.613         | 232.0  | 754  | 1.2625          | 0.5730   |
| 0.613         | 232.92 | 757  | 1.2636          | 0.5765   |
| 0.613         | 233.85 | 760  | 1.2629          | 0.5765   |
| 0.613         | 234.77 | 763  | 1.2600          | 0.5765   |
| 0.613         | 236.0  | 767  | 1.2559          | 0.5801   |
| 0.613         | 236.92 | 770  | 1.2534          | 0.5801   |
| 0.613         | 237.85 | 773  | 1.2514          | 0.5836   |
| 0.613         | 238.77 | 776  | 1.2508          | 0.5836   |
| 0.613         | 240.0  | 780  | 1.2488          | 0.5836   |
| 0.613         | 240.92 | 783  | 1.2483          | 0.5836   |
| 0.613         | 241.85 | 786  | 1.2500          | 0.5836   |
| 0.613         | 242.77 | 789  | 1.2504          | 0.5801   |
| 0.613         | 244.0  | 793  | 1.2521          | 0.5801   |
| 0.613         | 244.92 | 796  | 1.2533          | 0.5801   |
| 0.613         | 245.85 | 799  | 1.2513          | 0.5801   |
| 0.5946        | 246.77 | 802  | 1.2513          | 0.5801   |
| 0.5946        | 248.0  | 806  | 1.2507          | 0.5801   |
| 0.5946        | 248.92 | 809  | 1.2492          | 0.5836   |
| 0.5946        | 249.85 | 812  | 1.2500          | 0.5801   |
| 0.5946        | 250.77 | 815  | 1.2505          | 0.5801   |
| 0.5946        | 252.0  | 819  | 1.2519          | 0.5801   |
| 0.5946        | 252.92 | 822  | 1.2531          | 0.5801   |
| 0.5946        | 253.85 | 825  | 1.2538          | 0.5801   |
| 0.5946        | 254.77 | 828  | 1.2532          | 0.5801   |
| 0.5946        | 256.0  | 832  | 1.2528          | 0.5801   |
| 0.5946        | 256.92 | 835  | 1.2528          | 0.5801   |
| 0.5946        | 257.85 | 838  | 1.2521          | 0.5836   |
| 0.5946        | 258.77 | 841  | 1.2526          | 0.5836   |
| 0.5946        | 260.0  | 845  | 1.2528          | 0.5836   |
| 0.5946        | 260.92 | 848  | 1.2529          | 0.5836   |
| 0.5946        | 261.85 | 851  | 1.2528          | 0.5836   |
| 0.5946        | 262.77 | 854  | 1.2517          | 0.5836   |
| 0.5946        | 264.0  | 858  | 1.2512          | 0.5836   |
| 0.5946        | 264.92 | 861  | 1.2512          | 0.5836   |
| 0.5946        | 265.85 | 864  | 1.2504          | 0.5836   |
| 0.5946        | 266.77 | 867  | 1.2499          | 0.5836   |
| 0.5946        | 268.0  | 871  | 1.2496          | 0.5836   |
| 0.5946        | 268.92 | 874  | 1.2497          | 0.5836   |
| 0.5946        | 269.85 | 877  | 1.2500          | 0.5836   |
| 0.5946        | 270.77 | 880  | 1.2500          | 0.5836   |
| 0.5946        | 272.0  | 884  | 1.2499          | 0.5836   |
| 0.5946        | 272.92 | 887  | 1.2501          | 0.5836   |
| 0.5946        | 273.85 | 890  | 1.2504          | 0.5836   |
| 0.5946        | 274.77 | 893  | 1.2506          | 0.5836   |
| 0.5946        | 276.0  | 897  | 1.2506          | 0.5836   |
| 0.588         | 276.92 | 900  | 1.2506          | 0.5836   |

Framework versions

  • Transformers 4.39.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2