wav2vec2-5Class-Validation-Mic

This model is a fine-tuned version of anderloh/Hugginhface-master-wav2vec-pretreined-5-class-train-test on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5967
  • Accuracy: 0.4057

Model description

More information needed

Intended uses & limitations

More information needed
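
Usage is not documented yet, but since the checkpoint is a wav2vec 2.0 model with a sequence-classification head, a minimal inference sketch along the following lines should work. The 16 kHz mono input requirement is an assumption based on standard wav2vec 2.0 pretraining, `clip.wav` is a placeholder path, and the class labels come from the model's `id2label` mapping:

```python
import torch
import torchaudio
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "anderloh/wav2vec2-5Class-Validation-Mic"
extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)
model.eval()

# Load a clip, downmix to mono, and resample to the 16 kHz rate that
# wav2vec 2.0 checkpoints typically expect ("clip.wav" is a placeholder).
waveform, sr = torchaudio.load("clip.wav")
mono = waveform.mean(dim=0)
mono = torchaudio.functional.resample(mono, orig_freq=sr, new_freq=16_000)

inputs = extractor(mono.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```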

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch in code follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 0
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 512
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 150.0
  • mixed_precision_training: Native AMP
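
For reference, the list above corresponds roughly to the following `transformers` `TrainingArguments`. This is a sketch, not the authors' published training script; `output_dir` is a placeholder, and the Adam betas and epsilon listed above are the `TrainingArguments` defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-5Class-Validation-Mic",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=0,
    gradient_accumulation_steps=4,  # 128 x 4 = total train batch size of 512
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=150.0,
    fp16=True,  # "Native AMP" mixed precision
)
```

The total train batch size of 512 assumes a single device: 128 per device times 4 gradient-accumulation steps.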

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:--------------|:-------|:-----|:----------------|:---------|
| No log        | 0.92   | 3    | 1.6032          | 0.3203   |
| No log        | 1.85   | 6    | 1.6029          | 0.3203   |
| No log        | 2.77   | 9    | 1.6024          | 0.3203   |
| No log        | 4.0    | 13   | 1.6015          | 0.3025   |
| No log        | 4.92   | 16   | 1.6005          | 0.3025   |
| No log        | 5.85   | 19   | 1.5994          | 0.2811   |
| No log        | 6.77   | 22   | 1.5981          | 0.2705   |
| No log        | 8.0    | 26   | 1.5959          | 0.2562   |
| No log        | 8.92   | 29   | 1.5941          | 0.2384   |
| No log        | 9.85   | 32   | 1.5923          | 0.2206   |
| No log        | 10.77  | 35   | 1.5902          | 0.2384   |
| No log        | 12.0   | 39   | 1.5872          | 0.2384   |
| No log        | 12.92  | 42   | 1.5848          | 0.2491   |
| No log        | 13.85  | 45   | 1.5822          | 0.2633   |
| No log        | 14.77  | 48   | 1.5797          | 0.2633   |
| No log        | 16.0   | 52   | 1.5768          | 0.2384   |
| No log        | 16.92  | 55   | 1.5747          | 0.2278   |
| No log        | 17.85  | 58   | 1.5729          | 0.2278   |
| No log        | 18.77  | 61   | 1.5713          | 0.2313   |
| No log        | 20.0   | 65   | 1.5694          | 0.2313   |
| No log        | 20.92  | 68   | 1.5681          | 0.2313   |
| No log        | 21.85  | 71   | 1.5670          | 0.2313   |
| No log        | 22.77  | 74   | 1.5666          | 0.2313   |
| No log        | 24.0   | 78   | 1.5666          | 0.2313   |
| No log        | 24.92  | 81   | 1.5672          | 0.2313   |
| No log        | 25.85  | 84   | 1.5685          | 0.2313   |
| No log        | 26.77  | 87   | 1.5707          | 0.2313   |
| No log        | 28.0   | 91   | 1.5751          | 0.2313   |
| No log        | 28.92  | 94   | 1.5796          | 0.2313   |
| No log        | 29.85  | 97   | 1.5857          | 0.2313   |
| 1.5332        | 30.77  | 100  | 1.5937          | 0.2313   |
| 1.5332        | 32.0   | 104  | 1.6070          | 0.2313   |
| 1.5332        | 32.92  | 107  | 1.6198          | 0.2313   |
| 1.5332        | 33.85  | 110  | 1.6357          | 0.2313   |
| 1.5332        | 34.77  | 113  | 1.6535          | 0.2313   |
| 1.5332        | 36.0   | 117  | 1.6803          | 0.2313   |
| 1.5332        | 36.92  | 120  | 1.7035          | 0.2313   |
| 1.5332        | 37.85  | 123  | 1.7277          | 0.2313   |
| 1.5332        | 38.77  | 126  | 1.7509          | 0.2313   |
| 1.5332        | 40.0   | 130  | 1.7757          | 0.2313   |
| 1.5332        | 40.92  | 133  | 1.7878          | 0.2313   |
| 1.5332        | 41.85  | 136  | 1.7966          | 0.2313   |
| 1.5332        | 42.77  | 139  | 1.8039          | 0.2313   |
| 1.5332        | 44.0   | 143  | 1.8047          | 0.2349   |
| 1.5332        | 44.92  | 146  | 1.8001          | 0.2491   |
| 1.5332        | 45.85  | 149  | 1.7924          | 0.2456   |
| 1.5332        | 46.77  | 152  | 1.7863          | 0.2562   |
| 1.5332        | 48.0   | 156  | 1.7770          | 0.2633   |
| 1.5332        | 48.92  | 159  | 1.7693          | 0.2705   |
| 1.5332        | 49.85  | 162  | 1.7656          | 0.2776   |
| 1.5332        | 50.77  | 165  | 1.7619          | 0.2918   |
| 1.5332        | 52.0   | 169  | 1.7609          | 0.3025   |
| 1.5332        | 52.92  | 172  | 1.7629          | 0.3060   |
| 1.5332        | 53.85  | 175  | 1.7646          | 0.3096   |
| 1.5332        | 54.77  | 178  | 1.7646          | 0.3132   |
| 1.5332        | 56.0   | 182  | 1.7650          | 0.3132   |
| 1.5332        | 56.92  | 185  | 1.7623          | 0.3238   |
| 1.5332        | 57.85  | 188  | 1.7614          | 0.3310   |
| 1.5332        | 58.77  | 191  | 1.7595          | 0.3345   |
| 1.5332        | 60.0   | 195  | 1.7589          | 0.3345   |
| 1.5332        | 60.92  | 198  | 1.7556          | 0.3381   |
| 1.2887        | 61.85  | 201  | 1.7556          | 0.3381   |
| 1.2887        | 62.77  | 204  | 1.7508          | 0.3416   |
| 1.2887        | 64.0   | 208  | 1.7468          | 0.3452   |
| 1.2887        | 64.92  | 211  | 1.7416          | 0.3452   |
| 1.2887        | 65.85  | 214  | 1.7356          | 0.3452   |
| 1.2887        | 66.77  | 217  | 1.7274          | 0.3559   |
| 1.2887        | 68.0   | 221  | 1.7196          | 0.3594   |
| 1.2887        | 68.92  | 224  | 1.7133          | 0.3630   |
| 1.2887        | 69.85  | 227  | 1.7103          | 0.3630   |
| 1.2887        | 70.77  | 230  | 1.7120          | 0.3630   |
| 1.2887        | 72.0   | 234  | 1.7099          | 0.3665   |
| 1.2887        | 72.92  | 237  | 1.7038          | 0.3701   |
| 1.2887        | 73.85  | 240  | 1.6975          | 0.3737   |
| 1.2887        | 74.77  | 243  | 1.6929          | 0.3772   |
| 1.2887        | 76.0   | 247  | 1.6884          | 0.3808   |
| 1.2887        | 76.92  | 250  | 1.6822          | 0.3879   |
| 1.2887        | 77.85  | 253  | 1.6749          | 0.3879   |
| 1.2887        | 78.77  | 256  | 1.6709          | 0.3915   |
| 1.2887        | 80.0   | 260  | 1.6645          | 0.3915   |
| 1.2887        | 80.92  | 263  | 1.6606          | 0.3915   |
| 1.2887        | 81.85  | 266  | 1.6586          | 0.3915   |
| 1.2887        | 82.77  | 269  | 1.6515          | 0.3915   |
| 1.2887        | 84.0   | 273  | 1.6471          | 0.3950   |
| 1.2887        | 84.92  | 276  | 1.6459          | 0.3950   |
| 1.2887        | 85.85  | 279  | 1.6428          | 0.3950   |
| 1.2887        | 86.77  | 282  | 1.6446          | 0.3950   |
| 1.2887        | 88.0   | 286  | 1.6454          | 0.3950   |
| 1.2887        | 88.92  | 289  | 1.6433          | 0.3950   |
| 1.2887        | 89.85  | 292  | 1.6395          | 0.3950   |
| 1.2887        | 90.77  | 295  | 1.6372          | 0.3950   |
| 1.2887        | 92.0   | 299  | 1.6350          | 0.3950   |
| 1.1159        | 92.92  | 302  | 1.6332          | 0.3986   |
| 1.1159        | 93.85  | 305  | 1.6306          | 0.3986   |
| 1.1159        | 94.77  | 308  | 1.6296          | 0.3986   |
| 1.1159        | 96.0   | 312  | 1.6273          | 0.3986   |
| 1.1159        | 96.92  | 315  | 1.6257          | 0.3986   |
| 1.1159        | 97.85  | 318  | 1.6229          | 0.4021   |
| 1.1159        | 98.77  | 321  | 1.6211          | 0.4021   |
| 1.1159        | 100.0  | 325  | 1.6199          | 0.4021   |
| 1.1159        | 100.92 | 328  | 1.6203          | 0.4021   |
| 1.1159        | 101.85 | 331  | 1.6201          | 0.4021   |
| 1.1159        | 102.77 | 334  | 1.6200          | 0.3986   |
| 1.1159        | 104.0  | 338  | 1.6153          | 0.4021   |
| 1.1159        | 104.92 | 341  | 1.6125          | 0.4057   |
| 1.1159        | 105.85 | 344  | 1.6099          | 0.4057   |
| 1.1159        | 106.77 | 347  | 1.6073          | 0.4057   |
| 1.1159        | 108.0  | 351  | 1.6028          | 0.4057   |
| 1.1159        | 108.92 | 354  | 1.6007          | 0.4057   |
| 1.1159        | 109.85 | 357  | 1.6002          | 0.4057   |
| 1.1159        | 110.77 | 360  | 1.6003          | 0.4057   |
| 1.1159        | 112.0  | 364  | 1.6025          | 0.4057   |
| 1.1159        | 112.92 | 367  | 1.6049          | 0.4021   |
| 1.1159        | 113.85 | 370  | 1.6071          | 0.4021   |
| 1.1159        | 114.77 | 373  | 1.6078          | 0.4021   |
| 1.1159        | 116.0  | 377  | 1.6086          | 0.4021   |
| 1.1159        | 116.92 | 380  | 1.6080          | 0.4021   |
| 1.1159        | 117.85 | 383  | 1.6063          | 0.4021   |
| 1.1159        | 118.77 | 386  | 1.6059          | 0.4021   |
| 1.1159        | 120.0  | 390  | 1.6057          | 0.4021   |
| 1.1159        | 120.92 | 393  | 1.6052          | 0.4021   |
| 1.1159        | 121.85 | 396  | 1.6048          | 0.4021   |
| 1.1159        | 122.77 | 399  | 1.6036          | 0.4021   |
| 1.0195        | 124.0  | 403  | 1.6036          | 0.4021   |
| 1.0195        | 124.92 | 406  | 1.6032          | 0.4021   |
| 1.0195        | 125.85 | 409  | 1.6019          | 0.4021   |
| 1.0195        | 126.77 | 412  | 1.6004          | 0.4021   |
| 1.0195        | 128.0  | 416  | 1.5979          | 0.4021   |
| 1.0195        | 128.92 | 419  | 1.5969          | 0.4021   |
| 1.0195        | 129.85 | 422  | 1.5966          | 0.4021   |
| 1.0195        | 130.77 | 425  | 1.5965          | 0.4021   |
| 1.0195        | 132.0  | 429  | 1.5959          | 0.4057   |
| 1.0195        | 132.92 | 432  | 1.5960          | 0.4057   |
| 1.0195        | 133.85 | 435  | 1.5960          | 0.4057   |
| 1.0195        | 134.77 | 438  | 1.5962          | 0.4057   |
| 1.0195        | 136.0  | 442  | 1.5966          | 0.4057   |
| 1.0195        | 136.92 | 445  | 1.5967          | 0.4057   |
| 1.0195        | 137.85 | 448  | 1.5967          | 0.4057   |
| 1.0195        | 138.46 | 450  | 1.5967          | 0.4057   |

Framework versions

  • Transformers 4.39.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2