
ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v6-1

This model is a fine-tuned version of gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v6-1 on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-MDB-ENST2 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5661
  • Wer: 0.3416

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 4
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 10
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
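For anyone reproducing the run, note that the listed total_train_batch_size is derived, not set directly. A minimal sketch of that relationship, with the hyperparameters as a plain dict (the original run used the transformers Trainer; the dict keys here simply mirror the bullet list above and are not taken from the actual training script):

```python
# Hyperparameters copied from the card above.
hparams = {
    "learning_rate": 1e-4,
    "train_batch_size": 2,   # per-device batch size
    "eval_batch_size": 2,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "lr_scheduler_warmup_steps": 10,
    "num_epochs": 100.0,
}

# "total_train_batch_size: 4" = per-device batch size
# x gradient accumulation steps (x number of devices, here 1).
total_train_batch_size = (
    hparams["train_batch_size"] * hparams["gradient_accumulation_steps"]
)
print(total_train_batch_size)  # 4
```

Gradient accumulation trades memory for wall-clock time: gradients from 2 micro-batches of 2 are summed before each optimizer step, giving the effective batch size of 4.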

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.2764        | 1.0   | 141   | 0.7125          | 0.3619 |
| 0.5415        | 2.0   | 282   | 0.7252          | 0.3682 |
| 0.3324        | 3.0   | 423   | 0.6779          | 0.3728 |
| 0.4244        | 4.0   | 564   | 0.7403          | 0.3737 |
| 0.5234        | 5.0   | 705   | 0.8086          | 0.3534 |
| 0.3339        | 6.0   | 846   | 0.7187          | 0.3619 |
| 0.5016        | 7.0   | 987   | 0.8582          | 0.3602 |
| 0.3376        | 8.0   | 1128  | 0.8801          | 0.3674 |
| 0.3507        | 9.0   | 1269  | 0.8524          | 0.3560 |
| 0.4844        | 10.0  | 1410  | 0.7152          | 0.3648 |
| 0.4282        | 11.0  | 1551  | 0.6719          | 0.3475 |
| 0.4398        | 12.0  | 1692  | 0.7130          | 0.3686 |
| 0.331         | 13.0  | 1833  | 0.6425          | 0.3627 |
| 0.4488        | 14.0  | 1974  | 0.6483          | 0.3648 |
| 0.3876        | 15.0  | 2115  | 0.6375          | 0.3509 |
| 0.3361        | 16.0  | 2256  | 0.6791          | 0.3703 |
| 0.344         | 17.0  | 2397  | 0.7279          | 0.3551 |
| 0.3198        | 18.0  | 2538  | 0.6801          | 0.3509 |
| 0.2753        | 19.0  | 2679  | 0.6239          | 0.3509 |
| 0.2962        | 20.0  | 2820  | 0.7419          | 0.3442 |
| 0.7503        | 21.0  | 2961  | 0.7279          | 0.3501 |
| 0.4013        | 22.0  | 3102  | 0.6899          | 0.3792 |
| 0.5134        | 23.0  | 3243  | 0.6572          | 0.3787 |
| 0.3144        | 24.0  | 3384  | 0.5882          | 0.3543 |
| 0.3534        | 25.0  | 3525  | 0.5661          | 0.3416 |
| 0.2555        | 26.0  | 3666  | 0.5977          | 0.3589 |
| 0.3524        | 27.0  | 3807  | 0.5953          | 0.3585 |
| 0.314         | 28.0  | 3948  | 0.6359          | 0.3593 |
| 0.2565        | 29.0  | 4089  | 0.6192          | 0.3615 |
| 0.5023        | 30.0  | 4230  | 0.6229          | 0.3378 |
| 0.3025        | 31.0  | 4371  | 0.6002          | 0.3442 |
| 0.3329        | 32.0  | 4512  | 0.6235          | 0.3513 |
| 0.3744        | 33.0  | 4653  | 0.5782          | 0.3416 |
| 0.2899        | 34.0  | 4794  | 0.5835          | 0.3336 |
| 0.306         | 35.0  | 4935  | 0.6061          | 0.3496 |
| 0.2519        | 36.0  | 5076  | 0.5958          | 0.3652 |
| 0.3201        | 37.0  | 5217  | 0.5778          | 0.3652 |
| 0.3011        | 38.0  | 5358  | 0.6238          | 0.3589 |
| 0.2882        | 39.0  | 5499  | 0.6501          | 0.3361 |
| 0.2542        | 40.0  | 5640  | 0.6341          | 0.3488 |
| 0.2717        | 41.0  | 5781  | 0.5890          | 0.3530 |
| 0.3197        | 42.0  | 5922  | 0.5877          | 0.3471 |
| 0.2816        | 43.0  | 6063  | 0.6614          | 0.3420 |
| 0.3301        | 44.0  | 6204  | 0.6334          | 0.3475 |
| 0.2466        | 45.0  | 6345  | 0.6663          | 0.3429 |
| 0.2908        | 46.0  | 6486  | 0.5941          | 0.3475 |
| 0.2785        | 47.0  | 6627  | 0.6337          | 0.3568 |
| 0.2361        | 48.0  | 6768  | 0.5845          | 0.3399 |
| 0.4729        | 49.0  | 6909  | 0.6466          | 0.3425 |
| 0.5103        | 50.0  | 7050  | 0.7112          | 0.3416 |
| 0.2676        | 51.0  | 7191  | 0.6260          | 0.3307 |
| 0.3533        | 52.0  | 7332  | 0.7327          | 0.3454 |
| 0.3308        | 53.0  | 7473  | 0.7150          | 0.3277 |
| 0.2617        | 54.0  | 7614  | 0.6412          | 0.3391 |
| 0.2901        | 55.0  | 7755  | 0.6225          | 0.3391 |
| 0.2847        | 56.0  | 7896  | 0.7385          | 0.3391 |
| 0.2621        | 57.0  | 8037  | 0.7241          | 0.3496 |
| 0.2477        | 58.0  | 8178  | 0.6957          | 0.3429 |
| 0.3147        | 59.0  | 8319  | 0.6808          | 0.3425 |
| 0.3761        | 60.0  | 8460  | 0.6710          | 0.3450 |
| 0.2609        | 61.0  | 8601  | 0.6629          | 0.3345 |
| 0.388         | 62.0  | 8742  | 0.6688          | 0.3463 |
| 0.3684        | 63.0  | 8883  | 0.7018          | 0.3340 |
| 0.2494        | 64.0  | 9024  | 0.6611          | 0.3399 |
| 0.2641        | 65.0  | 9165  | 0.6828          | 0.3399 |
| 0.2716        | 66.0  | 9306  | 0.6409          | 0.3294 |
| 0.2595        | 67.0  | 9447  | 0.6056          | 0.3231 |
| 0.2683        | 68.0  | 9588  | 0.6203          | 0.3332 |
| 0.2571        | 69.0  | 9729  | 0.6484          | 0.3336 |
| 0.2593        | 70.0  | 9870  | 0.6597          | 0.3294 |
| 0.229         | 71.0  | 10011 | 0.6354          | 0.3235 |
| 0.281         | 72.0  | 10152 | 0.6398          | 0.3294 |
| 0.3779        | 73.0  | 10293 | 0.6871          | 0.3345 |
| 0.2998        | 74.0  | 10434 | 0.7329          | 0.3323 |
| 0.2095        | 75.0  | 10575 | 0.7365          | 0.3239 |
| 0.247         | 76.0  | 10716 | 0.6384          | 0.3290 |
| 0.2095        | 77.0  | 10857 | 0.6703          | 0.3345 |
| 0.2074        | 78.0  | 10998 | 0.6577          | 0.3425 |
| 0.2519        | 79.0  | 11139 | 0.6359          | 0.3370 |
| 0.2046        | 80.0  | 11280 | 0.6222          | 0.3256 |
| 1.3195        | 81.0  | 11421 | 0.6126          | 0.3345 |
| 0.2821        | 82.0  | 11562 | 0.6193          | 0.3294 |
| 0.3256        | 83.0  | 11703 | 0.6140          | 0.3336 |
| 0.2743        | 84.0  | 11844 | 0.6204          | 0.3290 |
| 0.2761        | 85.0  | 11985 | 0.6599          | 0.3252 |
| 0.224         | 86.0  | 12126 | 0.6580          | 0.3294 |
| 0.2106        | 87.0  | 12267 | 0.6298          | 0.3294 |
| 0.2706        | 88.0  | 12408 | 0.6411          | 0.3281 |
| 0.2523        | 89.0  | 12549 | 0.6243          | 0.3264 |
| 0.3635        | 90.0  | 12690 | 0.6297          | 0.3290 |
| 0.353         | 91.0  | 12831 | 0.6145          | 0.3235 |
| 0.2491        | 92.0  | 12972 | 0.6296          | 0.3197 |
| 0.1999        | 93.0  | 13113 | 0.6329          | 0.3222 |
| 0.2417        | 94.0  | 13254 | 0.6200          | 0.3222 |
| 0.2397        | 95.0  | 13395 | 0.6137          | 0.3269 |
| 0.2275        | 96.0  | 13536 | 0.6237          | 0.3277 |
| 0.207         | 97.0  | 13677 | 0.6230          | 0.3235 |
| 0.2704        | 98.0  | 13818 | 0.6239          | 0.3281 |
| 0.2119        | 99.0  | 13959 | 0.6224          | 0.3277 |
| 0.2561        | 100.0 | 14100 | 0.6187          | 0.3269 |
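The Wer column above is word error rate: word-level Levenshtein distance (substitutions + insertions + deletions) divided by the number of words in the reference. The card was presumably scored with a standard implementation such as the one in the jiwer or evaluate packages; a minimal self-contained sketch of the metric, for reference only:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + sub,  # substitution or match
            )
    return d[len(ref)][len(hyp)] / len(ref)


# One wrong word out of three -> WER of 1/3 (tokens here are hypothetical).
print(wer("kick snare hat", "kick tom hat"))  # 0.3333...
```

So the reported eval Wer of 0.3416 means roughly one word-level error for every three reference tokens; note it can exceed 1.0 when the hypothesis contains many insertions.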

Framework versions

  • Transformers 4.25.0.dev0
  • Pytorch 1.8.1+cu111
  • Datasets 2.7.1.dev0
  • Tokenizers 0.13.2