# ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram
This model is a fine-tuned version of gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram on the GARY109/AI_LIGHT_DANCE - ONSET-SINGING3 dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):
- Loss: 0.4505
- WER: 0.2119
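The checkpoint name suggests a wav2vec2 CTC acoustic model paired with a 5-gram language model, so transcription through the standard `automatic-speech-recognition` pipeline should work once `pyctcdecode` and `kenlm` are installed. The snippet below is a minimal sketch under that assumption; the audio path is a placeholder.

```python
# Minimal inference sketch, assuming the repository ships a wav2vec2 CTC
# checkpoint with an attached 5-gram language model (as the "5gram" suffix
# suggests). LM-boosted decoding requires pyctcdecode and kenlm.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram",
)

# wav2vec2-style models expect 16 kHz mono audio; the pipeline resamples
# file inputs automatically.
result = asr("path/to/singing_clip.wav")  # placeholder path
print(result["text"])
```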
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 1e-07
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 100.0
- mixed_precision_training: Native AMP
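The training script itself is not part of this card, but the values above map directly onto `transformers.TrainingArguments`. The sketch below shows that mapping; `output_dir` and everything outside the listed values are assumptions.

```python
# Sketch mapping the listed hyperparameters onto transformers.TrainingArguments.
# The actual training script is not included in the card; output_dir and the
# surrounding Trainer wiring are illustrative assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1-5gram",  # assumed
    learning_rate=1e-7,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # 8 * 4 = 32 effective train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100.0,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```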
### Training results
Training Loss | Epoch | Step | Validation Loss | WER |
---|---|---|---|---|
0.3355 | 1.0 | 144 | 0.4505 | 0.2119 |
0.3069 | 2.0 | 288 | 0.4509 | 0.2124 |
0.3049 | 3.0 | 432 | 0.4511 | 0.2119 |
0.3028 | 4.0 | 576 | 0.4521 | 0.2114 |
0.3092 | 5.0 | 720 | 0.4532 | 0.2112 |
0.3043 | 6.0 | 864 | 0.4536 | 0.2117 |
0.2903 | 7.0 | 1008 | 0.4543 | 0.2114 |
0.3124 | 8.0 | 1152 | 0.4538 | 0.2118 |
0.3079 | 9.0 | 1296 | 0.4541 | 0.2121 |
0.3093 | 10.0 | 1440 | 0.4537 | 0.2117 |
0.3093 | 11.0 | 1584 | 0.4544 | 0.2111 |
0.3202 | 12.0 | 1728 | 0.4549 | 0.2110 |
0.3086 | 13.0 | 1872 | 0.4546 | 0.2104 |
0.2947 | 14.0 | 2016 | 0.4542 | 0.2119 |
0.3145 | 15.0 | 2160 | 0.4539 | 0.2115 |
0.3292 | 16.0 | 2304 | 0.4532 | 0.2115 |
0.3049 | 17.0 | 2448 | 0.4547 | 0.2117 |
0.3177 | 18.0 | 2592 | 0.4544 | 0.2111 |
0.3108 | 19.0 | 2736 | 0.4547 | 0.2114 |
0.2944 | 20.0 | 2880 | 0.4560 | 0.2105 |
0.3232 | 21.0 | 3024 | 0.4560 | 0.2113 |
0.3196 | 22.0 | 3168 | 0.4559 | 0.2107 |
0.3207 | 23.0 | 3312 | 0.4563 | 0.2106 |
0.3039 | 24.0 | 3456 | 0.4555 | 0.2110 |
0.3157 | 25.0 | 3600 | 0.4560 | 0.2117 |
0.3285 | 26.0 | 3744 | 0.4561 | 0.2102 |
0.3125 | 27.0 | 3888 | 0.4553 | 0.2107 |
0.3051 | 28.0 | 4032 | 0.4560 | 0.2103 |
0.3166 | 29.0 | 4176 | 0.4560 | 0.2103 |
0.321 | 30.0 | 4320 | 0.4551 | 0.2101 |
0.3146 | 31.0 | 4464 | 0.4552 | 0.2100 |
0.323 | 32.0 | 4608 | 0.4551 | 0.2105 |
0.3223 | 33.0 | 4752 | 0.4554 | 0.2101 |
0.3105 | 34.0 | 4896 | 0.4549 | 0.2102 |
0.3134 | 35.0 | 5040 | 0.4552 | 0.2101 |
0.3054 | 36.0 | 5184 | 0.4550 | 0.2103 |
0.3162 | 37.0 | 5328 | 0.4554 | 0.2106 |
0.3094 | 38.0 | 5472 | 0.4551 | 0.2099 |
0.3174 | 39.0 | 5616 | 0.4553 | 0.2105 |
0.3218 | 40.0 | 5760 | 0.4553 | 0.2106 |
0.3134 | 41.0 | 5904 | 0.4552 | 0.2101 |
0.3019 | 42.0 | 6048 | 0.4552 | 0.2101 |
0.3169 | 43.0 | 6192 | 0.4552 | 0.2095 |
0.3209 | 44.0 | 6336 | 0.4550 | 0.2090 |
0.3035 | 45.0 | 6480 | 0.4550 | 0.2100 |
0.3181 | 46.0 | 6624 | 0.4550 | 0.2104 |
0.3133 | 47.0 | 6768 | 0.4546 | 0.2096 |
0.3173 | 48.0 | 6912 | 0.4556 | 0.2099 |
0.3174 | 49.0 | 7056 | 0.4552 | 0.2101 |
0.313 | 50.0 | 7200 | 0.4553 | 0.2100 |
0.3139 | 51.0 | 7344 | 0.4555 | 0.2101 |
0.3054 | 52.0 | 7488 | 0.4555 | 0.2100 |
0.3212 | 53.0 | 7632 | 0.4554 | 0.2097 |
0.3252 | 54.0 | 7776 | 0.4553 | 0.2097 |
0.3063 | 55.0 | 7920 | 0.4554 | 0.2106 |
0.3206 | 56.0 | 8064 | 0.4551 | 0.2097 |
0.3176 | 57.0 | 8208 | 0.4552 | 0.2101 |
0.3179 | 58.0 | 8352 | 0.4554 | 0.2099 |
0.3064 | 59.0 | 8496 | 0.4559 | 0.2092 |
0.301 | 60.0 | 8640 | 0.4559 | 0.2103 |
0.3103 | 61.0 | 8784 | 0.4559 | 0.2102 |
0.3169 | 62.0 | 8928 | 0.4559 | 0.2103 |
0.3081 | 63.0 | 9072 | 0.4559 | 0.2101 |
0.3249 | 64.0 | 9216 | 0.4555 | 0.2106 |
0.3031 | 65.0 | 9360 | 0.4553 | 0.2105 |
0.3017 | 66.0 | 9504 | 0.4556 | 0.2105 |
0.3261 | 67.0 | 9648 | 0.4551 | 0.2100 |
0.3196 | 68.0 | 9792 | 0.4553 | 0.2096 |
0.3085 | 69.0 | 9936 | 0.4554 | 0.2095 |
0.3235 | 70.0 | 10080 | 0.4552 | 0.2096 |
0.3194 | 71.0 | 10224 | 0.4550 | 0.2102 |
0.3243 | 72.0 | 10368 | 0.4546 | 0.2098 |
0.3115 | 73.0 | 10512 | 0.4542 | 0.2101 |
0.3307 | 74.0 | 10656 | 0.4545 | 0.2100 |
0.3072 | 75.0 | 10800 | 0.4547 | 0.2100 |
0.3218 | 76.0 | 10944 | 0.4545 | 0.2102 |
0.3116 | 77.0 | 11088 | 0.4540 | 0.2103 |
0.3021 | 78.0 | 11232 | 0.4542 | 0.2101 |
0.3165 | 79.0 | 11376 | 0.4539 | 0.2109 |
0.327 | 80.0 | 11520 | 0.4539 | 0.2090 |
0.3268 | 81.0 | 11664 | 0.4540 | 0.2110 |
0.304 | 82.0 | 11808 | 0.4537 | 0.2097 |
0.3256 | 83.0 | 11952 | 0.4537 | 0.2102 |
0.3208 | 84.0 | 12096 | 0.4544 | 0.2101 |
0.3199 | 85.0 | 12240 | 0.4541 | 0.2094 |
0.3104 | 86.0 | 12384 | 0.4543 | 0.2097 |
0.3218 | 87.0 | 12528 | 0.4542 | 0.2106 |
0.3301 | 88.0 | 12672 | 0.4538 | 0.2098 |
0.3055 | 89.0 | 12816 | 0.4540 | 0.2101 |
0.3154 | 90.0 | 12960 | 0.4533 | 0.2098 |
0.3169 | 91.0 | 13104 | 0.4543 | 0.2098 |
0.3122 | 92.0 | 13248 | 0.4541 | 0.2098 |
0.319 | 93.0 | 13392 | 0.4536 | 0.2094 |
0.307 | 94.0 | 13536 | 0.4538 | 0.2092 |
0.3132 | 95.0 | 13680 | 0.4540 | 0.2094 |
0.3185 | 96.0 | 13824 | 0.4536 | 0.2099 |
0.2996 | 97.0 | 13968 | 0.4541 | 0.2100 |
0.3193 | 98.0 | 14112 | 0.4539 | 0.2092 |
0.3091 | 99.0 | 14256 | 0.4538 | 0.2096 |
0.315 | 100.0 | 14400 | 0.4544 | 0.2100 |
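Validation loss and WER in the table were evaluated once per epoch (every 144 steps). As a reference point, the WER column can be reproduced for any set of transcripts with the `evaluate` library; the sketch below uses placeholder strings rather than data from the evaluation set.

```python
# Sketch of reproducing the WER metric with the `evaluate` library.
# The strings below are placeholders, not samples from the evaluation set.
import evaluate

wer_metric = evaluate.load("wer")

references = ["the quick brown fox"]    # placeholder ground-truth lyrics
predictions = ["the quick brown fox"]   # placeholder model output

print(wer_metric.compute(predictions=predictions, references=references))
```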
### Framework versions
- Transformers 4.21.0.dev0
- Pytorch 1.9.1+cu102
- Datasets 2.3.3.dev0
- Tokenizers 0.12.1
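To check that a local environment matches the versions listed above, a quick sketch using each library's standard import name:

```python
# Quick environment check against the versions listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card reports 4.21.0.dev0
print("PyTorch:", torch.__version__)              # card reports 1.9.1+cu102
print("Datasets:", datasets.__version__)          # card reports 2.3.3.dev0
print("Tokenizers:", tokenizers.__version__)      # card reports 0.12.1
```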