
# w2v-bert-2.0-lg-cv-1hr-v2

This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unknown dataset. It achieves the following results on the evaluation set (a hedged usage sketch follows the list):

- Loss: 2.8417
- Model Preparation Time: 0.0129
- WER: 0.9997
- CER: 0.9914
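
The checkpoint can be loaded with 🤗 Transformers for transcription. The card does not state the task head explicitly, so the following is a minimal sketch assuming the usual CTC recipe for w2v-bert-2.0 ASR fine-tunes and that the repository ships a processor; `sample.wav` is a hypothetical 16 kHz mono recording:

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "KasuleTrevor/w2v-bert-2.0-lg-cv-1hr-v2"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)

# Load a 16 kHz mono waveform; the path is a placeholder.
speech, _ = librosa.load("sample.wav", sr=16000)

# The processor turns raw audio into the model's input features.
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over the vocabulary, then detokenize.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```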

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
- mixed_precision_training: Native AMP
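
These settings map directly onto 🤗 Transformers `TrainingArguments`. A minimal sketch, assuming the standard `Trainer` workflow; `output_dir` is a hypothetical value not taken from the original run:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-lg-cv-1hr-v2",  # hypothetical path
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size of 16
    optim="adamw_torch",             # Trainer default; matches the listed betas/epsilon
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
    fp16=True,                       # "Native AMP" mixed precision
)
```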

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Model Preparation Time | WER    | CER    |
|:-------------:|:-------:|:----:|:---------------:|:----------------------:|:------:|:------:|
| 15.3055       | 0.9859  | 35   | 12.2381         | 0.0129                 | 1.0    | 1.0    |
| 9.6208        | 2.0     | 71   | 8.3440          | 0.0129                 | 1.0    | 1.0    |
| 8.5028        | 2.9859  | 106  | 7.9784          | 0.0129                 | 1.0    | 1.0    |
| 7.9601        | 4.0     | 142  | 7.7040          | 0.0129                 | 1.0    | 1.0    |
| 7.9111        | 4.9859  | 177  | 7.4474          | 0.0129                 | 1.0    | 1.0    |
| 7.4259        | 6.0     | 213  | 7.1874          | 0.0129                 | 1.0    | 1.0    |
| 7.3711        | 6.9859  | 248  | 6.9404          | 0.0129                 | 1.0    | 1.0    |
| 6.9121        | 8.0     | 284  | 6.6929          | 0.0129                 | 1.0    | 1.0    |
| 6.8465        | 8.9859  | 319  | 6.4528          | 0.0129                 | 1.0    | 1.0    |
| 6.4091        | 10.0    | 355  | 6.2112          | 0.0129                 | 1.0    | 1.0    |
| 6.3427        | 10.9859 | 390  | 5.9794          | 0.0129                 | 1.0    | 1.0    |
| 5.9281        | 12.0    | 426  | 5.7489          | 0.0129                 | 1.0    | 1.0    |
| 5.861         | 12.9859 | 461  | 5.5291          | 0.0129                 | 1.0    | 1.0    |
| 5.4728        | 14.0    | 497  | 5.3136          | 0.0129                 | 1.0    | 1.0    |
| 5.4055        | 14.9859 | 532  | 5.1116          | 0.0129                 | 1.0    | 1.0    |
| 5.05          | 16.0    | 568  | 4.9106          | 0.0129                 | 1.0    | 1.0    |
| 4.9891        | 16.9859 | 603  | 4.7271          | 0.0129                 | 1.0    | 1.0    |
| 4.6647        | 18.0    | 639  | 4.5480          | 0.0129                 | 1.0    | 1.0    |
| 4.6156        | 18.9859 | 674  | 4.3846          | 0.0129                 | 1.0    | 1.0    |
| 4.3257        | 20.0    | 710  | 4.2293          | 0.0129                 | 1.0    | 1.0    |
| 4.2913        | 20.9859 | 745  | 4.0908          | 0.0129                 | 1.0    | 1.0    |
| 4.0311        | 22.0    | 781  | 3.9577          | 0.0129                 | 1.0    | 1.0    |
| 4.0132        | 22.9859 | 816  | 3.8405          | 0.0129                 | 1.0    | 1.0    |
| 3.7827        | 24.0    | 852  | 3.7315          | 0.0129                 | 1.0    | 1.0    |
| 3.7818        | 24.9859 | 887  | 3.6348          | 0.0129                 | 1.0    | 1.0    |
| 3.581         | 26.0    | 923  | 3.5459          | 0.0129                 | 1.0    | 1.0    |
| 3.5949        | 26.9859 | 958  | 3.4699          | 0.0129                 | 1.0    | 1.0    |
| 3.4195        | 28.0    | 994  | 3.3998          | 0.0129                 | 1.0    | 1.0    |
| 3.4464        | 28.9859 | 1029 | 3.3396          | 0.0129                 | 1.0    | 1.0    |
| 3.2914        | 30.0    | 1065 | 3.2848          | 0.0129                 | 1.0    | 1.0    |
| 3.3323        | 30.9859 | 1100 | 3.2404          | 0.0129                 | 1.0    | 1.0    |
| 3.1943        | 32.0    | 1136 | 3.1985          | 0.0129                 | 1.0    | 1.0    |
| 3.2449        | 32.9859 | 1171 | 3.1625          | 0.0129                 | 1.0    | 1.0    |
| 3.1197        | 34.0    | 1207 | 3.1302          | 0.0129                 | 1.0    | 1.0    |
| 3.1765        | 34.9859 | 1242 | 3.1066          | 0.0129                 | 1.0    | 1.0    |
| 3.0618        | 36.0    | 1278 | 3.0819          | 0.0129                 | 1.0    | 1.0    |
| 3.1256        | 36.9859 | 1313 | 3.0686          | 0.0129                 | 1.0    | 1.0    |
| 3.0218        | 38.0    | 1349 | 3.0477          | 0.0129                 | 1.0    | 1.0    |
| 3.09          | 38.9859 | 1384 | 3.0354          | 0.0129                 | 1.0    | 1.0    |
| 2.9895        | 40.0    | 1420 | 3.0255          | 0.0129                 | 1.0    | 1.0    |
| 3.0632        | 40.9859 | 1455 | 3.0127          | 0.0129                 | 1.0    | 1.0    |
| 2.9671        | 42.0    | 1491 | 3.0028          | 0.0129                 | 1.0    | 1.0    |
| 3.0415        | 42.9859 | 1526 | 2.9959          | 0.0129                 | 1.0    | 1.0    |
| 2.9499        | 44.0    | 1562 | 2.9881          | 0.0129                 | 1.0    | 1.0    |
| 3.0269        | 44.9859 | 1597 | 2.9858          | 0.0129                 | 1.0    | 1.0    |
| 2.9369        | 46.0    | 1633 | 2.9776          | 0.0129                 | 1.0    | 1.0    |
| 3.0154        | 46.9859 | 1668 | 2.9727          | 0.0129                 | 1.0    | 1.0    |
| 2.9269        | 48.0    | 1704 | 2.9696          | 0.0129                 | 1.0    | 1.0    |
| 3.0057        | 48.9859 | 1739 | 2.9655          | 0.0129                 | 1.0    | 1.0    |
| 2.9185        | 50.0    | 1775 | 2.9613          | 0.0129                 | 1.0    | 1.0    |
| 2.9982        | 50.9859 | 1810 | 2.9593          | 0.0129                 | 1.0    | 1.0    |
| 2.9112        | 52.0    | 1846 | 2.9555          | 0.0129                 | 1.0    | 1.0    |
| 2.9912        | 52.9859 | 1881 | 2.9532          | 0.0129                 | 1.0    | 1.0    |
| 2.9047        | 54.0    | 1917 | 2.9496          | 0.0129                 | 1.0    | 1.0    |
| 2.9844        | 54.9859 | 1952 | 2.9486          | 0.0129                 | 1.0    | 1.0    |
| 2.8984        | 56.0    | 1988 | 2.9454          | 0.0129                 | 1.0    | 1.0    |
| 2.9786        | 56.9859 | 2023 | 2.9435          | 0.0129                 | 1.0    | 1.0    |
| 2.8928        | 58.0    | 2059 | 2.9391          | 0.0129                 | 1.0    | 1.0    |
| 2.9716        | 58.9859 | 2094 | 2.9357          | 0.0129                 | 1.0    | 1.0    |
| 2.8834        | 60.0    | 2130 | 2.9296          | 0.0129                 | 1.0    | 1.0    |
| 2.9603        | 60.9859 | 2165 | 2.9241          | 0.0129                 | 1.0    | 1.0    |
| 2.87          | 62.0    | 2201 | 2.9152          | 0.0129                 | 1.0    | 1.0    |
| 2.9421        | 62.9859 | 2236 | 2.9050          | 0.0129                 | 1.0    | 1.0    |
| 2.8491        | 64.0    | 2272 | 2.8932          | 0.0129                 | 1.0    | 1.0    |
| 2.9179        | 64.9859 | 2307 | 2.8783          | 0.0129                 | 1.0    | 1.0    |
| 2.8239        | 66.0    | 2343 | 2.8657          | 0.0129                 | 1.0    | 0.9974 |
| 2.8902        | 66.9859 | 2378 | 2.8543          | 0.0129                 | 1.0    | 0.9963 |
| 2.7972        | 68.0    | 2414 | 2.8407          | 0.0129                 | 1.0    | 0.9955 |
| 2.8628        | 68.9859 | 2449 | 2.8276          | 0.0129                 | 1.0    | 0.9936 |
| 2.7694        | 70.0    | 2485 | 2.8108          | 0.0129                 | 1.0    | 0.9945 |
| 2.831         | 70.9859 | 2520 | 2.7947          | 0.0129                 | 0.9996 | 0.9919 |
| 2.735         | 72.0    | 2556 | 2.7773          | 0.0129                 | 0.9998 | 0.9888 |
| 2.7981        | 72.9859 | 2591 | 2.7636          | 0.0129                 | 0.9998 | 0.9870 |
| 2.7062        | 74.0    | 2627 | 2.7507          | 0.0129                 | 0.9998 | 0.9846 |
| 2.7699        | 74.9859 | 2662 | 2.7373          | 0.0129                 | 0.9998 | 0.9849 |
| 2.6797        | 76.0    | 2698 | 2.7237          | 0.0129                 | 0.9996 | 0.9818 |
| 2.7434        | 76.9859 | 2733 | 2.7133          | 0.0129                 | 1.0    | 0.9806 |
| 2.6558        | 78.0    | 2769 | 2.7024          | 0.0129                 | 0.9996 | 0.9779 |
| 2.7204        | 78.9859 | 2804 | 2.6910          | 0.0129                 | 0.9998 | 0.9763 |
| 2.6344        | 80.0    | 2840 | 2.6817          | 0.0129                 | 0.9998 | 0.9727 |
| 2.7002        | 80.9859 | 2875 | 2.6726          | 0.0129                 | 0.9998 | 0.9690 |
| 2.6166        | 82.0    | 2911 | 2.6645          | 0.0129                 | 0.9998 | 0.9655 |
| 2.6827        | 82.9859 | 2946 | 2.6571          | 0.0129                 | 1.0    | 0.9599 |
| 2.6014        | 84.0    | 2982 | 2.6503          | 0.0129                 | 1.0    | 0.9549 |
| 2.6693        | 84.9859 | 3017 | 2.6444          | 0.0129                 | 1.0    | 0.9497 |
| 2.5889        | 86.0    | 3053 | 2.6391          | 0.0129                 | 1.0    | 0.9434 |
| 2.6577        | 86.9859 | 3088 | 2.6350          | 0.0129                 | 1.0    | 0.9354 |
| 2.5795        | 88.0    | 3124 | 2.6305          | 0.0129                 | 1.0    | 0.9290 |
| 2.6494        | 88.9859 | 3159 | 2.6275          | 0.0129                 | 1.0    | 0.9249 |
| 2.5731        | 90.0    | 3195 | 2.6248          | 0.0129                 | 1.0    | 0.9217 |
| 2.6435        | 90.9859 | 3230 | 2.6222          | 0.0129                 | 1.0    | 0.9140 |
| 2.5678        | 92.0    | 3266 | 2.6206          | 0.0129                 | 1.0    | 0.9128 |
| 2.6399        | 92.9859 | 3301 | 2.6193          | 0.0129                 | 1.0    | 0.9088 |
| 2.5653        | 94.0    | 3337 | 2.6183          | 0.0129                 | 1.0    | 0.9070 |
| 2.6379        | 94.9859 | 3372 | 2.6177          | 0.0129                 | 1.0    | 0.9043 |
| 2.5642        | 96.0    | 3408 | 2.6175          | 0.0129                 | 1.0    | 0.9052 |
| 2.6369        | 96.9859 | 3443 | 2.6173          | 0.0129                 | 1.0    | 0.9040 |
| 2.5639        | 98.0    | 3479 | 2.6173          | 0.0129                 | 1.0    | 0.9043 |
| 2.5974        | 98.5915 | 3500 | 2.6173          | 0.0129                 | 1.0    | 0.9044 |
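
The WER and CER columns report word- and character-error rate on the evaluation set. A minimal sketch of how such values can be computed with the 🤗 `evaluate` library (the prediction/reference strings are purely illustrative, and both metrics require the `jiwer` package):

```python
import evaluate  # both metrics below depend on the jiwer package

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical transcripts, for illustration only.
predictions = ["hello word"]
references = ["hello world"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```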

### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.0+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1
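
A hedged sketch for checking that a local environment matches these pins before attempting to reproduce the run:

```python
import transformers, torch, datasets, tokenizers

# Versions listed in this model card.
expected = {
    "transformers": "4.44.2",
    "torch": "2.1.0+cu118",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
actual = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    status = "OK" if actual[name] == want else f"mismatch (expected {want})"
    print(f"{name}: {actual[name]} {status}")
```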