
# w2v-bert-odia_v1

This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2688
- WER: 0.1951 (word error rate; see the sketch below)
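
WER measures the proportion of substituted, inserted, and deleted words relative to the reference transcripts, so 0.1951 corresponds to roughly one word error per five words. A minimal sketch of computing it with the Hugging Face `evaluate` library (the transcript pair below is a hypothetical placeholder, not drawn from the actual evaluation set):

```python
import evaluate

# Load the standard word-error-rate metric.
wer_metric = evaluate.load("wer")

# Hypothetical Odia reference/prediction pair; the real evaluation data is undocumented.
references = ["ମୁଁ ଭଲ ଅଛି"]
predictions = ["ମୁଁ ଭଲ ଅଛି"]

score = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {score:.4f}")  # 0.0000 for an exact match
```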

## Model description

More information needed. (The checkpoint has roughly 606M parameters, stored as float32 Safetensors.)

## Intended uses & limitations

More information needed
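
In the absence of documented usage notes, here is a minimal inference sketch. It assumes the repository ships a standard `Wav2Vec2BertForCTC` checkpoint with a matching processor, as produced by the usual Transformers CTC fine-tuning recipe; the audio path is a placeholder:

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "cdactvm/w2v-bert-odia_v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# Load 16 kHz mono audio; "sample.wav" is a placeholder path.
audio, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```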

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 4
- mixed_precision_training: Native AMP
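
Expressed as a `transformers.TrainingArguments` configuration, the values above map roughly as follows. This is a reconstruction from the reported list, not the author's actual script; `output_dir` is a placeholder and the dataset, model, and `Trainer` setup are omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-odia_v1",   # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 2 * 4 = 8
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=4,
    fp16=True,                       # native AMP mixed precision
    # Trainer's default optimizer already uses betas=(0.9, 0.999) and eps=1e-8.
)
```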

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | WER    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 2.9674        | 0.0342 | 300   | 1.3305          | 0.7001 |
| 1.2476        | 0.0683 | 600   | 1.1660          | 0.5879 |
| 1.0692        | 0.1025 | 900   | 0.9110          | 0.4886 |
| 0.9443        | 0.1366 | 1200  | 0.7601          | 0.4727 |
| 0.8235        | 0.1708 | 1500  | 0.7761          | 0.3973 |
| 0.8155        | 0.2050 | 1800  | 0.7084          | 0.4022 |
| 0.767         | 0.2391 | 2100  | 0.6251          | 0.3756 |
| 0.7517        | 0.2733 | 2400  | 0.6125          | 0.3654 |
| 0.687         | 0.3075 | 2700  | 0.5848          | 0.3439 |
| 0.6509        | 0.3416 | 3000  | 0.5643          | 0.3282 |
| 0.6632        | 0.3758 | 3300  | 0.5509          | 0.3199 |
| 0.6108        | 0.4099 | 3600  | 0.5393          | 0.3341 |
| 0.5898        | 0.4441 | 3900  | 0.5223          | 0.3277 |
| 0.595         | 0.4783 | 4200  | 0.5199          | 0.3200 |
| 0.5644        | 0.5124 | 4500  | 0.5508          | 0.2919 |
| 0.5787        | 0.5466 | 4800  | 0.4994          | 0.3060 |
| 0.5752        | 0.5807 | 5100  | 0.4966          | 0.2997 |
| 0.5353        | 0.6149 | 5400  | 0.4731          | 0.3237 |
| 0.5473        | 0.6491 | 5700  | 0.4665          | 0.3062 |
| 0.5498        | 0.6832 | 6000  | 0.4890          | 0.2876 |
| 0.5146        | 0.7174 | 6300  | 0.4747          | 0.2926 |
| 0.5398        | 0.7516 | 6600  | 0.4581          | 0.2907 |
| 0.5154        | 0.7857 | 6900  | 0.4557          | 0.2995 |
| 0.5386        | 0.8199 | 7200  | 0.4515          | 0.2948 |
| 0.5037        | 0.8540 | 7500  | 0.4456          | 0.2961 |
| 0.5344        | 0.8882 | 7800  | 0.4509          | 0.2988 |
| 0.501         | 0.9224 | 8100  | 0.4436          | 0.2711 |
| 0.487         | 0.9565 | 8400  | 0.4233          | 0.2749 |
| 0.4692        | 0.9907 | 8700  | 0.4661          | 0.2532 |
| 0.462         | 1.0249 | 9000  | 0.4197          | 0.2723 |
| 0.4508        | 1.0590 | 9300  | 0.4316          | 0.2584 |
| 0.4702        | 1.0932 | 9600  | 0.4148          | 0.2689 |
| 0.4517        | 1.1273 | 9900  | 0.3950          | 0.2549 |
| 0.4408        | 1.1615 | 10200 | 0.4308          | 0.2551 |
| 0.4636        | 1.1957 | 10500 | 0.4033          | 0.2700 |
| 0.4583        | 1.2298 | 10800 | 0.4096          | 0.2556 |
| 0.4315        | 1.2640 | 11100 | 0.3883          | 0.2681 |
| 0.4172        | 1.2981 | 11400 | 0.3737          | 0.2529 |
| 0.4177        | 1.3323 | 11700 | 0.3992          | 0.2472 |
| 0.3975        | 1.3665 | 12000 | 0.3716          | 0.2485 |
| 0.4044        | 1.4006 | 12300 | 0.3853          | 0.2523 |
| 0.4497        | 1.4348 | 12600 | 0.3798          | 0.2465 |
| 0.4188        | 1.4690 | 12900 | 0.3822          | 0.2494 |
| 0.4424        | 1.5031 | 13200 | 0.3560          | 0.2449 |
| 0.4249        | 1.5373 | 13500 | 0.3630          | 0.2514 |
| 0.4287        | 1.5714 | 13800 | 0.3662          | 0.2417 |
| 0.3712        | 1.6056 | 14100 | 0.3714          | 0.2562 |
| 0.3893        | 1.6398 | 14400 | 0.3711          | 0.2333 |
| 0.3935        | 1.6739 | 14700 | 0.3715          | 0.2413 |
| 0.3982        | 1.7081 | 15000 | 0.3551          | 0.2482 |
| 0.4124        | 1.7422 | 15300 | 0.3519          | 0.2412 |
| 0.3853        | 1.7764 | 15600 | 0.3429          | 0.2418 |
| 0.4096        | 1.8106 | 15900 | 0.3407          | 0.2394 |
| 0.3816        | 1.8447 | 16200 | 0.3607          | 0.2370 |
| 0.3769        | 1.8789 | 16500 | 0.3601          | 0.2291 |
| 0.3428        | 1.9131 | 16800 | 0.3578          | 0.2283 |
| 0.3636        | 1.9472 | 17100 | 0.3485          | 0.2334 |
| 0.3594        | 1.9814 | 17400 | 0.3539          | 0.2341 |
| 0.3692        | 2.0155 | 17700 | 0.3383          | 0.2282 |
| 0.3295        | 2.0497 | 18000 | 0.3354          | 0.2374 |
| 0.3442        | 2.0839 | 18300 | 0.3393          | 0.2340 |
| 0.3306        | 2.1180 | 18600 | 0.3567          | 0.2382 |
| 0.3243        | 2.1522 | 18900 | 0.3410          | 0.2287 |
| 0.3426        | 2.1864 | 19200 | 0.3244          | 0.2323 |
| 0.3552        | 2.2205 | 19500 | 0.3356          | 0.2318 |
| 0.3558        | 2.2547 | 19800 | 0.3686          | 0.2225 |
| 0.3485        | 2.2888 | 20100 | 0.3485          | 0.2230 |
| 0.3195        | 2.3230 | 20400 | 0.3197          | 0.2230 |
| 0.3145        | 2.3572 | 20700 | 0.3312          | 0.2294 |
| 0.3238        | 2.3913 | 21000 | 0.3331          | 0.2210 |
| 0.3288        | 2.4255 | 21300 | 0.3172          | 0.2272 |
| 0.3398        | 2.4596 | 21600 | 0.3228          | 0.2182 |
| 0.3185        | 2.4940 | 21900 | 0.3057          | 0.2272 |
| 0.3152        | 2.5281 | 22200 | 0.3133          | 0.2175 |
| 0.312         | 2.5623 | 22500 | 0.3155          | 0.2155 |
| 0.3131        | 2.5965 | 22800 | 0.3087          | 0.2200 |
| 0.2993        | 2.6306 | 23100 | 0.3123          | 0.2216 |
| 0.2953        | 2.6648 | 23400 | 0.3116          | 0.2203 |
| 0.274         | 2.6989 | 23700 | 0.3221          | 0.2099 |
| 0.3043        | 2.7331 | 24000 | 0.3092          | 0.2131 |
| 0.2939        | 2.7673 | 24300 | 0.3084          | 0.2134 |
| 0.3063        | 2.8014 | 24600 | 0.3119          | 0.2094 |
| 0.3108        | 2.8356 | 24900 | 0.2987          | 0.2104 |
| 0.3188        | 2.8698 | 25200 | 0.3030          | 0.2082 |
| 0.2921        | 2.9039 | 25500 | 0.3051          | 0.2090 |
| 0.2994        | 2.9381 | 25800 | 0.2939          | 0.2148 |
| 0.2789        | 2.9722 | 26100 | 0.3012          | 0.2068 |
| 0.2902        | 3.0064 | 26400 | 0.2981          | 0.2138 |
| 0.2899        | 3.0406 | 26700 | 0.2931          | 0.2062 |
| 0.2796        | 3.0747 | 27000 | 0.2953          | 0.2067 |
| 0.287         | 3.1089 | 27300 | 0.3006          | 0.2105 |
| 0.2828        | 3.1431 | 27600 | 0.2916          | 0.2121 |
| 0.2798        | 3.1772 | 27900 | 0.2974          | 0.2060 |
| 0.2757        | 3.2114 | 28200 | 0.2908          | 0.2042 |
| 0.2694        | 3.2455 | 28500 | 0.2905          | 0.2058 |
| 0.262         | 3.2797 | 28800 | 0.2866          | 0.2048 |
| 0.2623        | 3.3139 | 29100 | 0.2794          | 0.2062 |
| 0.282         | 3.3480 | 29400 | 0.2814          | 0.2004 |
| 0.2655        | 3.3822 | 29700 | 0.2891          | 0.2006 |
| 0.2757        | 3.4163 | 30000 | 0.2845          | 0.1983 |
| 0.2686        | 3.4505 | 30300 | 0.2818          | 0.2013 |
| 0.2571        | 3.4847 | 30600 | 0.2825          | 0.2003 |
| 0.2681        | 3.5188 | 30900 | 0.2814          | 0.2051 |
| 0.2628        | 3.5530 | 31200 | 0.2831          | 0.1998 |
| 0.2625        | 3.5872 | 31500 | 0.2775          | 0.2032 |
| 0.2448        | 3.6213 | 31800 | 0.2770          | 0.1984 |
| 0.2599        | 3.6555 | 32100 | 0.2732          | 0.2002 |
| 0.2492        | 3.6896 | 32400 | 0.2880          | 0.1942 |
| 0.2666        | 3.7238 | 32700 | 0.2701          | 0.1984 |
| 0.257         | 3.7580 | 33000 | 0.2687          | 0.1997 |
| 0.2589        | 3.7921 | 33300 | 0.2665          | 0.1997 |
| 0.2735        | 3.8263 | 33600 | 0.2678          | 0.1990 |
| 0.2477        | 3.8604 | 33900 | 0.2704          | 0.1958 |
| 0.2525        | 3.8946 | 34200 | 0.2695          | 0.1946 |
| 0.2401        | 3.9288 | 34500 | 0.2732          | 0.1931 |
| 0.2585        | 3.9629 | 34800 | 0.2682          | 0.1945 |
| 0.2582        | 3.9971 | 35100 | 0.2688          | 0.1951 |

### Framework versions

- Transformers 4.41.1
- Pytorch 2.1.2+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1