
bertweet-2020-Q1-filtered

This model is a fine-tuned version of vinai/bertweet-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.6866
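
The card does not state which objective produced this number; assuming it is the mean masked-language-modeling cross-entropy (the base model is an MLM and no other objective is mentioned), it corresponds to a pseudo-perplexity of roughly exp(2.6866) ≈ 14.7, as in this minimal sketch:

```python
import math

# Hedged: treats the reported evaluation loss as a mean token-level
# cross-entropy over masked positions.
eval_loss = 2.6866
print(f"pseudo-perplexity ≈ {math.exp(eval_loss):.2f}")  # ≈ 14.68
```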

Model description

More information needed

Intended uses & limitations

More information needed
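
The card leaves this section open. Since the base model is BERTweet (a RoBERTa-style masked language model) and only an MLM-style loss is reported, the most likely direct uses are fill-mask inference or further fine-tuning. A minimal, hedged loading sketch follows; the repository id DouglasPontes/bertweet-2020-Q1-filtered is taken from the model tree at the end of the card, and the example sentence is purely illustrative:

```python
from transformers import pipeline

# Hedged sketch: assumes the checkpoint retains BERTweet's masked-LM head.
fill_mask = pipeline("fill-mask", model="DouglasPontes/bertweet-2020-Q1-filtered")

# BERTweet uses <mask> as its mask token (RoBERTa-style vocabulary).
for prediction in fill_mask("The weather in March was <mask> ."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

If raw tweets are fed in, the BERTweet tokenizer can be loaded with normalization=True so that user handles and URLs are normalized the way the base model expects (this requires the emoji package).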

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative Trainer setup sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1400
  • training_steps: 2400000
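
Since the card only lists hyperparameters, here is a hedged sketch of an equivalent Hugging Face Trainer setup. The output directory, the toy in-memory dataset, the 15% masking probability, and the 8000-step evaluation cadence (inferred from the results table below) are assumptions for illustration, not documented choices:

```python
from datasets import Dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base")
model = AutoModelForMaskedLM.from_pretrained("vinai/bertweet-base")

# Placeholder corpus standing in for the unspecified 2020-Q1 tweet data.
raw = Dataset.from_dict({"text": ["an example tweet", "another example tweet"]})
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

training_args = TrainingArguments(
    output_dir="bertweet-2020-Q1-filtered",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1400,
    max_steps=2_400_000,
    evaluation_strategy="steps",             # assumed from the eval log below
    eval_steps=8000,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized,
    eval_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(
        tokenizer=tokenizer, mlm_probability=0.15
    ),
)
# trainer.train()  # not run here; 2.4M steps is the full training budget
```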

Training results

Training Loss Epoch Step Validation Loss
No log 0.07 8000 2.6350
2.7848 0.13 16000 2.6556
2.7848 0.2 24000 2.6695
2.7545 0.26 32000 2.6713
2.7545 0.33 40000 2.7089
2.7717 0.39 48000 2.7144
2.7717 0.46 56000 2.7240
2.8043 0.52 64000 2.7499
2.8043 0.59 72000 2.7704
2.8401 0.65 80000 2.7820
2.8401 0.72 88000 2.8068
2.8723 0.78 96000 2.8150
2.8723 0.85 104000 2.8410
2.9004 0.91 112000 2.8657
2.9004 0.98 120000 2.8826
2.9396 1.04 128000 2.9071
2.9396 1.11 136000 2.9490
2.9801 1.17 144000 2.9515
2.9801 1.24 152000 2.9863
3.0173 1.3 160000 2.9916
3.0173 1.37 168000 3.0231
3.0674 1.44 176000 3.0447
3.0674 1.5 184000 3.0638
3.1059 1.57 192000 3.0945
3.1059 1.63 200000 3.1008
3.1283 1.7 208000 3.1257
3.1283 1.76 216000 3.1262
3.1684 1.83 224000 3.1523
3.1684 1.89 232000 3.1842
3.1966 1.96 240000 3.1820
3.1966 2.02 248000 3.1976
3.2055 2.09 256000 3.2013
3.2055 2.15 264000 3.2197
3.2186 2.22 272000 3.2259
3.2186 2.28 280000 3.2410
3.2518 2.35 288000 3.2449
3.2518 2.41 296000 3.2686
3.2705 2.48 304000 3.2702
3.2705 2.54 312000 3.2716
3.2677 2.61 320000 3.2935
3.2677 2.67 328000 3.2942
3.2955 2.74 336000 3.3044
3.2955 2.8 344000 3.3110
3.2966 2.87 352000 3.3053
3.2966 2.94 360000 3.3276
3.311 3.0 368000 3.3256
3.311 3.07 376000 3.3292
3.3217 3.13 384000 3.3335
3.3217 3.2 392000 3.3160
3.3145 3.26 400000 3.3378
3.3145 3.33 408000 3.3307
3.3246 3.39 416000 3.3427
3.3246 3.46 424000 3.3543
3.3131 3.52 432000 3.3405
3.3131 3.59 440000 3.3361
3.3266 3.65 448000 3.3704
3.3266 3.72 456000 3.3549
3.3358 3.78 464000 3.3603
3.3358 3.85 472000 3.3642
3.3385 3.91 480000 3.3573
3.3385 3.98 488000 3.3658
3.3375 4.04 496000 3.3459
3.3375 4.11 504000 3.3703
3.3237 4.17 512000 3.3564
3.3237 4.24 520000 3.3553
3.34 4.31 528000 3.3576
3.34 4.37 536000 3.3548
3.3247 4.44 544000 3.3526
3.3247 4.5 552000 3.3674
3.318 4.57 560000 3.3608
3.318 4.63 568000 3.3527
3.3318 4.7 576000 3.3600
3.3318 4.76 584000 3.3662
3.3211 4.83 592000 3.3603
3.3211 4.89 600000 3.3640
3.3344 4.96 608000 3.3760
3.3344 5.02 616000 3.3876
3.331 5.09 624000 3.3519
3.331 5.15 632000 3.3734
3.3293 5.22 640000 3.3735
3.3293 5.28 648000 3.3703
3.3317 5.35 656000 3.3826
3.3317 5.41 664000 3.3826
3.3291 5.48 672000 3.3919
3.3291 5.54 680000 3.3786
3.3423 5.61 688000 3.3775
3.3423 5.68 696000 3.3734
3.3364 5.74 704000 3.3725
3.3364 5.81 712000 3.3855
3.347 5.87 720000 3.3774
3.347 5.94 728000 3.3717
3.3311 6.0 736000 3.3929
3.3311 6.07 744000 3.3899
3.3445 6.13 752000 3.3985
3.3445 6.2 760000 3.3866
3.345 6.26 768000 3.3943
3.345 6.33 776000 3.3734
3.3427 6.39 784000 3.3832
3.3427 6.46 792000 3.3966
3.3406 6.52 800000 3.3892
3.3406 6.59 808000 3.3904
3.3406 6.65 816000 3.3867
3.3406 6.72 824000 3.3902
3.3354 6.78 832000 3.3718
3.3354 6.85 840000 3.3831
3.3521 6.91 848000 3.3909
3.3521 6.98 856000 3.3799
3.3538 7.05 864000 3.3828
3.3538 7.11 872000 3.3785
3.3363 7.18 880000 3.3993
3.3363 7.24 888000 3.3850
3.3341 7.31 896000 3.3932
3.3341 7.37 904000 3.3981
3.3458 7.44 912000 3.3936
3.3458 7.5 920000 3.4032
3.3327 7.57 928000 3.3852
3.3327 7.63 936000 3.3865
3.3507 7.7 944000 3.3900
3.3507 7.76 952000 3.3772
3.3493 7.83 960000 3.3887
3.3493 7.89 968000 3.3951
3.3412 7.96 976000 3.3833
3.3412 8.02 984000 3.3816
3.3232 8.09 992000 3.3752
3.3232 8.15 1000000 3.3845
3.333 8.22 1008000 3.3907
3.333 8.28 1016000 3.3823
3.3449 8.35 1024000 3.3725
3.3449 8.41 1032000 3.3797
3.3336 8.48 1040000 3.3878
3.3336 8.55 1048000 3.3845
3.3307 8.61 1056000 3.3907
3.3307 8.68 1064000 3.3858
3.3267 8.74 1072000 3.3952
3.3267 8.81 1080000 3.3914
3.335 8.87 1088000 3.3904
3.335 8.94 1096000 3.3895
3.3411 9.0 1104000 3.3959
3.3411 9.07 1112000 3.3915
3.3324 9.13 1120000 3.4030
3.3324 9.2 1128000 3.4084
3.3297 9.26 1136000 3.4023
3.3297 9.33 1144000 3.3967
3.3492 9.39 1152000 3.3931
3.3492 9.46 1160000 3.4065
3.3317 9.52 1168000 3.3905
3.3317 9.59 1176000 3.4021
3.3447 9.65 1184000 3.4001
3.3447 9.72 1192000 3.3943
3.3377 9.78 1200000 3.3971
3.3377 9.85 1208000 3.3946
3.3486 9.92 1216000 3.3924
3.3486 9.98 1224000 3.3983
3.3471 10.05 1232000 3.4141
3.3471 10.11 1240000 3.4220
3.3457 10.18 1248000 3.4085
3.3457 10.24 1256000 3.4243
3.3278 10.31 1264000 3.4058
3.3278 10.37 1272000 3.4033
3.325 10.44 1280000 3.3867
3.325 10.5 1288000 3.3879
3.3248 10.57 1296000 3.3801
3.3248 10.63 1304000 3.4027
3.3217 10.7 1312000 3.3781
3.3217 10.76 1320000 3.3871
3.3227 10.83 1328000 3.3861
3.3227 10.89 1336000 3.3789
3.3259 10.96 1344000 3.3865
3.3259 11.02 1352000 3.3863
3.3094 11.09 1360000 3.3827
3.3094 11.15 1368000 3.3880
3.3128 11.22 1376000 3.3652
3.3128 11.29 1384000 3.3813
3.3088 11.35 1392000 3.3853
3.3088 11.42 1400000 3.3709
3.3067 11.48 1408000 3.3831
3.3067 11.55 1416000 3.3703
3.311 11.61 1424000 3.3696
3.311 11.68 1432000 3.3769
3.3048 11.74 1440000 3.3740
3.3048 11.81 1448000 3.3731
3.3055 11.87 1456000 3.3655
3.3055 11.94 1464000 3.3697
3.3105 12.0 1472000 3.3742
3.3105 12.07 1480000 3.3614
3.2977 12.13 1488000 3.3705
3.2977 12.2 1496000 3.3746
3.2999 12.26 1504000 3.3691
3.2999 12.33 1512000 3.3745
3.2983 12.39 1520000 3.3717
3.2983 12.46 1528000 3.3682
3.2957 12.52 1536000 3.3693
3.2957 12.59 1544000 3.3764
3.293 12.65 1552000 3.3691
3.293 12.72 1560000 3.3802
3.2919 12.79 1568000 3.3626
3.2919 12.85 1576000 3.3604
3.3023 12.92 1584000 3.3749
3.3023 12.98 1592000 3.3688
3.2988 13.05 1600000 3.3666
3.2988 13.11 1608000 3.3695
3.2924 13.18 1616000 3.3650
3.2924 13.24 1624000 3.3651
3.2958 13.31 1632000 3.3692
3.2958 13.37 1640000 3.3855
3.2918 13.44 1648000 3.3706
3.2918 13.5 1656000 3.3680
3.2948 13.57 1664000 3.3534
3.2948 13.63 1672000 3.3699
3.2996 13.7 1680000 3.3733
3.2996 13.76 1688000 3.3764
3.2999 13.83 1696000 3.3793
3.2999 13.89 1704000 3.3683
3.291 13.96 1712000 3.3654
3.291 14.02 1720000 3.3721
3.2952 14.09 1728000 3.3674
3.2952 14.16 1736000 3.3762
3.2866 14.22 1744000 3.3699
3.2866 14.29 1752000 3.3690
3.2825 14.35 1760000 3.3653
3.2825 14.42 1768000 3.3687
3.2825 14.48 1776000 3.3618
3.2825 14.55 1784000 3.3609
3.2744 14.61 1792000 3.3552
3.2744 14.68 1800000 3.3549
3.2811 14.74 1808000 3.3504
3.2811 14.81 1816000 3.3575
3.2672 14.87 1824000 3.3588
3.2672 14.94 1832000 3.3560
3.2919 15.0 1840000 3.3598
3.2919 15.07 1848000 3.3445
3.2724 15.13 1856000 3.3517
3.2724 15.2 1864000 3.3593
3.277 15.26 1872000 3.3598
3.277 15.33 1880000 3.3458
3.2842 15.39 1888000 3.3583
3.2842 15.46 1896000 3.3448
3.2758 15.53 1904000 3.3593
3.2758 15.59 1912000 3.3552
3.2684 15.66 1920000 3.3715
3.2684 15.72 1928000 3.3544
3.2924 15.79 1936000 3.3515
3.2924 15.85 1944000 3.3646
3.2673 15.92 1952000 3.3538
3.2673 15.98 1960000 3.3437
3.2833 16.05 1968000 3.3443
3.2833 16.11 1976000 3.3619
3.2636 16.18 1984000 3.3511
3.2636 16.24 1992000 3.3448
3.2753 16.31 2000000 3.3560
3.2753 16.37 2008000 3.3525
3.2701 16.44 2016000 3.3558
3.2701 16.5 2024000 3.3559
3.2761 16.57 2032000 3.3440
3.2761 16.63 2040000 3.3506
3.2677 16.7 2048000 3.3474
3.2677 16.76 2056000 3.3615
3.2614 16.83 2064000 3.3507
3.2614 16.89 2072000 3.3444
3.2608 16.96 2080000 3.3527
3.2608 17.03 2088000 3.3398
3.2643 17.09 2096000 3.3498
3.2643 17.16 2104000 3.3349
3.2721 17.22 2112000 3.3560
3.2721 17.29 2120000 3.3421
3.266 17.35 2128000 3.3429
3.266 17.42 2136000 3.3371
3.2551 17.48 2144000 3.3404
3.2551 17.55 2152000 3.3494
3.26 17.61 2160000 3.3389
3.26 17.68 2168000 3.3456
3.2528 17.74 2176000 3.3249
3.2528 17.81 2184000 3.3452
3.2602 17.87 2192000 3.3376
3.2602 17.94 2200000 3.3511
3.2492 18.0 2208000 3.3475
3.2492 18.07 2216000 3.3497
3.2469 18.13 2224000 3.3378
3.2469 18.2 2232000 3.3326
3.2589 18.26 2240000 3.3277
3.2589 18.33 2248000 3.3457
3.2548 18.4 2256000 3.3343
3.2548 18.46 2264000 3.3362
3.2589 18.53 2272000 3.3431
3.2589 18.59 2280000 3.3428
3.2674 18.66 2288000 3.3401
3.2674 18.72 2296000 3.3375
3.2561 18.79 2304000 3.3334
3.2561 18.85 2312000 3.3321
3.2452 18.92 2320000 3.3446
3.2452 18.98 2328000 3.3525
3.259 19.05 2336000 3.3318
3.259 19.11 2344000 3.3452
3.2494 19.18 2352000 3.3355
3.2494 19.24 2360000 3.3322
3.2558 19.31 2368000 3.3255
3.2558 19.37 2376000 3.3330
3.2436 19.44 2384000 3.3358
3.2436 19.5 2392000 3.3287
3.2545 19.57 2400000 3.3321

Framework versions

  • Transformers 4.35.0.dev0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.14.0
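
To sanity-check a local environment against these versions (a released 4.35.x is the closest installable match for the 4.35.0.dev0 development build), a quick probe could be:

```python
import datasets
import tokenizers
import torch
import transformers

# Prints locally installed versions for comparison with the list above.
print("transformers", transformers.__version__)
print("torch", torch.__version__)
print("datasets", datasets.__version__)
print("tokenizers", tokenizers.__version__)
```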

Model tree for DouglasPontes/bertweet-2020-Q1-filtered

Finetuned from vinai/bertweet-base (one of 98 fine-tunes of that base model listed on the Hub).