# PhoBert_Lexical_Meta_Dataset59KBoDuoi

This model is a fine-tuned version of [vinai/phobert-base-v2](https://huggingface.co/vinai/phobert-base-v2) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.5530
- Accuracy: 0.9002
- F1: 0.9006
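
The card does not yet document usage, but the reported accuracy/F1 metrics suggest this checkpoint is a sequence-classification head on PhoBERT. Below is a minimal loading sketch under that assumption; the sample text and max length are placeholders, and note that PhoBERT expects word-segmented Vietnamese input (e.g. produced with VnCoreNLP):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumption: this checkpoint carries a sequence-classification head.
model_id = "RonTon05/PhoBert_Lexical_Meta_Dataset59KBoDuoi"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# PhoBERT requires word-segmented Vietnamese text; the underscores below
# mark multi-syllable words, as produced by a segmenter such as VnCoreNLP.
text = "Chúng_tôi là những nghiên_cứu_viên ."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=256)

with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()

# id2label falls back to generic LABEL_i names unless the config defines it.
print(model.config.id2label.get(pred, pred))
```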

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
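
For reproducibility, these settings map onto Hugging Face `TrainingArguments` roughly as follows. This is a sketch, not the original training script: the output directory is a placeholder, and the eval/logging cadence (evaluate every 200 steps, log every 800) is inferred from the results table below rather than stated in the card.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="phobert-lexical-meta",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="steps",   # inferred: validation metrics every 200 steps
    eval_steps=200,
    logging_steps=800,       # inferred from the "No log" pattern in the table
)
```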

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
| No log        | 0.2558  | 200   | 0.3462          | 0.8484   | 0.8450 |
| No log        | 0.5115  | 400   | 0.3203          | 0.8626   | 0.8645 |
| No log        | 0.7673  | 600   | 0.2886          | 0.8742   | 0.8747 |
| 0.3511        | 1.0230  | 800   | 0.2991          | 0.8824   | 0.8817 |
| 0.3511        | 1.2788  | 1000  | 0.2882          | 0.8800   | 0.8805 |
| 0.3511        | 1.5345  | 1200  | 0.2897          | 0.8817   | 0.8827 |
| 0.3511        | 1.7903  | 1400  | 0.2674          | 0.8856   | 0.8859 |
| 0.2582        | 2.0460  | 1600  | 0.2761          | 0.8855   | 0.8865 |
| 0.2582        | 2.3018  | 1800  | 0.2788          | 0.8860   | 0.8869 |
| 0.2582        | 2.5575  | 2000  | 0.2718          | 0.8848   | 0.8859 |
| 0.2582        | 2.8133  | 2200  | 0.2749          | 0.8910   | 0.8919 |
| 0.2136        | 3.0691  | 2400  | 0.2984          | 0.8900   | 0.8909 |
| 0.2136        | 3.3248  | 2600  | 0.2847          | 0.8902   | 0.8896 |
| 0.2136        | 3.5806  | 2800  | 0.2810          | 0.8900   | 0.8912 |
| 0.2136        | 3.8363  | 3000  | 0.3029          | 0.8869   | 0.8887 |
| 0.1805        | 4.0921  | 3200  | 0.2973          | 0.8935   | 0.8945 |
| 0.1805        | 4.3478  | 3400  | 0.3033          | 0.8938   | 0.8941 |
| 0.1805        | 4.6036  | 3600  | 0.2808          | 0.8927   | 0.8938 |
| 0.1805        | 4.8593  | 3800  | 0.3069          | 0.8923   | 0.8936 |
| 0.1529        | 5.1151  | 4000  | 0.3200          | 0.8877   | 0.8891 |
| 0.1529        | 5.3708  | 4200  | 0.3184          | 0.8958   | 0.8966 |
| 0.1529        | 5.6266  | 4400  | 0.3000          | 0.8917   | 0.8927 |
| 0.1529        | 5.8824  | 4600  | 0.3315          | 0.8956   | 0.8958 |
| 0.1295        | 6.1381  | 4800  | 0.3320          | 0.8965   | 0.8974 |
| 0.1295        | 6.3939  | 5000  | 0.3344          | 0.8975   | 0.8980 |
| 0.1295        | 6.6496  | 5200  | 0.3315          | 0.8952   | 0.8948 |
| 0.1295        | 6.9054  | 5400  | 0.3424          | 0.8950   | 0.8951 |
| 0.1123        | 7.1611  | 5600  | 0.3715          | 0.8918   | 0.8929 |
| 0.1123        | 7.4169  | 5800  | 0.3718          | 0.8959   | 0.8963 |
| 0.1123        | 7.6726  | 6000  | 0.3384          | 0.8959   | 0.8965 |
| 0.1123        | 7.9284  | 6200  | 0.3635          | 0.8907   | 0.8920 |
| 0.0958        | 8.1841  | 6400  | 0.3753          | 0.8969   | 0.8979 |
| 0.0958        | 8.4399  | 6600  | 0.4053          | 0.8968   | 0.8969 |
| 0.0958        | 8.6957  | 6800  | 0.3732          | 0.8968   | 0.8975 |
| 0.0958        | 8.9514  | 7000  | 0.4011          | 0.8986   | 0.8987 |
| 0.0816        | 9.2072  | 7200  | 0.4057          | 0.8975   | 0.8980 |
| 0.0816        | 9.4629  | 7400  | 0.4227          | 0.8945   | 0.8956 |
| 0.0816        | 9.7187  | 7600  | 0.4299          | 0.8977   | 0.8979 |
| 0.0816        | 9.9744  | 7800  | 0.4030          | 0.8979   | 0.8984 |
| 0.0715        | 10.2302 | 8000  | 0.4388          | 0.8973   | 0.8978 |
| 0.0715        | 10.4859 | 8200  | 0.4462          | 0.8969   | 0.8968 |
| 0.0715        | 10.7417 | 8400  | 0.4158          | 0.8974   | 0.8975 |
| 0.0635        | 10.9974 | 8600  | 0.4339          | 0.8977   | 0.8983 |
| 0.0635        | 11.2532 | 8800  | 0.4798          | 0.8994   | 0.8998 |
| 0.0635        | 11.5090 | 9000  | 0.4610          | 0.8957   | 0.8964 |
| 0.0635        | 11.7647 | 9200  | 0.4808          | 0.8940   | 0.8947 |
| 0.057         | 12.0205 | 9400  | 0.4701          | 0.8958   | 0.8955 |
| 0.057         | 12.2762 | 9600  | 0.4913          | 0.8945   | 0.8956 |
| 0.057         | 12.5320 | 9800  | 0.5026          | 0.8967   | 0.8973 |
| 0.057         | 12.7877 | 10000 | 0.4739          | 0.8969   | 0.8979 |
| 0.0507        | 13.0435 | 10200 | 0.4741          | 0.8966   | 0.8966 |
| 0.0507        | 13.2992 | 10400 | 0.4962          | 0.8995   | 0.8999 |
| 0.0507        | 13.5550 | 10600 | 0.5051          | 0.8969   | 0.8977 |
| 0.0507        | 13.8107 | 10800 | 0.4855          | 0.8970   | 0.8977 |
| 0.045         | 14.0665 | 11000 | 0.4995          | 0.8983   | 0.8990 |
| 0.045         | 14.3223 | 11200 | 0.5144          | 0.8972   | 0.8976 |
| 0.045         | 14.5780 | 11400 | 0.5057          | 0.8980   | 0.8984 |
| 0.045         | 14.8338 | 11600 | 0.5240          | 0.8995   | 0.9001 |
| 0.0422        | 15.0895 | 11800 | 0.5100          | 0.8991   | 0.8995 |
| 0.0422        | 15.3453 | 12000 | 0.5252          | 0.9000   | 0.9005 |
| 0.0422        | 15.6010 | 12200 | 0.5324          | 0.8981   | 0.8988 |
| 0.0422        | 15.8568 | 12400 | 0.5343          | 0.8997   | 0.9001 |
| 0.0372        | 16.1125 | 12600 | 0.5277          | 0.8990   | 0.8993 |
| 0.0372        | 16.3683 | 12800 | 0.5433          | 0.8992   | 0.8997 |
| 0.0372        | 16.6240 | 13000 | 0.5463          | 0.8986   | 0.8993 |
| 0.0372        | 16.8798 | 13200 | 0.5427          | 0.8980   | 0.8984 |
| 0.0338        | 17.1355 | 13400 | 0.5485          | 0.9001   | 0.9005 |
| 0.0338        | 17.3913 | 13600 | 0.5608          | 0.8966   | 0.8972 |
| 0.0338        | 17.6471 | 13800 | 0.5517          | 0.9000   | 0.9004 |
| 0.0338        | 17.9028 | 14000 | 0.5563          | 0.8997   | 0.9002 |
| 0.0315        | 18.1586 | 14200 | 0.5488          | 0.8995   | 0.8999 |
| 0.0315        | 18.4143 | 14400 | 0.5480          | 0.8992   | 0.8996 |
| 0.0315        | 18.6701 | 14600 | 0.5492          | 0.9002   | 0.9006 |
| 0.0315        | 18.9258 | 14800 | 0.5491          | 0.9014   | 0.9017 |
| 0.0298        | 19.1816 | 15000 | 0.5498          | 0.9005   | 0.9007 |
| 0.0298        | 19.4373 | 15200 | 0.5508          | 0.9007   | 0.9011 |
| 0.0298        | 19.6931 | 15400 | 0.5525          | 0.9002   | 0.9006 |
| 0.0298        | 19.9488 | 15600 | 0.5530          | 0.9002   | 0.9006 |
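
The Accuracy and F1 columns above were presumably produced by a `compute_metrics` callback passed to the `Trainer`. A plausible sketch with the `evaluate` library follows; the F1 averaging mode (`"weighted"` below) is an assumption, since the card does not state how F1 is aggregated across classes:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair from the Trainer's eval loop.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        # "weighted" averaging is an assumption, not documented in the card.
        "f1": f1.compute(predictions=preds, references=labels, average="weighted")["f1"],
    }
```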

### Framework versions

- Transformers 4.42.4
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1