---
license: apache-2.0
tags:
  - generated_from_trainer
base_model: google-bert/bert-base-multilingual-cased
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: bert-base-intent-classification-cs-th
    results: []
datasets:
  - Porameht/customer-support-th-26.9k
language:
  - th
library_name: transformers
---

# bert-base-intent-classification-cs-th

This model is a fine-tuned version of [google-bert/bert-base-multilingual-cased](https://huggingface.co/google-bert/bert-base-multilingual-cased) on the [Porameht/customer-support-th-26.9k](https://huggingface.co/datasets/Porameht/customer-support-th-26.9k) dataset.

🧠 Given a sentence, it can recognize whether a customer wants to cancel an order, among other customer-support intents.

It achieves the following results on the evaluation set:

- Loss: 0.0408
- Accuracy: 0.9936
- F1: 0.9936
- Precision: 0.9937
- Recall: 0.9936
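
For a quick check, the model can be used with the 🤗 Transformers `text-classification` pipeline. This is a minimal sketch assuming the checkpoint is published under the repo id `Porameht/bert-base-intent-classification-cs-th` (author plus the model name above); the returned labels come from the dataset's intent classes.

```python
from transformers import pipeline

# Assumed repo id: card author "Porameht" + the model name on this card.
classifier = pipeline(
    "text-classification",
    model="Porameht/bert-base-intent-classification-cs-th",
)

# Thai: "I want to cancel my order"
print(classifier("ฉันต้องการยกเลิกคำสั่งซื้อ"))
```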

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
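
The dataset named in the metadata can be inspected with 🤗 Datasets, as sketched below; the split layout is an assumption, since it is not documented here.

```python
from datasets import load_dataset

# Dataset repo id taken from this card's metadata.
dataset = load_dataset("Porameht/customer-support-th-26.9k")
print(dataset)                # show available splits and columns
print(dataset["train"][0])    # assumes a "train" split exists
```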

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 3
- mixed_precision_training: Native AMP
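
For readers who want to reproduce the setup, these values map onto `transformers.TrainingArguments` roughly as follows. This is a sketch, not the original training script; `output_dir` and the evaluation cadence are assumptions. The Adam betas and epsilon listed above are the Trainer defaults, so they need not be set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-intent-classification-cs-th",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=3,
    fp16=True,  # Native AMP mixed-precision training
)
```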

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 3.2835 | 0.0595 | 50 | 3.1041 | 0.1203 | 0.0504 | 0.0632 | 0.1210 |
| 2.6752 | 0.1190 | 100 | 1.9646 | 0.5387 | 0.4737 | 0.6298 | 0.5426 |
| 1.4751 | 0.1786 | 150 | 0.9447 | 0.8190 | 0.7929 | 0.8271 | 0.8188 |
| 0.7571 | 0.2381 | 200 | 0.5163 | 0.8952 | 0.8826 | 0.8812 | 0.8955 |
| 0.4849 | 0.2976 | 250 | 0.3539 | 0.9003 | 0.8905 | 0.8926 | 0.9021 |
| 0.3401 | 0.3571 | 300 | 0.2883 | 0.9160 | 0.9037 | 0.9012 | 0.9165 |
| 0.2533 | 0.4167 | 350 | 0.1735 | 0.9431 | 0.9322 | 0.9266 | 0.9443 |
| 0.177 | 0.4762 | 400 | 0.1326 | 0.9665 | 0.9670 | 0.9676 | 0.9671 |
| 0.119 | 0.5357 | 450 | 0.1527 | 0.9592 | 0.9582 | 0.9699 | 0.9600 |
| 0.1183 | 0.5952 | 500 | 0.0886 | 0.9839 | 0.9841 | 0.9841 | 0.9842 |
| 0.1065 | 0.6548 | 550 | 0.0829 | 0.9844 | 0.9844 | 0.9847 | 0.9844 |
| 0.1006 | 0.7143 | 600 | 0.0686 | 0.9869 | 0.9869 | 0.9872 | 0.9869 |
| 0.1096 | 0.7738 | 650 | 0.1071 | 0.9789 | 0.9791 | 0.9800 | 0.9788 |
| 0.1392 | 0.8333 | 700 | 0.0939 | 0.9804 | 0.9804 | 0.9808 | 0.9803 |
| 0.1067 | 0.8929 | 750 | 0.1077 | 0.9786 | 0.9790 | 0.9802 | 0.9786 |
| 0.0779 | 0.9524 | 800 | 0.0657 | 0.9878 | 0.9878 | 0.9879 | 0.9879 |
| 0.0626 | 1.0119 | 850 | 0.0750 | 0.9851 | 0.9853 | 0.9856 | 0.9852 |
| 0.0419 | 1.0714 | 900 | 0.0641 | 0.9893 | 0.9893 | 0.9895 | 0.9893 |
| 0.0373 | 1.1310 | 950 | 0.0664 | 0.9891 | 0.9891 | 0.9893 | 0.9890 |
| 0.035 | 1.1905 | 1000 | 0.0575 | 0.9906 | 0.9906 | 0.9907 | 0.9906 |
| 0.036 | 1.25 | 1050 | 0.0601 | 0.9891 | 0.9893 | 0.9895 | 0.9892 |
| 0.0765 | 1.3095 | 1100 | 0.0682 | 0.9875 | 0.9875 | 0.9877 | 0.9874 |
| 0.0637 | 1.3690 | 1150 | 0.0587 | 0.9906 | 0.9906 | 0.9908 | 0.9906 |
| 0.0241 | 1.4286 | 1200 | 0.0528 | 0.9906 | 0.9907 | 0.9909 | 0.9905 |
| 0.0608 | 1.4881 | 1250 | 0.0458 | 0.9920 | 0.9920 | 0.9922 | 0.9919 |
| 0.0199 | 1.5476 | 1300 | 0.0508 | 0.9914 | 0.9914 | 0.9915 | 0.9914 |
| 0.0663 | 1.6071 | 1350 | 0.0461 | 0.9911 | 0.9910 | 0.9911 | 0.9910 |
| 0.0495 | 1.6667 | 1400 | 0.0525 | 0.9906 | 0.9907 | 0.9908 | 0.9906 |
| 0.0336 | 1.7262 | 1450 | 0.0478 | 0.9915 | 0.9916 | 0.9917 | 0.9915 |
| 0.0249 | 1.7857 | 1500 | 0.0578 | 0.9891 | 0.9891 | 0.9892 | 0.9891 |
| 0.0287 | 1.8452 | 1550 | 0.0547 | 0.9908 | 0.9908 | 0.9909 | 0.9908 |
| 0.0607 | 1.9048 | 1600 | 0.0395 | 0.9929 | 0.9929 | 0.9930 | 0.9928 |
| 0.0268 | 1.9643 | 1650 | 0.0529 | 0.9897 | 0.9898 | 0.9902 | 0.9897 |
| 0.013 | 2.0238 | 1700 | 0.0455 | 0.9924 | 0.9925 | 0.9926 | 0.9925 |
| 0.0106 | 2.0833 | 1750 | 0.0419 | 0.9927 | 0.9928 | 0.9928 | 0.9927 |
| 0.007 | 2.1429 | 1800 | 0.0461 | 0.9920 | 0.9920 | 0.9921 | 0.9919 |
| 0.0502 | 2.2024 | 1850 | 0.0433 | 0.9929 | 0.9929 | 0.9930 | 0.9929 |
| 0.017 | 2.2619 | 1900 | 0.0440 | 0.9926 | 0.9926 | 0.9927 | 0.9926 |
| 0.0119 | 2.3214 | 1950 | 0.0403 | 0.9927 | 0.9928 | 0.9928 | 0.9927 |
| 0.0063 | 2.3810 | 2000 | 0.0391 | 0.9930 | 0.9930 | 0.9931 | 0.9930 |
| 0.0103 | 2.4405 | 2050 | 0.0412 | 0.9929 | 0.9929 | 0.9930 | 0.9929 |
| 0.012 | 2.5 | 2100 | 0.0420 | 0.9929 | 0.9929 | 0.9930 | 0.9929 |
| 0.0233 | 2.5595 | 2150 | 0.0407 | 0.9927 | 0.9928 | 0.9928 | 0.9928 |
| 0.0169 | 2.6190 | 2200 | 0.0397 | 0.9930 | 0.9930 | 0.9931 | 0.9930 |
| 0.0281 | 2.6786 | 2250 | 0.0367 | 0.9933 | 0.9933 | 0.9934 | 0.9933 |
| 0.0117 | 2.7381 | 2300 | 0.0360 | 0.9933 | 0.9933 | 0.9934 | 0.9933 |
| 0.0225 | 2.7976 | 2350 | 0.0354 | 0.9936 | 0.9936 | 0.9937 | 0.9936 |
| 0.0078 | 2.8571 | 2400 | 0.0357 | 0.9936 | 0.9936 | 0.9937 | 0.9936 |
| 0.0164 | 2.9167 | 2450 | 0.0346 | 0.9939 | 0.9939 | 0.9940 | 0.9939 |
| 0.0016 | 2.9762 | 2500 | 0.0345 | 0.9939 | 0.9939 | 0.9940 | 0.9939 |
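
The accuracy, F1, precision, and recall columns are consistent with a `Trainer` `compute_metrics` callback along these lines. This is a sketch only; the averaging mode (here `"weighted"`) is an assumption, as the card does not state it.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is (logits, labels) from the Trainer evaluation loop.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"  # assumed averaging mode
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```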

### Framework versions

- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1