# Intent-classification-BERT-Large-Ashuv2
This model is a fine-tuned version of [google-bert/bert-large-uncased](https://huggingface.co/google-bert/bert-large-uncased) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.7819
- Accuracy: 0.8571
- F1: 0.7838
- Precision: 0.7803
- Recall: 0.7898
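
For reference, a minimal inference sketch using the `transformers` pipeline API is shown below. The repository ID is taken from this card; the example query and the meaning of the returned labels are assumptions, since the intent taxonomy of the training data is not documented.

```python
from transformers import pipeline

# Load the fine-tuned intent classifier from the Hugging Face Hub.
# Adjust the repository ID if the model is hosted under a different name.
classifier = pipeline(
    "text-classification",
    model="Narkantak/Intent-classification-BERT-Large-Ashuv2",
)

# The label names returned here depend on the (undocumented) training dataset.
print(classifier("I want to check my account balance"))
```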
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
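
For reference, the list above maps onto `TrainingArguments` roughly as in the sketch below (Transformers 4.38.x API). The output directory and the evaluation/logging step settings are assumptions; only the listed values are recorded in this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Intent-classification-BERT-Large-Ashuv2",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",          # Adam(W) with betas=(0.9, 0.999), eps=1e-8 (library defaults)
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="steps",  # assumed: the results table logs eval metrics every 10 steps
    eval_steps=10,
    logging_steps=10,
)
```

These arguments would then be passed to a `Trainer` together with the base model, tokenizer, and the (undocumented) training and evaluation datasets.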
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
---|---|---|---|---|---|---|---|
1.4771 | 0.62 | 10 | 1.4650 | 0.5484 | 0.3724 | 0.3262 | 0.4815 |
1.1928 | 1.25 | 20 | 1.2691 | 0.5968 | 0.4620 | 0.4652 | 0.5370 |
0.9911 | 1.88 | 30 | 1.1678 | 0.6129 | 0.4794 | 0.4577 | 0.5556 |
0.7512 | 2.5 | 40 | 0.9525 | 0.6774 | 0.5424 | 0.4873 | 0.6296 |
0.7064 | 3.12 | 50 | 0.8495 | 0.6613 | 0.5319 | 0.4973 | 0.6111 |
0.5449 | 3.75 | 60 | 0.8052 | 0.6774 | 0.5744 | 0.6563 | 0.6349 |
0.4537 | 4.38 | 70 | 0.8058 | 0.7097 | 0.6281 | 0.6737 | 0.6772 |
0.398 | 5.0 | 80 | 0.5916 | 0.7581 | 0.7026 | 0.7035 | 0.7434 |
0.2933 | 5.62 | 90 | 0.8724 | 0.6935 | 0.6113 | 0.6623 | 0.6587 |
0.2834 | 6.25 | 100 | 0.6894 | 0.7419 | 0.7046 | 0.6973 | 0.7376 |
0.263 | 6.88 | 110 | 0.7285 | 0.7419 | 0.7244 | 0.7212 | 0.7556 |
0.181 | 7.5 | 120 | 0.6566 | 0.7419 | 0.7546 | 0.7617 | 0.7670 |
0.1736 | 8.12 | 130 | 1.0789 | 0.7903 | 0.7539 | 0.7372 | 0.7963 |
0.1837 | 8.75 | 140 | 0.8295 | 0.7419 | 0.7244 | 0.7212 | 0.7556 |
0.1696 | 9.38 | 150 | 1.1323 | 0.7581 | 0.7431 | 0.7313 | 0.7741 |
0.1758 | 10.0 | 160 | 0.8965 | 0.7258 | 0.7360 | 0.7516 | 0.7485 |
0.152 | 10.62 | 170 | 1.0633 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1169 | 11.25 | 180 | 1.1007 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1407 | 11.88 | 190 | 1.0659 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0788 | 12.5 | 200 | 1.2677 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.2394 | 13.12 | 210 | 0.8819 | 0.7419 | 0.7645 | 0.7639 | 0.7744 |
0.114 | 13.75 | 220 | 1.1865 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1454 | 14.38 | 230 | 1.3365 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1023 | 15.0 | 240 | 1.2334 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.132 | 15.62 | 250 | 1.3341 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1199 | 16.25 | 260 | 1.1251 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1161 | 16.88 | 270 | 1.2843 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0924 | 17.5 | 280 | 1.4196 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1167 | 18.12 | 290 | 1.2224 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1063 | 18.75 | 300 | 1.2558 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.1121 | 19.38 | 310 | 1.4312 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1198 | 20.0 | 320 | 1.4862 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1152 | 20.62 | 330 | 1.4057 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0827 | 21.25 | 340 | 1.4738 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1257 | 21.88 | 350 | 1.4706 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1021 | 22.5 | 360 | 1.3139 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1244 | 23.12 | 370 | 1.4685 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1173 | 23.75 | 380 | 1.5196 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0951 | 24.38 | 390 | 1.5036 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1069 | 25.0 | 400 | 1.5056 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1051 | 25.62 | 410 | 1.5297 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.1073 | 26.25 | 420 | 1.5805 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0913 | 26.88 | 430 | 1.6029 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0826 | 27.5 | 440 | 1.6013 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0926 | 28.12 | 450 | 1.5705 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0981 | 28.75 | 460 | 1.5954 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0823 | 29.38 | 470 | 1.6280 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1233 | 30.0 | 480 | 1.6143 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.098 | 30.62 | 490 | 1.5885 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.072 | 31.25 | 500 | 1.5868 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1248 | 31.88 | 510 | 1.6264 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1007 | 32.5 | 520 | 1.6531 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0829 | 33.12 | 530 | 1.6675 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0892 | 33.75 | 540 | 1.6814 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1048 | 34.38 | 550 | 1.6926 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1189 | 35.0 | 560 | 1.6922 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0904 | 35.62 | 570 | 1.6460 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.088 | 36.25 | 580 | 1.6609 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0902 | 36.88 | 590 | 1.7090 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1151 | 37.5 | 600 | 1.7120 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0665 | 38.12 | 610 | 1.7139 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1057 | 38.75 | 620 | 1.7650 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0926 | 39.38 | 630 | 1.7536 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1225 | 40.0 | 640 | 1.6866 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.073 | 40.62 | 650 | 1.5809 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1006 | 41.25 | 660 | 1.6110 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.096 | 41.88 | 670 | 1.6937 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0824 | 42.5 | 680 | 1.7297 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0803 | 43.12 | 690 | 1.7237 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1029 | 43.75 | 700 | 1.7103 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0923 | 44.38 | 710 | 1.7442 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0939 | 45.0 | 720 | 1.7685 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0894 | 45.62 | 730 | 1.7926 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0954 | 46.25 | 740 | 1.7750 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.0947 | 46.88 | 750 | 1.7498 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0621 | 47.5 | 760 | 1.7799 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1132 | 48.12 | 770 | 1.7738 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1054 | 48.75 | 780 | 1.7489 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0764 | 49.38 | 790 | 1.7737 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1055 | 50.0 | 800 | 1.7924 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0754 | 50.62 | 810 | 1.7958 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.112 | 51.25 | 820 | 1.7691 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.0937 | 51.88 | 830 | 1.7532 | 0.7581 | 0.7451 | 0.7394 | 0.7688 |
0.0865 | 52.5 | 840 | 1.7491 | 0.7581 | 0.7451 | 0.7394 | 0.7688 |
0.0942 | 53.12 | 850 | 1.7697 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0833 | 53.75 | 860 | 1.8022 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0979 | 54.38 | 870 | 1.8034 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0949 | 55.0 | 880 | 1.7938 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0836 | 55.62 | 890 | 1.7926 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0988 | 56.25 | 900 | 1.7862 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0872 | 56.88 | 910 | 1.7967 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0891 | 57.5 | 920 | 1.8087 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0836 | 58.12 | 930 | 1.8217 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.085 | 58.75 | 940 | 1.8281 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0917 | 59.38 | 950 | 1.8320 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.0931 | 60.0 | 960 | 1.8480 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.091 | 60.62 | 970 | 1.8438 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0782 | 61.25 | 980 | 1.8527 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1032 | 61.88 | 990 | 1.8643 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.1105 | 62.5 | 1000 | 1.8522 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0732 | 63.12 | 1010 | 1.8443 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0879 | 63.75 | 1020 | 1.8477 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0991 | 64.38 | 1030 | 1.8533 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0827 | 65.0 | 1040 | 1.8358 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0942 | 65.62 | 1050 | 1.8442 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0935 | 66.25 | 1060 | 1.8537 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0818 | 66.88 | 1070 | 1.8601 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0993 | 67.5 | 1080 | 1.8696 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1181 | 68.12 | 1090 | 1.8594 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1096 | 68.75 | 1100 | 1.8438 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0545 | 69.38 | 1110 | 1.8344 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0994 | 70.0 | 1120 | 1.8409 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.0905 | 70.62 | 1130 | 1.8529 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1115 | 71.25 | 1140 | 1.8463 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0775 | 71.88 | 1150 | 1.8440 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1055 | 72.5 | 1160 | 1.8457 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.074 | 73.12 | 1170 | 1.8525 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1023 | 73.75 | 1180 | 1.8586 | 0.7258 | 0.7333 | 0.7325 | 0.7466 |
0.1012 | 74.38 | 1190 | 1.8704 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0814 | 75.0 | 1200 | 1.8778 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0786 | 75.62 | 1210 | 1.8753 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0852 | 76.25 | 1220 | 1.8770 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.112 | 76.88 | 1230 | 1.8797 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0876 | 77.5 | 1240 | 1.8838 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0779 | 78.12 | 1250 | 1.8866 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0949 | 78.75 | 1260 | 1.8897 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0946 | 79.38 | 1270 | 1.8907 | 0.7581 | 0.7549 | 0.7397 | 0.7815 |
0.0812 | 80.0 | 1280 | 1.8892 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0844 | 80.62 | 1290 | 1.8903 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0977 | 81.25 | 1300 | 1.8894 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.0787 | 81.88 | 1310 | 1.8935 | 0.7742 | 0.7607 | 0.7431 | 0.7926 |
0.1164 | 82.5 | 1320 | 1.8920 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0752 | 83.12 | 1330 | 1.8886 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0898 | 83.75 | 1340 | 1.8896 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0983 | 84.38 | 1350 | 1.8847 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.095 | 85.0 | 1360 | 1.8840 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0727 | 85.62 | 1370 | 1.8853 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1182 | 86.25 | 1380 | 1.8857 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0681 | 86.88 | 1390 | 1.8829 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1079 | 87.5 | 1400 | 1.8880 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0897 | 88.12 | 1410 | 1.8882 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0675 | 88.75 | 1420 | 1.8889 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1091 | 89.38 | 1430 | 1.8894 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0831 | 90.0 | 1440 | 1.8917 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0815 | 90.62 | 1450 | 1.8949 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0903 | 91.25 | 1460 | 1.8959 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0937 | 91.88 | 1470 | 1.9001 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0797 | 92.5 | 1480 | 1.9006 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1141 | 93.12 | 1490 | 1.9017 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0696 | 93.75 | 1500 | 1.9018 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0979 | 94.38 | 1510 | 1.9038 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0846 | 95.0 | 1520 | 1.9055 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.078 | 95.62 | 1530 | 1.9060 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0947 | 96.25 | 1540 | 1.9067 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0823 | 96.88 | 1550 | 1.9081 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1367 | 97.5 | 1560 | 1.9081 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0597 | 98.12 | 1570 | 1.9085 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.1036 | 98.75 | 1580 | 1.9086 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0826 | 99.38 | 1590 | 1.9089 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
0.0917 | 100.0 | 1600 | 1.9090 | 0.7419 | 0.7487 | 0.7361 | 0.7704 |
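
The per-step metrics in the table were presumably produced by a `compute_metrics` callback; one way to compute the same four metrics with the `evaluate` library is sketched below. The averaging mode (`macro`) is an assumption, as the card does not record which average was used for F1, precision, and recall.

```python
import numpy as np
import evaluate

# Load the four metrics reported in the table above.
accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")
precision = evaluate.load("precision")
recall = evaluate.load("recall")

def compute_metrics(eval_pred):
    """Convert logits to class predictions and score them against the labels."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        "f1": f1.compute(predictions=preds, references=labels, average="macro")["f1"],
        "precision": precision.compute(predictions=preds, references=labels, average="macro")["precision"],
        "recall": recall.compute(predictions=preds, references=labels, average="macro")["recall"],
    }
```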
### Framework versions
- Transformers 4.38.2
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.2