
ner-coin-v3

This model is a fine-tuned version of microsoft/Multilingual-MiniLM-L12-H384 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0147
  • Precision: 0.9941
  • Recall: 0.9931
  • F1: 0.9936
  • Accuracy: 0.9978
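
The headline precision, recall, and F1 above are the kind of entity-level scores produced by sequence-labelling evaluation, typically via the seqeval package when fine-tuning token-classification models with the Hugging Face Trainer. The card does not state which library or label set was used, so the snippet below is only an illustrative sketch with placeholder tags, not the actual evaluation code:

```python
# Hedged sketch: entity-level precision/recall/F1/accuracy as seqeval computes them.
# The real label set of ner-coin-v3 is undocumented; "B-COIN"/"I-COIN" are placeholders.
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

y_true = [["O", "B-COIN", "I-COIN", "O", "O"]]   # hypothetical gold tags
y_pred = [["O", "B-COIN", "I-COIN", "O", "O"]]   # hypothetical predictions

print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("f1:       ", f1_score(y_true, y_pred))
print("accuracy: ", accuracy_score(y_true, y_pred))
```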

Model description

More information needed

Intended uses & limitations

More information needed
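
No usage example is provided by the author. Since the card reports token-classification metrics, the checkpoint can presumably be loaded with the standard transformers token-classification pipeline. The snippet below is a minimal sketch assuming the model is published as thanhdath/ner-coin-v3 (the repository this card belongs to); the example sentence is illustrative only, and the entity labels returned are whatever the checkpoint's config defines:

```python
# Minimal sketch: run the checkpoint through the standard token-classification pipeline.
# "thanhdath/ner-coin-v3" is assumed to be the hosted repository id.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="thanhdath/ner-coin-v3",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

text = "Bitcoin and Ethereum rallied after the exchange listing was announced."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 4))
```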

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
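
For reference, a hedged sketch of how these values map onto the Hugging Face TrainingArguments API is shown below. The dataset, tokenizer, and model wiring are not documented in this card, so this is not the author's actual training script, only the hyperparameter mapping:

```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
# output_dir and eval_strategy are assumptions; everything else mirrors the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ner-coin-v3",          # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",             # the results table reports metrics once per epoch
)
```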

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 52 | 0.1916 | 0.9775 | 0.9777 | 0.9776 | 0.9935 |
| No log | 2.0 | 104 | 0.1443 | 0.9815 | 0.9823 | 0.9819 | 0.9948 |
| No log | 3.0 | 156 | 0.1142 | 0.9844 | 0.9844 | 0.9844 | 0.9953 |
| No log | 4.0 | 208 | 0.0938 | 0.9852 | 0.9834 | 0.9843 | 0.9952 |
| No log | 5.0 | 260 | 0.0764 | 0.9870 | 0.9870 | 0.9870 | 0.9959 |
| No log | 6.0 | 312 | 0.0649 | 0.9876 | 0.9874 | 0.9875 | 0.9962 |
| No log | 7.0 | 364 | 0.0552 | 0.9903 | 0.9893 | 0.9898 | 0.9968 |
| No log | 8.0 | 416 | 0.0500 | 0.9884 | 0.9876 | 0.9880 | 0.9962 |
| No log | 9.0 | 468 | 0.0443 | 0.9882 | 0.9878 | 0.9880 | 0.9963 |
| 0.1249 | 10.0 | 520 | 0.0402 | 0.9882 | 0.9880 | 0.9881 | 0.9962 |
| 0.1249 | 11.0 | 572 | 0.0351 | 0.9901 | 0.9884 | 0.9893 | 0.9967 |
| 0.1249 | 12.0 | 624 | 0.0315 | 0.9909 | 0.9882 | 0.9896 | 0.9969 |
| 0.1249 | 13.0 | 676 | 0.0307 | 0.9897 | 0.9888 | 0.9893 | 0.9966 |
| 0.1249 | 14.0 | 728 | 0.0268 | 0.9897 | 0.9895 | 0.9896 | 0.9967 |
| 0.1249 | 15.0 | 780 | 0.0277 | 0.9897 | 0.9861 | 0.9879 | 0.9962 |
| 0.1249 | 16.0 | 832 | 0.0242 | 0.9916 | 0.9893 | 0.9904 | 0.9971 |
| 0.1249 | 17.0 | 884 | 0.0214 | 0.9920 | 0.9901 | 0.9910 | 0.9973 |
| 0.1249 | 18.0 | 936 | 0.0216 | 0.9920 | 0.9895 | 0.9907 | 0.9972 |
| 0.1249 | 19.0 | 988 | 0.0211 | 0.9922 | 0.9899 | 0.9910 | 0.9972 |
| 0.0243 | 20.0 | 1040 | 0.0183 | 0.9930 | 0.9912 | 0.9921 | 0.9975 |
| 0.0243 | 21.0 | 1092 | 0.0161 | 0.9922 | 0.9907 | 0.9915 | 0.9974 |
| 0.0243 | 22.0 | 1144 | 0.0169 | 0.9935 | 0.9914 | 0.9924 | 0.9975 |
| 0.0243 | 23.0 | 1196 | 0.0176 | 0.9914 | 0.9897 | 0.9905 | 0.9972 |
| 0.0243 | 24.0 | 1248 | 0.0160 | 0.9918 | 0.9905 | 0.9912 | 0.9973 |
| 0.0243 | 25.0 | 1300 | 0.0150 | 0.9928 | 0.9918 | 0.9923 | 0.9974 |
| 0.0243 | 26.0 | 1352 | 0.0143 | 0.9935 | 0.9918 | 0.9926 | 0.9977 |
| 0.0243 | 27.0 | 1404 | 0.0140 | 0.9920 | 0.9918 | 0.9919 | 0.9973 |
| 0.0243 | 28.0 | 1456 | 0.0152 | 0.9909 | 0.9905 | 0.9907 | 0.9970 |
| 0.0111 | 29.0 | 1508 | 0.0147 | 0.9916 | 0.9907 | 0.9912 | 0.9972 |
| 0.0111 | 30.0 | 1560 | 0.0146 | 0.9920 | 0.9912 | 0.9916 | 0.9973 |
| 0.0111 | 31.0 | 1612 | 0.0143 | 0.9912 | 0.9905 | 0.9908 | 0.9970 |
| 0.0111 | 32.0 | 1664 | 0.0139 | 0.9907 | 0.9905 | 0.9906 | 0.9970 |
| 0.0111 | 33.0 | 1716 | 0.0144 | 0.9912 | 0.9912 | 0.9912 | 0.9970 |
| 0.0111 | 34.0 | 1768 | 0.0137 | 0.9922 | 0.9916 | 0.9919 | 0.9973 |
| 0.0111 | 35.0 | 1820 | 0.0139 | 0.9937 | 0.9926 | 0.9932 | 0.9977 |
| 0.0111 | 36.0 | 1872 | 0.0146 | 0.9912 | 0.9914 | 0.9913 | 0.9970 |
| 0.0111 | 37.0 | 1924 | 0.0138 | 0.9928 | 0.9920 | 0.9924 | 0.9975 |
| 0.0111 | 38.0 | 1976 | 0.0125 | 0.9933 | 0.9928 | 0.9931 | 0.9977 |
| 0.0062 | 39.0 | 2028 | 0.0138 | 0.9922 | 0.9914 | 0.9918 | 0.9973 |
| 0.0062 | 40.0 | 2080 | 0.0131 | 0.9918 | 0.9910 | 0.9914 | 0.9972 |
| 0.0062 | 41.0 | 2132 | 0.0141 | 0.9914 | 0.9912 | 0.9913 | 0.9971 |
| 0.0062 | 42.0 | 2184 | 0.0137 | 0.9930 | 0.9922 | 0.9926 | 0.9975 |
| 0.0062 | 43.0 | 2236 | 0.0139 | 0.9920 | 0.9910 | 0.9915 | 0.9973 |
| 0.0062 | 44.0 | 2288 | 0.0146 | 0.9924 | 0.9920 | 0.9922 | 0.9974 |
| 0.0062 | 45.0 | 2340 | 0.0134 | 0.9933 | 0.9924 | 0.9928 | 0.9977 |
| 0.0062 | 46.0 | 2392 | 0.0149 | 0.9935 | 0.9924 | 0.9929 | 0.9977 |
| 0.0062 | 47.0 | 2444 | 0.0124 | 0.9933 | 0.9926 | 0.9929 | 0.9977 |
| 0.0062 | 48.0 | 2496 | 0.0125 | 0.9931 | 0.9924 | 0.9927 | 0.9976 |
| 0.0037 | 49.0 | 2548 | 0.0130 | 0.9916 | 0.9910 | 0.9913 | 0.9972 |
| 0.0037 | 50.0 | 2600 | 0.0129 | 0.9928 | 0.9924 | 0.9926 | 0.9975 |
| 0.0037 | 51.0 | 2652 | 0.0128 | 0.9935 | 0.9926 | 0.9931 | 0.9975 |
| 0.0037 | 52.0 | 2704 | 0.0136 | 0.9922 | 0.9918 | 0.9920 | 0.9973 |
| 0.0037 | 53.0 | 2756 | 0.0141 | 0.9926 | 0.9916 | 0.9921 | 0.9974 |
| 0.0037 | 54.0 | 2808 | 0.0135 | 0.9933 | 0.9926 | 0.9929 | 0.9978 |
| 0.0037 | 55.0 | 2860 | 0.0147 | 0.9935 | 0.9922 | 0.9928 | 0.9975 |
| 0.0037 | 56.0 | 2912 | 0.0142 | 0.9935 | 0.9926 | 0.9931 | 0.9977 |
| 0.0037 | 57.0 | 2964 | 0.0139 | 0.9931 | 0.9926 | 0.9928 | 0.9975 |
| 0.0027 | 58.0 | 3016 | 0.0136 | 0.9935 | 0.9928 | 0.9932 | 0.9977 |
| 0.0027 | 59.0 | 3068 | 0.0143 | 0.9935 | 0.9926 | 0.9931 | 0.9976 |
| 0.0027 | 60.0 | 3120 | 0.0141 | 0.9933 | 0.9926 | 0.9929 | 0.9975 |
| 0.0027 | 61.0 | 3172 | 0.0129 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0027 | 62.0 | 3224 | 0.0137 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0027 | 63.0 | 3276 | 0.0136 | 0.9935 | 0.9924 | 0.9929 | 0.9977 |
| 0.0027 | 64.0 | 3328 | 0.0141 | 0.9933 | 0.9926 | 0.9929 | 0.9977 |
| 0.0027 | 65.0 | 3380 | 0.0141 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0027 | 66.0 | 3432 | 0.0137 | 0.9939 | 0.9928 | 0.9934 | 0.9978 |
| 0.0027 | 67.0 | 3484 | 0.0145 | 0.9924 | 0.9916 | 0.9920 | 0.9973 |
| 0.0019 | 68.0 | 3536 | 0.0150 | 0.9937 | 0.9924 | 0.9931 | 0.9977 |
| 0.0019 | 69.0 | 3588 | 0.0152 | 0.9930 | 0.9920 | 0.9925 | 0.9975 |
| 0.0019 | 70.0 | 3640 | 0.0149 | 0.9926 | 0.9918 | 0.9922 | 0.9973 |
| 0.0019 | 71.0 | 3692 | 0.0143 | 0.9935 | 0.9924 | 0.9929 | 0.9976 |
| 0.0019 | 72.0 | 3744 | 0.0152 | 0.9937 | 0.9924 | 0.9931 | 0.9977 |
| 0.0019 | 73.0 | 3796 | 0.0149 | 0.9933 | 0.9922 | 0.9927 | 0.9975 |
| 0.0019 | 74.0 | 3848 | 0.0158 | 0.9935 | 0.9922 | 0.9928 | 0.9976 |
| 0.0019 | 75.0 | 3900 | 0.0153 | 0.9928 | 0.9918 | 0.9923 | 0.9975 |
| 0.0019 | 76.0 | 3952 | 0.0154 | 0.9928 | 0.9920 | 0.9924 | 0.9975 |
| 0.0015 | 77.0 | 4004 | 0.0145 | 0.9937 | 0.9928 | 0.9933 | 0.9977 |
| 0.0015 | 78.0 | 4056 | 0.0158 | 0.9928 | 0.9920 | 0.9924 | 0.9975 |
| 0.0015 | 79.0 | 4108 | 0.0161 | 0.9933 | 0.9922 | 0.9927 | 0.9975 |
| 0.0015 | 80.0 | 4160 | 0.0155 | 0.9935 | 0.9924 | 0.9929 | 0.9976 |
| 0.0015 | 81.0 | 4212 | 0.0157 | 0.9933 | 0.9922 | 0.9927 | 0.9975 |
| 0.0015 | 82.0 | 4264 | 0.0155 | 0.9933 | 0.9922 | 0.9927 | 0.9975 |
| 0.0015 | 83.0 | 4316 | 0.0144 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0015 | 84.0 | 4368 | 0.0142 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0015 | 85.0 | 4420 | 0.0145 | 0.9941 | 0.9931 | 0.9936 | 0.9978 |
| 0.0015 | 86.0 | 4472 | 0.0151 | 0.9935 | 0.9924 | 0.9929 | 0.9977 |
| 0.0013 | 87.0 | 4524 | 0.0143 | 0.9935 | 0.9924 | 0.9929 | 0.9977 |
| 0.0013 | 88.0 | 4576 | 0.0145 | 0.9933 | 0.9924 | 0.9928 | 0.9976 |
| 0.0013 | 89.0 | 4628 | 0.0142 | 0.9935 | 0.9926 | 0.9931 | 0.9976 |
| 0.0013 | 90.0 | 4680 | 0.0142 | 0.9941 | 0.9933 | 0.9937 | 0.9978 |
| 0.0013 | 91.0 | 4732 | 0.0145 | 0.9941 | 0.9931 | 0.9936 | 0.9978 |
| 0.0013 | 92.0 | 4784 | 0.0146 | 0.9941 | 0.9931 | 0.9936 | 0.9978 |
| 0.0013 | 93.0 | 4836 | 0.0146 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0013 | 94.0 | 4888 | 0.0147 | 0.9941 | 0.9931 | 0.9936 | 0.9978 |
| 0.0013 | 95.0 | 4940 | 0.0148 | 0.9941 | 0.9931 | 0.9936 | 0.9978 |
| 0.0013 | 96.0 | 4992 | 0.0147 | 0.9941 | 0.9931 | 0.9936 | 0.9978 |
| 0.0011 | 97.0 | 5044 | 0.0147 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0011 | 98.0 | 5096 | 0.0147 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0011 | 99.0 | 5148 | 0.0147 | 0.9939 | 0.9931 | 0.9935 | 0.9978 |
| 0.0011 | 100.0 | 5200 | 0.0147 | 0.9941 | 0.9931 | 0.9936 | 0.9978 |

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.14.6
  • Tokenizers 0.19.1
