nerui-seq_bn-rf64-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0395
  • Location Precision: 0.89
  • Location Recall: 0.9570
  • Location F1: 0.9223
  • Location Number: 93
  • Organization Precision: 0.9157
  • Organization Recall: 0.9157
  • Organization F1: 0.9157
  • Organization Number: 166
  • Person Precision: 0.9583
  • Person Recall: 0.9718
  • Person F1: 0.9650
  • Person Number: 142
  • Overall Precision: 0.9244
  • Overall Recall: 0.9451
  • Overall F1: 0.9346
  • Overall Accuracy: 0.9871
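
The card does not document how to load this checkpoint, and the repository name (`seq_bn`, `rf64`) suggests a sequential bottleneck adapter with reduction factor 64 trained with the `adapters` library rather than a full fine-tuned model. Below is a minimal inference sketch under that assumption; the example sentence and the use of `AutoAdapterModel` are illustrative, not confirmed by the card.

```python
# Hedged inference sketch. Assumes this repo hosts a seq_bn adapter (plus a
# tagging head) for indolem/indobert-base-uncased, trained with the
# `adapters` library; the label set (PER/ORG/LOC) is inferred from the
# metrics above.
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")
model = AutoAdapterModel.from_pretrained("indolem/indobert-base-uncased")

# Load the adapter weights from this repository and activate them.
adapter_name = model.load_adapter("apwic/nerui-seq_bn-rf64-2")
model.set_active_adapters(adapter_name)

# Illustrative Indonesian sentence containing a person and a location.
text = "Joko Widodo mengunjungi Jakarta."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Predicted label ids per token; map through model.config.id2label if the
# loaded head provides a label mapping.
print(outputs.logits.argmax(dim=-1))
```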

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
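
For readers reproducing the setup, the listed values map directly onto `transformers.TrainingArguments`. The sketch below is a hedged reconstruction: argument names follow the Transformers 4.40 API, the `output_dir` is a placeholder, and everything not listed above (data collation, adapter setup, etc.) is unknown.

```python
# Hedged reconstruction of the training configuration from the listed
# hyperparameters; only these values are documented in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nerui-seq_bn-rf64-2",   # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
)
```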

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.0507 | 1.0 | 96 | 0.6532 | 0.0 | 0.0 | 0.0 | 93 | 0.0 | 0.0 | 0.0 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.0 | 0.0 | 0.0 | 0.8343 |
| 0.5943 | 2.0 | 192 | 0.4513 | 0.0 | 0.0 | 0.0 | 93 | 0.3333 | 0.0422 | 0.0749 | 166 | 0.3256 | 0.0986 | 0.1514 | 142 | 0.3182 | 0.0524 | 0.0899 | 0.8428 |
| 0.4265 | 3.0 | 288 | 0.3261 | 0.2889 | 0.1398 | 0.1884 | 93 | 0.3659 | 0.4518 | 0.4043 | 166 | 0.3472 | 0.5282 | 0.4190 | 142 | 0.3498 | 0.4065 | 0.3760 | 0.8955 |
| 0.3456 | 4.0 | 384 | 0.2801 | 0.3684 | 0.3011 | 0.3314 | 93 | 0.4170 | 0.5602 | 0.4781 | 166 | 0.3909 | 0.6056 | 0.4751 | 142 | 0.3988 | 0.5162 | 0.45 | 0.9155 |
| 0.3004 | 5.0 | 480 | 0.2395 | 0.4091 | 0.3871 | 0.3978 | 93 | 0.4957 | 0.6928 | 0.5779 | 166 | 0.5160 | 0.6831 | 0.5879 | 142 | 0.4882 | 0.6185 | 0.5457 | 0.9336 |
| 0.2668 | 6.0 | 576 | 0.2074 | 0.47 | 0.5054 | 0.4870 | 93 | 0.5674 | 0.7349 | 0.6404 | 166 | 0.6369 | 0.8028 | 0.7103 | 142 | 0.5729 | 0.7057 | 0.6324 | 0.9457 |
| 0.2298 | 7.0 | 672 | 0.1720 | 0.5048 | 0.5699 | 0.5354 | 93 | 0.6275 | 0.7711 | 0.6919 | 166 | 0.7041 | 0.8380 | 0.7653 | 142 | 0.6276 | 0.7481 | 0.6826 | 0.9561 |
| 0.1899 | 8.0 | 768 | 0.1367 | 0.5865 | 0.6559 | 0.6193 | 93 | 0.7016 | 0.8072 | 0.7507 | 166 | 0.8323 | 0.9085 | 0.8687 | 142 | 0.72 | 0.8080 | 0.7615 | 0.9641 |
| 0.1641 | 9.0 | 864 | 0.1154 | 0.6486 | 0.7742 | 0.7059 | 93 | 0.7705 | 0.8494 | 0.8080 | 166 | 0.8693 | 0.9366 | 0.9017 | 142 | 0.7740 | 0.8628 | 0.8160 | 0.9706 |
| 0.1436 | 10.0 | 960 | 0.0999 | 0.7525 | 0.8172 | 0.7835 | 93 | 0.7912 | 0.8675 | 0.8276 | 166 | 0.9128 | 0.9577 | 0.9347 | 142 | 0.8241 | 0.8878 | 0.8547 | 0.9753 |
| 0.1282 | 11.0 | 1056 | 0.0896 | 0.7596 | 0.8495 | 0.8020 | 93 | 0.7956 | 0.8675 | 0.8300 | 166 | 0.9128 | 0.9577 | 0.9347 | 142 | 0.8272 | 0.8953 | 0.8599 | 0.9759 |
| 0.1193 | 12.0 | 1152 | 0.0848 | 0.7570 | 0.8710 | 0.81 | 93 | 0.7923 | 0.8735 | 0.8309 | 166 | 0.9257 | 0.9648 | 0.9448 | 142 | 0.8288 | 0.9052 | 0.8653 | 0.9756 |
| 0.1109 | 13.0 | 1248 | 0.0766 | 0.7864 | 0.8710 | 0.8265 | 93 | 0.8268 | 0.8916 | 0.8580 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8531 | 0.9127 | 0.8819 | 0.9786 |
| 0.1039 | 14.0 | 1344 | 0.0730 | 0.7981 | 0.8925 | 0.8426 | 93 | 0.8111 | 0.8795 | 0.8439 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.8555 | 0.9152 | 0.8843 | 0.9786 |
| 0.0974 | 15.0 | 1440 | 0.0715 | 0.8137 | 0.8925 | 0.8513 | 93 | 0.8132 | 0.8916 | 0.8506 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.8581 | 0.9202 | 0.8881 | 0.9775 |
| 0.0954 | 16.0 | 1536 | 0.0712 | 0.7905 | 0.8925 | 0.8384 | 93 | 0.8142 | 0.8976 | 0.8539 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.8545 | 0.9227 | 0.8873 | 0.9778 |
| 0.0911 | 17.0 | 1632 | 0.0644 | 0.8333 | 0.9140 | 0.8718 | 93 | 0.8305 | 0.8855 | 0.8571 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.8706 | 0.9227 | 0.8959 | 0.9800 |
| 0.0868 | 18.0 | 1728 | 0.0609 | 0.84 | 0.9032 | 0.8705 | 93 | 0.8475 | 0.9036 | 0.8746 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8836 | 0.9277 | 0.9051 | 0.9816 |
| 0.082 | 19.0 | 1824 | 0.0594 | 0.8614 | 0.9355 | 0.8969 | 93 | 0.8613 | 0.8976 | 0.8791 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.8905 | 0.9327 | 0.9111 | 0.9813 |
| 0.0785 | 20.0 | 1920 | 0.0584 | 0.86 | 0.9247 | 0.8912 | 93 | 0.8287 | 0.9036 | 0.8646 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.88 | 0.9327 | 0.9056 | 0.9813 |
| 0.0804 | 21.0 | 2016 | 0.0587 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.8380 | 0.9036 | 0.8696 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.8782 | 0.9352 | 0.9058 | 0.9802 |
| 0.0751 | 22.0 | 2112 | 0.0569 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.85 | 0.9217 | 0.8844 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.8897 | 0.9451 | 0.9166 | 0.9822 |
| 0.0747 | 23.0 | 2208 | 0.0545 | 0.8462 | 0.9462 | 0.8934 | 93 | 0.8970 | 0.8916 | 0.8943 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9056 | 0.9327 | 0.9189 | 0.9830 |
| 0.0733 | 24.0 | 2304 | 0.0552 | 0.8286 | 0.9355 | 0.8788 | 93 | 0.8757 | 0.8916 | 0.8836 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.8881 | 0.9302 | 0.9086 | 0.9811 |
| 0.0695 | 25.0 | 2400 | 0.0547 | 0.8381 | 0.9462 | 0.8889 | 93 | 0.8457 | 0.8916 | 0.8680 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.8779 | 0.9327 | 0.9045 | 0.9816 |
| 0.0697 | 26.0 | 2496 | 0.0498 | 0.8614 | 0.9355 | 0.8969 | 93 | 0.8810 | 0.8916 | 0.8862 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.9010 | 0.9302 | 0.9153 | 0.9838 |
| 0.064 | 27.0 | 2592 | 0.0487 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8629 | 0.9096 | 0.8856 | 166 | 0.9452 | 0.9718 | 0.9583 | 142 | 0.8934 | 0.9401 | 0.9162 | 0.9838 |
| 0.0663 | 28.0 | 2688 | 0.0501 | 0.87 | 0.9355 | 0.9016 | 93 | 0.8370 | 0.9277 | 0.8800 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8855 | 0.9451 | 0.9144 | 0.9833 |
| 0.064 | 29.0 | 2784 | 0.0482 | 0.8544 | 0.9462 | 0.8980 | 93 | 0.8671 | 0.9036 | 0.8850 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8952 | 0.9377 | 0.9160 | 0.9844 |
| 0.0606 | 30.0 | 2880 | 0.0477 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8596 | 0.9217 | 0.8895 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8939 | 0.9451 | 0.9188 | 0.9841 |
| 0.0616 | 31.0 | 2976 | 0.0475 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8706 | 0.8916 | 0.8810 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9034 | 0.9327 | 0.9178 | 0.9855 |
| 0.0582 | 32.0 | 3072 | 0.0462 | 0.87 | 0.9355 | 0.9016 | 93 | 0.8655 | 0.8916 | 0.8783 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8988 | 0.9302 | 0.9142 | 0.9849 |
| 0.058 | 33.0 | 3168 | 0.0463 | 0.8529 | 0.9355 | 0.8923 | 93 | 0.8671 | 0.9036 | 0.8850 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8950 | 0.9352 | 0.9146 | 0.9844 |
| 0.0587 | 34.0 | 3264 | 0.0449 | 0.87 | 0.9355 | 0.9016 | 93 | 0.8539 | 0.9157 | 0.8837 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8955 | 0.9401 | 0.9173 | 0.9844 |
| 0.056 | 35.0 | 3360 | 0.0444 | 0.8614 | 0.9355 | 0.8969 | 93 | 0.8580 | 0.9096 | 0.8830 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8952 | 0.9377 | 0.9160 | 0.9844 |
| 0.0573 | 36.0 | 3456 | 0.0443 | 0.8614 | 0.9355 | 0.8969 | 93 | 0.8678 | 0.9096 | 0.8882 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8995 | 0.9377 | 0.9182 | 0.9846 |
| 0.0538 | 37.0 | 3552 | 0.0433 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8580 | 0.9096 | 0.8830 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8971 | 0.9352 | 0.9158 | 0.9855 |
| 0.0533 | 38.0 | 3648 | 0.0431 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8621 | 0.9036 | 0.8824 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8990 | 0.9327 | 0.9155 | 0.9849 |
| 0.0513 | 39.0 | 3744 | 0.0425 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8728 | 0.9096 | 0.8909 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9036 | 0.9352 | 0.9191 | 0.9860 |
| 0.0512 | 40.0 | 3840 | 0.0432 | 0.8673 | 0.9140 | 0.8901 | 93 | 0.8613 | 0.8976 | 0.8791 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8964 | 0.9277 | 0.9118 | 0.9849 |
| 0.051 | 41.0 | 3936 | 0.0419 | 0.8673 | 0.9140 | 0.8901 | 93 | 0.8678 | 0.9096 | 0.8882 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9012 | 0.9327 | 0.9167 | 0.9855 |
| 0.0504 | 42.0 | 4032 | 0.0419 | 0.8673 | 0.9140 | 0.8901 | 93 | 0.8629 | 0.9096 | 0.8856 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8990 | 0.9327 | 0.9155 | 0.9852 |
| 0.0467 | 43.0 | 4128 | 0.0420 | 0.8673 | 0.9140 | 0.8901 | 93 | 0.8728 | 0.9096 | 0.8909 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9034 | 0.9327 | 0.9178 | 0.9852 |
| 0.0495 | 44.0 | 4224 | 0.0428 | 0.8776 | 0.9247 | 0.9005 | 93 | 0.8621 | 0.9036 | 0.8824 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.8990 | 0.9327 | 0.9155 | 0.9849 |
| 0.0481 | 45.0 | 4320 | 0.0413 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8779 | 0.9096 | 0.8935 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9058 | 0.9352 | 0.9202 | 0.9857 |
| 0.045 | 46.0 | 4416 | 0.0416 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8772 | 0.9036 | 0.8902 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9058 | 0.9352 | 0.9202 | 0.9860 |
| 0.0459 | 47.0 | 4512 | 0.0414 | 0.8776 | 0.9247 | 0.9005 | 93 | 0.8786 | 0.9157 | 0.8968 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9082 | 0.9377 | 0.9227 | 0.9868 |
| 0.0452 | 48.0 | 4608 | 0.0424 | 0.87 | 0.9355 | 0.9016 | 93 | 0.8810 | 0.8916 | 0.8862 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9053 | 0.9302 | 0.9176 | 0.9857 |
| 0.0451 | 49.0 | 4704 | 0.0408 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9877 |
| 0.0432 | 50.0 | 4800 | 0.0413 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8837 | 0.9157 | 0.8994 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9082 | 0.9377 | 0.9227 | 0.9866 |
| 0.0454 | 51.0 | 4896 | 0.0417 | 0.87 | 0.9355 | 0.9016 | 93 | 0.8876 | 0.9036 | 0.8955 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9080 | 0.9352 | 0.9214 | 0.9866 |
| 0.044 | 52.0 | 4992 | 0.0413 | 0.8763 | 0.9140 | 0.8947 | 93 | 0.8736 | 0.9157 | 0.8941 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9058 | 0.9352 | 0.9202 | 0.9860 |
| 0.0438 | 53.0 | 5088 | 0.0425 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8837 | 0.9157 | 0.8994 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9106 | 0.9401 | 0.9252 | 0.9866 |
| 0.0426 | 54.0 | 5184 | 0.0411 | 0.8586 | 0.9140 | 0.8854 | 93 | 0.8728 | 0.9096 | 0.8909 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9012 | 0.9327 | 0.9167 | 0.9863 |
| 0.0423 | 55.0 | 5280 | 0.0408 | 0.8687 | 0.9247 | 0.8958 | 93 | 0.8713 | 0.8976 | 0.8843 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9010 | 0.9302 | 0.9153 | 0.9857 |
| 0.042 | 56.0 | 5376 | 0.0408 | 0.8776 | 0.9247 | 0.9005 | 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9128 | 0.9401 | 0.9263 | 0.9871 |
| 0.0433 | 57.0 | 5472 | 0.0413 | 0.8614 | 0.9355 | 0.8969 | 93 | 0.8810 | 0.8916 | 0.8862 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9031 | 0.9302 | 0.9165 | 0.9855 |
| 0.0433 | 58.0 | 5568 | 0.0416 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.8765 | 0.8976 | 0.8869 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9078 | 0.9327 | 0.9200 | 0.9857 |
| 0.0391 | 59.0 | 5664 | 0.0413 | 0.8614 | 0.9355 | 0.8969 | 93 | 0.8810 | 0.8916 | 0.8862 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9031 | 0.9302 | 0.9165 | 0.9855 |
| 0.0403 | 60.0 | 5760 | 0.0411 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8869 | 0.8976 | 0.8922 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9058 | 0.9352 | 0.9202 | 0.9857 |
| 0.0397 | 61.0 | 5856 | 0.0396 | 0.8866 | 0.9247 | 0.9053 | 93 | 0.8947 | 0.9217 | 0.9080 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9173 | 0.9401 | 0.9286 | 0.9866 |
| 0.0384 | 62.0 | 5952 | 0.0405 | 0.87 | 0.9355 | 0.9016 | 93 | 0.8882 | 0.9096 | 0.8988 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9104 | 0.9377 | 0.9238 | 0.9860 |
| 0.0388 | 63.0 | 6048 | 0.0406 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8929 | 0.9036 | 0.8982 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9126 | 0.9377 | 0.9250 | 0.9863 |
| 0.0372 | 64.0 | 6144 | 0.0397 | 0.8878 | 0.9355 | 0.9110 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9871 |
| 0.0368 | 65.0 | 6240 | 0.0408 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8941 | 0.9157 | 0.9048 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9871 |
| 0.0385 | 66.0 | 6336 | 0.0399 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8994 | 0.9157 | 0.9075 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9173 | 0.9401 | 0.9286 | 0.9868 |
| 0.0366 | 67.0 | 6432 | 0.0407 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8988 | 0.9096 | 0.9042 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9128 | 0.9401 | 0.9263 | 0.9863 |
| 0.0358 | 68.0 | 6528 | 0.0397 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9874 |
| 0.0366 | 69.0 | 6624 | 0.0398 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9871 |
| 0.0362 | 70.0 | 6720 | 0.0396 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9874 |
| 0.0353 | 71.0 | 6816 | 0.0400 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9172 | 0.9337 | 0.9254 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9341 | 0.9551 | 0.9445 | 0.9879 |
| 0.0358 | 72.0 | 6912 | 0.0397 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9868 |
| 0.0333 | 73.0 | 7008 | 0.0393 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9871 |
| 0.0338 | 74.0 | 7104 | 0.0393 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9868 |
| 0.0368 | 75.0 | 7200 | 0.0398 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9874 |
| 0.034 | 76.0 | 7296 | 0.0398 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9874 |
| 0.0358 | 77.0 | 7392 | 0.0391 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9265 | 0.9426 | 0.9345 | 0.9874 |
| 0.0343 | 78.0 | 7488 | 0.0391 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9868 |
| 0.0338 | 79.0 | 7584 | 0.0388 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9042 | 0.9096 | 0.9069 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9874 |
| 0.0343 | 80.0 | 7680 | 0.0387 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9199 | 0.9451 | 0.9323 | 0.9874 |
| 0.0345 | 81.0 | 7776 | 0.0397 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9293 | 0.9501 | 0.9396 | 0.9874 |
| 0.0332 | 82.0 | 7872 | 0.0405 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9874 |
| 0.0341 | 83.0 | 7968 | 0.0401 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9216 | 0.9377 | 0.9295 | 0.9866 |
| 0.0323 | 84.0 | 8064 | 0.0402 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9267 | 0.9451 | 0.9358 | 0.9871 |
| 0.0331 | 85.0 | 8160 | 0.0396 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9868 |
| 0.0337 | 86.0 | 8256 | 0.0395 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9091 | 0.9036 | 0.9063 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9868 |
| 0.0336 | 87.0 | 8352 | 0.0393 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9868 |
| 0.0331 | 88.0 | 8448 | 0.0395 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9871 |
| 0.0336 | 89.0 | 8544 | 0.0396 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9868 |
| 0.033 | 90.0 | 8640 | 0.0396 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9871 |
| 0.033 | 91.0 | 8736 | 0.0395 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9242 | 0.9426 | 0.9333 | 0.9871 |
| 0.0317 | 92.0 | 8832 | 0.0398 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9871 |
| 0.0337 | 93.0 | 8928 | 0.0397 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9874 |
| 0.0324 | 94.0 | 9024 | 0.0397 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9874 |
| 0.033 | 95.0 | 9120 | 0.0395 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9871 |
| 0.0309 | 96.0 | 9216 | 0.0394 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9874 |
| 0.0322 | 97.0 | 9312 | 0.0395 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9871 |
| 0.033 | 98.0 | 9408 | 0.0396 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9874 |
| 0.0318 | 99.0 | 9504 | 0.0395 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9871 |
| 0.0327 | 100.0 | 9600 | 0.0395 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9871 |
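
The per-entity and overall precision/recall/F1 figures above follow the entity-level format commonly produced with the `seqeval` package; the exact evaluation code for this run is not documented, so the sketch below only illustrates how such numbers are typically computed from IOB-tagged sequences.

```python
# Hedged sketch: entity-level NER metrics in the style reported above are
# typically computed with seqeval. The tag sequences below are illustrative,
# not taken from the actual evaluation set.
from seqeval.metrics import classification_report

y_true = [["B-PER", "I-PER", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "B-ORG"]]

# Prints precision, recall, F1, and support per entity type (here LOC, ORG,
# PER), plus micro-averaged overall scores.
print(classification_report(y_true, y_pred, digits=4))
```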

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1