
nerui-seq_bn-3

This model is a fine-tuned version of indolem/indobert-base-uncased for named-entity recognition (location, organization, and person entities) on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0541
  • Location Precision: 0.9022
  • Location Recall: 0.9651
  • Location F1: 0.9326
  • Location Number: 86
  • Organization Precision: 0.9006
  • Organization Recall: 0.9157
  • Organization F1: 0.9081
  • Organization Number: 178
  • Person Precision: 0.9766
  • Person Recall: 0.9766
  • Person F1: 0.9766
  • Person Number: 128
  • Overall Precision: 0.9252
  • Overall Recall: 0.9464
  • Overall F1: 0.9357
  • Overall Accuracy: 0.9857
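The overall numbers above are micro-averages over the three entity types. As a sanity check, the overall recall can be recovered from the per-entity recalls weighted by support, and the overall F1 from the overall precision and recall (a minimal sketch; the true-positive counts are reconstructed by rounding, since only ratios are reported):

```python
# Sanity-check the reported overall (micro-averaged) metrics
# from the per-entity numbers listed above.
supports = {"LOC": 86, "ORG": 178, "PER": 128}
recalls  = {"LOC": 0.9651, "ORG": 0.9157, "PER": 0.9766}

total = sum(supports.values())  # 392 gold entities
# Reconstruct true positives per type (rounded, since only ratios are given).
tp = sum(round(recalls[k] * supports[k]) for k in supports)
overall_recall = tp / total
print(round(overall_recall, 4))  # 0.9464, matching the card

# Overall F1 is the harmonic mean of overall precision and recall.
p, r = 0.9252, 0.9464
f1 = 2 * p * r / (p + r)
print(round(f1, 4))  # 0.9357, matching the card
```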

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
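With a linear scheduler and no warmup steps listed, the learning rate presumably decays from 5e-05 to 0 over the full run (96 steps per epoch × 100 epochs = 9,600 steps, per the training log). A minimal sketch of that schedule, assuming zero warmup:

```python
# Sketch of the linear LR schedule implied by the hyperparameters above.
# Assumes no warmup (none is listed in the card); total steps taken from
# the training log (96 steps/epoch x 100 epochs = 9600).
def linear_lr(step, base_lr=5e-5, total_steps=9600, warmup_steps=0):
    if step < warmup_steps:
        # Linear ramp-up during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))     # 5e-05 at the start
print(linear_lr(4800))  # 2.5e-05 halfway through
print(linear_lr(9600))  # 0.0 at the end
```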

Training results

Columns: Train Loss, Epoch, Step, Val Loss, then Precision / Recall / F1 / Number for Location, Organization, and Person, then Overall Precision / Recall / F1 / Accuracy.

Train Loss | Epoch | Step | Val Loss | Loc P | Loc R | Loc F1 | Loc N | Org P | Org R | Org F1 | Org N | Per P | Per R | Per F1 | Per N | Overall P | Overall R | Overall F1 | Overall Acc
0.834 1.0 96 0.5200 0.0 0.0 0.0 86 0.3333 0.0112 0.0217 178 0.0 0.0 0.0 128 0.25 0.0051 0.0100 0.8437
0.4501 2.0 192 0.3053 0.4524 0.2209 0.2969 86 0.3739 0.4831 0.4216 178 0.3445 0.5625 0.4273 128 0.3680 0.4515 0.4055 0.9042
0.309 3.0 288 0.2180 0.5714 0.5116 0.5399 86 0.6212 0.6910 0.6543 178 0.6646 0.8203 0.7343 128 0.6282 0.6939 0.6594 0.9425
0.2169 4.0 384 0.1387 0.6633 0.7558 0.7065 86 0.75 0.7416 0.7458 178 0.8696 0.9375 0.9023 128 0.7694 0.8087 0.7886 0.9625
0.145 5.0 480 0.1083 0.7228 0.8488 0.7807 86 0.7424 0.8258 0.7819 178 0.9179 0.9609 0.9389 128 0.7921 0.875 0.8315 0.9671
0.1191 6.0 576 0.0931 0.7048 0.8605 0.7749 86 0.8249 0.8202 0.8225 178 0.9318 0.9609 0.9462 128 0.8285 0.875 0.8511 0.9703
0.1063 7.0 672 0.0904 0.7130 0.8953 0.7938 86 0.7989 0.8483 0.8229 178 0.9265 0.9844 0.9545 128 0.8176 0.9031 0.8582 0.9698
0.0947 8.0 768 0.0734 0.7624 0.8953 0.8235 86 0.8242 0.8427 0.8333 178 0.9535 0.9609 0.9572 128 0.8495 0.8929 0.8706 0.9768
0.085 9.0 864 0.0692 0.7959 0.9070 0.8478 86 0.8449 0.8876 0.8658 178 0.9466 0.9688 0.9575 128 0.8654 0.9184 0.8911 0.9789
0.082 10.0 960 0.0669 0.7959 0.9070 0.8478 86 0.8579 0.8820 0.8698 178 0.9615 0.9766 0.9690 128 0.8759 0.9184 0.8966 0.9789
0.0769 11.0 1056 0.0632 0.8182 0.9419 0.8757 86 0.8548 0.8933 0.8736 178 0.9612 0.9688 0.9650 128 0.8792 0.9286 0.9032 0.9803
0.0749 12.0 1152 0.0609 0.8421 0.9302 0.8840 86 0.8717 0.9157 0.8932 178 0.9690 0.9766 0.9728 128 0.8954 0.9388 0.9166 0.9819
0.0648 13.0 1248 0.0569 0.8316 0.9186 0.8729 86 0.8736 0.8933 0.8833 178 0.9612 0.9688 0.9650 128 0.8916 0.9235 0.9073 0.9811
0.0639 14.0 1344 0.0598 0.8211 0.9070 0.8619 86 0.9012 0.8708 0.8857 178 0.9612 0.9688 0.9650 128 0.9015 0.9107 0.9061 0.9811
0.0627 15.0 1440 0.0537 0.8495 0.9186 0.8827 86 0.8757 0.9101 0.8926 178 0.9843 0.9766 0.9804 128 0.9037 0.9337 0.9184 0.9819
0.0567 16.0 1536 0.0549 0.8247 0.9302 0.8743 86 0.8656 0.9045 0.8846 178 0.9615 0.9766 0.9690 128 0.8862 0.9337 0.9093 0.9792
0.0546 17.0 1632 0.0511 0.8696 0.9302 0.8989 86 0.8757 0.9101 0.8926 178 0.9843 0.9766 0.9804 128 0.9084 0.9362 0.9221 0.9830
0.0527 18.0 1728 0.0529 0.8454 0.9535 0.8962 86 0.9029 0.8876 0.8952 178 0.9688 0.9688 0.9688 128 0.91 0.9286 0.9192 0.9827
0.0492 19.0 1824 0.0535 0.8602 0.9302 0.8939 86 0.8977 0.8876 0.8927 178 0.9688 0.9688 0.9688 128 0.9118 0.9235 0.9176 0.9822
0.0474 20.0 1920 0.0501 0.8617 0.9419 0.9000 86 0.9061 0.9213 0.9136 178 0.9843 0.9766 0.9804 128 0.9204 0.9439 0.9320 0.9827
0.0447 21.0 2016 0.0528 0.8660 0.9767 0.9180 86 0.9045 0.9045 0.9045 178 0.9843 0.9766 0.9804 128 0.9204 0.9439 0.9320 0.9835
0.0453 22.0 2112 0.0522 0.8137 0.9651 0.8830 86 0.9056 0.9157 0.9106 178 0.9843 0.9766 0.9804 128 0.9071 0.9464 0.9263 0.9819
0.039 23.0 2208 0.0494 0.8384 0.9651 0.8973 86 0.92 0.9045 0.9122 178 0.9766 0.9766 0.9766 128 0.9179 0.9413 0.9295 0.9835
0.0414 24.0 2304 0.0537 0.83 0.9651 0.8925 86 0.9143 0.8989 0.9065 178 0.9766 0.9766 0.9766 128 0.9132 0.9388 0.9258 0.9819
0.038 25.0 2400 0.0514 0.8632 0.9535 0.9061 86 0.9191 0.8933 0.9060 178 0.9843 0.9766 0.9804 128 0.9266 0.9337 0.9301 0.9838
0.038 26.0 2496 0.0505 0.875 0.9767 0.9231 86 0.9244 0.8933 0.9086 178 0.9766 0.9766 0.9766 128 0.9293 0.9388 0.9340 0.9833
0.037 27.0 2592 0.0508 0.8469 0.9651 0.9022 86 0.9195 0.8989 0.9091 178 0.9843 0.9766 0.9804 128 0.9223 0.9388 0.9305 0.9825
0.0375 28.0 2688 0.0482 0.875 0.9767 0.9231 86 0.9101 0.9101 0.9101 178 0.9843 0.9766 0.9804 128 0.9252 0.9464 0.9357 0.9841
0.0342 29.0 2784 0.0484 0.8830 0.9651 0.9222 86 0.9096 0.9045 0.9070 178 0.9766 0.9766 0.9766 128 0.9248 0.9413 0.9330 0.9833
0.0314 30.0 2880 0.0489 0.8557 0.9651 0.9071 86 0.8913 0.9213 0.9061 178 0.9767 0.9844 0.9805 128 0.9098 0.9515 0.9302 0.9841
0.0308 31.0 2976 0.0522 0.8660 0.9767 0.9180 86 0.9029 0.8876 0.8952 178 0.9766 0.9766 0.9766 128 0.9175 0.9362 0.9268 0.9825
0.0284 32.0 3072 0.0507 0.875 0.9767 0.9231 86 0.9143 0.8989 0.9065 178 0.9766 0.9766 0.9766 128 0.9248 0.9413 0.9330 0.9833
0.029 33.0 3168 0.0519 0.8485 0.9767 0.9081 86 0.9133 0.8876 0.9003 178 0.9766 0.9766 0.9766 128 0.9175 0.9362 0.9268 0.9819
0.0273 34.0 3264 0.0514 0.8660 0.9767 0.9180 86 0.92 0.9045 0.9122 178 0.9766 0.9766 0.9766 128 0.925 0.9439 0.9343 0.9827
0.0273 35.0 3360 0.0535 0.8660 0.9767 0.9180 86 0.9133 0.8876 0.9003 178 0.9766 0.9766 0.9766 128 0.9221 0.9362 0.9291 0.9827
0.0275 36.0 3456 0.0519 0.8384 0.9651 0.8973 86 0.8895 0.9045 0.8969 178 0.9615 0.9766 0.9690 128 0.9 0.9413 0.9202 0.9814
0.0244 37.0 3552 0.0511 0.8438 0.9419 0.8901 86 0.8989 0.8989 0.8989 178 0.9766 0.9766 0.9766 128 0.9104 0.9337 0.9219 0.9819
0.0272 38.0 3648 0.0563 0.8660 0.9767 0.9180 86 0.9029 0.8876 0.8952 178 0.9766 0.9766 0.9766 128 0.9175 0.9362 0.9268 0.9827
0.028 39.0 3744 0.0517 0.875 0.9767 0.9231 86 0.9101 0.9101 0.9101 178 0.9766 0.9766 0.9766 128 0.9229 0.9464 0.9345 0.9835
0.0236 40.0 3840 0.0525 0.8723 0.9535 0.9111 86 0.9056 0.9157 0.9106 178 0.9766 0.9766 0.9766 128 0.9204 0.9439 0.9320 0.9833
0.0223 41.0 3936 0.0547 0.8901 0.9419 0.9153 86 0.9157 0.9157 0.9157 178 0.9766 0.9766 0.9766 128 0.9295 0.9413 0.9354 0.9835
0.0228 42.0 4032 0.0505 0.8804 0.9419 0.9101 86 0.9011 0.9213 0.9111 178 0.9766 0.9766 0.9766 128 0.9204 0.9439 0.9320 0.9841
0.0233 43.0 4128 0.0556 0.8646 0.9651 0.9121 86 0.9061 0.9213 0.9136 178 0.9766 0.9766 0.9766 128 0.9185 0.9490 0.9335 0.9835
0.0219 44.0 4224 0.0529 0.8571 0.9767 0.9130 86 0.8865 0.9213 0.9036 178 0.9690 0.9766 0.9728 128 0.9053 0.9515 0.9279 0.9835
0.0194 45.0 4320 0.0572 0.8660 0.9767 0.9180 86 0.9195 0.8989 0.9091 178 0.9766 0.9766 0.9766 128 0.9248 0.9413 0.9330 0.9819
0.0212 46.0 4416 0.0550 0.8454 0.9535 0.8962 86 0.8919 0.9270 0.9091 178 0.9615 0.9766 0.9690 128 0.9029 0.9490 0.9254 0.9827
0.019 47.0 4512 0.0515 0.8511 0.9302 0.8889 86 0.8967 0.9270 0.9116 178 0.9690 0.9766 0.9728 128 0.9091 0.9439 0.9262 0.9843
0.0182 48.0 4608 0.0524 0.8632 0.9535 0.9061 86 0.9006 0.9157 0.9081 178 0.9766 0.9766 0.9766 128 0.9158 0.9439 0.9296 0.9841
0.0185 49.0 4704 0.0523 0.8830 0.9651 0.9222 86 0.9148 0.9045 0.9096 178 0.9766 0.9766 0.9766 128 0.9271 0.9413 0.9342 0.9843
0.0181 50.0 4800 0.0539 0.875 0.9767 0.9231 86 0.9 0.9101 0.9050 178 0.9766 0.9766 0.9766 128 0.9183 0.9464 0.9322 0.9838
0.0181 51.0 4896 0.0564 0.8925 0.9651 0.9274 86 0.9205 0.9101 0.9153 178 0.9766 0.9766 0.9766 128 0.9320 0.9439 0.9379 0.9841
0.0177 52.0 4992 0.0585 0.8830 0.9651 0.9222 86 0.9314 0.9157 0.9235 178 0.9766 0.9766 0.9766 128 0.9345 0.9464 0.9404 0.9835
0.0168 53.0 5088 0.0541 0.8333 0.9302 0.8791 86 0.8877 0.9326 0.9096 178 0.9690 0.9766 0.9728 128 0.9005 0.9464 0.9229 0.9835
0.0172 54.0 5184 0.0514 0.8696 0.9302 0.8989 86 0.8930 0.9382 0.9151 178 0.9690 0.9766 0.9728 128 0.9118 0.9490 0.9300 0.9846
0.0165 55.0 5280 0.0504 0.9022 0.9651 0.9326 86 0.9111 0.9213 0.9162 178 0.9843 0.9766 0.9804 128 0.9323 0.9490 0.9406 0.9854
0.017 56.0 5376 0.0514 0.8830 0.9651 0.9222 86 0.9081 0.9438 0.9256 178 0.9766 0.9766 0.9766 128 0.9238 0.9592 0.9412 0.9852
0.0153 57.0 5472 0.0513 0.8737 0.9651 0.9171 86 0.8777 0.9270 0.9016 178 0.9766 0.9766 0.9766 128 0.9075 0.9515 0.9290 0.9865
0.0157 58.0 5568 0.0568 0.8936 0.9767 0.9333 86 0.9195 0.8989 0.9091 178 0.9766 0.9766 0.9766 128 0.9318 0.9413 0.9365 0.9841
0.0139 59.0 5664 0.0530 0.9101 0.9419 0.9257 86 0.8944 0.9045 0.8994 178 0.9766 0.9766 0.9766 128 0.9244 0.9362 0.9303 0.9838
0.0156 60.0 5760 0.0518 0.8646 0.9651 0.9121 86 0.8804 0.9101 0.8950 178 0.9690 0.9766 0.9728 128 0.9046 0.9439 0.9238 0.9841
0.0134 61.0 5856 0.0491 0.9101 0.9419 0.9257 86 0.9027 0.9382 0.9201 178 0.9690 0.9766 0.9728 128 0.9256 0.9515 0.9384 0.9852
0.0128 62.0 5952 0.0513 0.8913 0.9535 0.9213 86 0.8770 0.9213 0.8986 178 0.9690 0.9766 0.9728 128 0.9093 0.9464 0.9275 0.9843
0.0126 63.0 6048 0.0512 0.9121 0.9651 0.9379 86 0.8919 0.9270 0.9091 178 0.9690 0.9766 0.9728 128 0.9210 0.9515 0.9360 0.9854
0.0134 64.0 6144 0.0520 0.9011 0.9535 0.9266 86 0.8956 0.9157 0.9056 178 0.9766 0.9766 0.9766 128 0.9227 0.9439 0.9332 0.9843
0.0128 65.0 6240 0.0494 0.9022 0.9651 0.9326 86 0.8783 0.9326 0.9046 178 0.9690 0.9766 0.9728 128 0.9122 0.9541 0.9327 0.9860
0.0119 66.0 6336 0.0511 0.8830 0.9651 0.9222 86 0.8811 0.9157 0.8981 178 0.9690 0.9766 0.9728 128 0.9093 0.9464 0.9275 0.9852
0.012 67.0 6432 0.0549 0.8830 0.9651 0.9222 86 0.92 0.9045 0.9122 178 0.9766 0.9766 0.9766 128 0.9295 0.9413 0.9354 0.9852
0.0124 68.0 6528 0.0523 0.8913 0.9535 0.9213 86 0.8865 0.9213 0.9036 178 0.9690 0.9766 0.9728 128 0.9138 0.9464 0.9298 0.9838
0.0139 69.0 6624 0.0550 0.8817 0.9535 0.9162 86 0.9096 0.9045 0.9070 178 0.9690 0.9766 0.9728 128 0.9223 0.9388 0.9305 0.9841
0.0121 70.0 6720 0.0528 0.9022 0.9651 0.9326 86 0.8871 0.9270 0.9066 178 0.9690 0.9766 0.9728 128 0.9165 0.9515 0.9337 0.9852
0.0113 71.0 6816 0.0500 0.8901 0.9419 0.9153 86 0.8730 0.9270 0.8992 178 0.9690 0.9766 0.9728 128 0.9071 0.9464 0.9263 0.9857
0.0114 72.0 6912 0.0560 0.8804 0.9419 0.9101 86 0.8811 0.9157 0.8981 178 0.9690 0.9766 0.9728 128 0.9089 0.9413 0.9248 0.9843
0.01 73.0 7008 0.0532 0.8737 0.9651 0.9171 86 0.8871 0.9270 0.9066 178 0.9690 0.9766 0.9728 128 0.9098 0.9515 0.9302 0.9857
0.0138 74.0 7104 0.0525 0.8925 0.9651 0.9274 86 0.8817 0.9213 0.9011 178 0.9690 0.9766 0.9728 128 0.9118 0.9490 0.9300 0.9852
0.0108 75.0 7200 0.0573 0.8646 0.9651 0.9121 86 0.9040 0.8989 0.9014 178 0.9690 0.9766 0.9728 128 0.9154 0.9388 0.9270 0.9827
0.0113 76.0 7296 0.0550 0.9022 0.9651 0.9326 86 0.8950 0.9101 0.9025 178 0.9690 0.9766 0.9728 128 0.9204 0.9439 0.9320 0.9849
0.0111 77.0 7392 0.0564 0.9121 0.9651 0.9379 86 0.9157 0.9157 0.9157 178 0.9766 0.9766 0.9766 128 0.9345 0.9464 0.9404 0.9846
0.0114 78.0 7488 0.0565 0.8737 0.9651 0.9171 86 0.8950 0.9101 0.9025 178 0.9766 0.9766 0.9766 128 0.9158 0.9439 0.9296 0.9833
0.0101 79.0 7584 0.0547 0.9022 0.9651 0.9326 86 0.9111 0.9213 0.9162 178 0.9690 0.9766 0.9728 128 0.9277 0.9490 0.9382 0.9846
0.0108 80.0 7680 0.0527 0.9022 0.9651 0.9326 86 0.8865 0.9213 0.9036 178 0.9690 0.9766 0.9728 128 0.9163 0.9490 0.9323 0.9854
0.0108 81.0 7776 0.0524 0.8925 0.9651 0.9274 86 0.8877 0.9326 0.9096 178 0.9690 0.9766 0.9728 128 0.9144 0.9541 0.9338 0.9860
0.0108 82.0 7872 0.0553 0.8830 0.9651 0.9222 86 0.8901 0.9101 0.9000 178 0.9766 0.9766 0.9766 128 0.9158 0.9439 0.9296 0.9846
0.0109 83.0 7968 0.0529 0.8925 0.9651 0.9274 86 0.8907 0.9157 0.9030 178 0.9690 0.9766 0.9728 128 0.9160 0.9464 0.9310 0.9852
0.0095 84.0 8064 0.0545 0.8925 0.9651 0.9274 86 0.9 0.9101 0.9050 178 0.9766 0.9766 0.9766 128 0.9227 0.9439 0.9332 0.9846
0.0096 85.0 8160 0.0556 0.9022 0.9651 0.9326 86 0.9101 0.9101 0.9101 178 0.9766 0.9766 0.9766 128 0.9296 0.9439 0.9367 0.9841
0.0098 86.0 8256 0.0534 0.9022 0.9651 0.9326 86 0.9022 0.9326 0.9171 178 0.9690 0.9766 0.9728 128 0.9235 0.9541 0.9385 0.9854
0.0103 87.0 8352 0.0529 0.9121 0.9651 0.9379 86 0.9056 0.9157 0.9106 178 0.9690 0.9766 0.9728 128 0.9275 0.9464 0.9369 0.9854
0.0113 88.0 8448 0.0537 0.8925 0.9651 0.9274 86 0.9016 0.9270 0.9141 178 0.9766 0.9766 0.9766 128 0.9233 0.9515 0.9372 0.9857
0.0103 89.0 8544 0.0538 0.8913 0.9535 0.9213 86 0.8913 0.9213 0.9061 178 0.9690 0.9766 0.9728 128 0.9160 0.9464 0.9310 0.9852
0.0101 90.0 8640 0.0522 0.8913 0.9535 0.9213 86 0.8865 0.9213 0.9036 178 0.9690 0.9766 0.9728 128 0.9138 0.9464 0.9298 0.9849
0.0089 91.0 8736 0.0528 0.9022 0.9651 0.9326 86 0.8859 0.9157 0.9006 178 0.9690 0.9766 0.9728 128 0.9160 0.9464 0.9310 0.9852
0.0088 92.0 8832 0.0547 0.9022 0.9651 0.9326 86 0.8956 0.9157 0.9056 178 0.9690 0.9766 0.9728 128 0.9206 0.9464 0.9333 0.9849
0.0099 93.0 8928 0.0542 0.9022 0.9651 0.9326 86 0.8907 0.9157 0.9030 178 0.9690 0.9766 0.9728 128 0.9183 0.9464 0.9322 0.9849
0.0095 94.0 9024 0.0542 0.9022 0.9651 0.9326 86 0.8907 0.9157 0.9030 178 0.9690 0.9766 0.9728 128 0.9183 0.9464 0.9322 0.9849
0.0094 95.0 9120 0.0536 0.9022 0.9651 0.9326 86 0.8907 0.9157 0.9030 178 0.9766 0.9766 0.9766 128 0.9206 0.9464 0.9333 0.9857
0.0093 96.0 9216 0.0531 0.9022 0.9651 0.9326 86 0.8913 0.9213 0.9061 178 0.9766 0.9766 0.9766 128 0.9208 0.9490 0.9347 0.9862
0.009 97.0 9312 0.0536 0.9022 0.9651 0.9326 86 0.8865 0.9213 0.9036 178 0.9690 0.9766 0.9728 128 0.9163 0.9490 0.9323 0.9857
0.0099 98.0 9408 0.0538 0.9022 0.9651 0.9326 86 0.8956 0.9157 0.9056 178 0.9766 0.9766 0.9766 128 0.9229 0.9464 0.9345 0.9854
0.0089 99.0 9504 0.0541 0.9121 0.9651 0.9379 86 0.9006 0.9157 0.9081 178 0.9766 0.9766 0.9766 128 0.9275 0.9464 0.9369 0.9860
0.0107 100.0 9600 0.0541 0.9022 0.9651 0.9326 86 0.9006 0.9157 0.9081 178 0.9766 0.9766 0.9766 128 0.9252 0.9464 0.9357 0.9857

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
