
nerui-pt-pl50-3

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0738
  • Location Precision: 0.9
  • Location Recall: 0.9419
  • Location F1: 0.9205
  • Location Number: 86
  • Organization Precision: 0.9480
  • Organization Recall: 0.9213
  • Organization F1: 0.9345
  • Organization Number: 178
  • Person Precision: 0.9766
  • Person Recall: 0.9766
  • Person F1: 0.9766
  • Person Number: 128
  • Overall Precision: 0.9463
  • Overall Recall: 0.9439
  • Overall F1: 0.9451
  • Overall Accuracy: 0.9868
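The per-entity scores above are entity-level (exact span match) precision/recall/F1 of the kind produced by seqeval over BIO-tagged sequences. A minimal pure-Python sketch of how such scores are computed (the helper names `bio_spans` and `prf1` are illustrative, not part of this model's code):

```python
def bio_spans(tags):
    """Extract (type, start, end) entity spans from a BIO tag sequence."""
    spans = set()
    start = etype = None
    for i, tag in enumerate(list(tags) + ["O"]):  # sentinel closes a trailing entity
        inside = tag.startswith("I-") and tag[2:] == etype and start is not None
        if not inside:
            if start is not None:          # the current entity ends just before i
                spans.add((etype, start, i))
            if tag.startswith("B-"):       # a new entity begins at i
                start, etype = i, tag[2:]
            else:
                start = etype = None
    return spans

def prf1(gold_tags, pred_tags):
    """Micro precision/recall/F1 over exact-match entity spans."""
    gold, pred = bio_spans(gold_tags), bio_spans(pred_tags)
    tp = len(gold & pred)
    p = tp / len(pred) if pred else 0.0
    r = tp / len(gold) if gold else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

p, r, f = prf1(["B-PER", "I-PER", "O", "B-LOC"],
               ["B-PER", "I-PER", "O", "O"])  # one of two gold spans found -> p=1.0, r=0.5
```

Precision is the fraction of predicted spans that exactly match a gold span and recall the fraction of gold spans recovered; the "Number" entries above are the counts of gold spans per entity type (86 locations, 178 organizations, 128 persons).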

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
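The step counts in the results table are consistent with these settings: 9,600 total optimizer steps over 100 epochs gives 96 steps per epoch, which at a train batch size of 16 implies roughly 1,536 training examples (assuming no gradient accumulation and no dropped partial batch). A quick sanity check:

```python
# Cross-check the logged step counts against the hyperparameters above.
total_steps = 9600        # final "Step" value in the results table
num_epochs = 100
train_batch_size = 16

steps_per_epoch = total_steps // num_epochs                  # matches epoch 1 ending at step 96
approx_train_examples = steps_per_epoch * train_batch_size   # ~1536, assuming full batches

print(steps_per_epoch, approx_train_examples)  # -> 96 1536
```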

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8567 | 1.0 | 96 | 0.4086 | 0.0 | 0.0 | 0.0 | 86 | 0.225 | 0.1011 | 0.1395 | 178 | 0.2581 | 0.0625 | 0.1006 | 128 | 0.2321 | 0.0663 | 0.1032 | 0.8507 |
| 0.3561 | 2.0 | 192 | 0.1959 | 0.5 | 0.5814 | 0.5376 | 86 | 0.6272 | 0.5955 | 0.6110 | 178 | 0.8129 | 0.8828 | 0.8464 | 128 | 0.6593 | 0.6862 | 0.6725 | 0.9444 |
| 0.1947 | 3.0 | 288 | 0.1030 | 0.7556 | 0.7907 | 0.7727 | 86 | 0.7801 | 0.8371 | 0.8076 | 178 | 0.9265 | 0.9844 | 0.9545 | 128 | 0.8225 | 0.875 | 0.8480 | 0.9676 |
| 0.1377 | 4.0 | 384 | 0.0873 | 0.7609 | 0.8140 | 0.7865 | 86 | 0.7869 | 0.8090 | 0.7978 | 178 | 0.9538 | 0.9688 | 0.9612 | 128 | 0.8346 | 0.8622 | 0.8482 | 0.9727 |
| 0.1135 | 5.0 | 480 | 0.0728 | 0.7629 | 0.8605 | 0.8087 | 86 | 0.8441 | 0.8820 | 0.8626 | 178 | 0.9538 | 0.9688 | 0.9612 | 128 | 0.8596 | 0.9056 | 0.8820 | 0.9754 |
| 0.0987 | 6.0 | 576 | 0.0659 | 0.8211 | 0.9070 | 0.8619 | 86 | 0.8771 | 0.8820 | 0.8796 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.8930 | 0.9158 | 0.9043 | 0.9789 |
| 0.0878 | 7.0 | 672 | 0.0588 | 0.8462 | 0.8953 | 0.8701 | 86 | 0.8710 | 0.9101 | 0.8901 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.8941 | 0.9260 | 0.9098 | 0.9808 |
| 0.08 | 8.0 | 768 | 0.0554 | 0.8495 | 0.9186 | 0.8827 | 86 | 0.8624 | 0.9157 | 0.8883 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.8968 | 0.9311 | 0.9136 | 0.9814 |
| 0.0735 | 9.0 | 864 | 0.0520 | 0.8387 | 0.9070 | 0.8715 | 86 | 0.9 | 0.9101 | 0.9050 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9080 | 0.9311 | 0.9194 | 0.9841 |
| 0.0645 | 10.0 | 960 | 0.0520 | 0.8632 | 0.9535 | 0.9061 | 86 | 0.9050 | 0.9101 | 0.9076 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9179 | 0.9413 | 0.9295 | 0.9835 |
| 0.0597 | 11.0 | 1056 | 0.0518 | 0.8646 | 0.9651 | 0.9121 | 86 | 0.9205 | 0.9101 | 0.9153 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9246 | 0.9388 | 0.9316 | 0.9849 |
| 0.056 | 12.0 | 1152 | 0.0464 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.8919 | 0.9270 | 0.9091 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9206 | 0.9464 | 0.9333 | 0.9852 |
| 0.0532 | 13.0 | 1248 | 0.0497 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9016 | 0.9270 | 0.9141 | 178 | 0.9841 | 0.9688 | 0.9764 | 128 | 0.9273 | 0.9439 | 0.9355 | 0.9835 |
| 0.0512 | 14.0 | 1344 | 0.0478 | 0.9070 | 0.9070 | 0.9070 | 86 | 0.9162 | 0.9213 | 0.9188 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.9289 | 0.9337 | 0.9313 | 0.9846 |
| 0.0459 | 15.0 | 1440 | 0.0487 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9371 | 0.9213 | 0.9292 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9362 | 0.9362 | 0.9362 | 0.9868 |
| 0.0462 | 16.0 | 1536 | 0.0502 | 0.8454 | 0.9535 | 0.8962 | 86 | 0.9153 | 0.9101 | 0.9127 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9177 | 0.9388 | 0.9281 | 0.9854 |
| 0.044 | 17.0 | 1632 | 0.0525 | 0.8778 | 0.9186 | 0.8977 | 86 | 0.9171 | 0.9326 | 0.9248 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.9225 | 0.9413 | 0.9318 | 0.9852 |
| 0.0424 | 18.0 | 1728 | 0.0555 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9056 | 0.9157 | 0.9106 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9223 | 0.9388 | 0.9305 | 0.9833 |
| 0.0409 | 19.0 | 1824 | 0.0491 | 0.8876 | 0.9186 | 0.9029 | 86 | 0.8950 | 0.9101 | 0.9025 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9171 | 0.9311 | 0.9241 | 0.9846 |
| 0.0373 | 20.0 | 1920 | 0.0617 | 0.7767 | 0.9302 | 0.8466 | 86 | 0.8920 | 0.8820 | 0.8870 | 178 | 0.9683 | 0.9531 | 0.9606 | 128 | 0.8864 | 0.9158 | 0.9009 | 0.9806 |
| 0.0367 | 21.0 | 2016 | 0.0535 | 0.8351 | 0.9419 | 0.8852 | 86 | 0.9112 | 0.8652 | 0.8876 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9137 | 0.9184 | 0.9160 | 0.9841 |
| 0.0351 | 22.0 | 2112 | 0.0571 | 0.81 | 0.9419 | 0.8710 | 86 | 0.9306 | 0.9045 | 0.9174 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9127 | 0.9337 | 0.9231 | 0.9830 |
| 0.0328 | 23.0 | 2208 | 0.0508 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9415 | 0.9045 | 0.9226 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9362 | 0.9362 | 0.9362 | 0.9849 |
| 0.0323 | 24.0 | 2304 | 0.0519 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9222 | 0.9326 | 0.9274 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9254 | 0.9490 | 0.9370 | 0.9846 |
| 0.0298 | 25.0 | 2400 | 0.0498 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9257 | 0.9101 | 0.9178 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9362 | 0.9362 | 0.9362 | 0.9854 |
| 0.0274 | 26.0 | 2496 | 0.0515 | 0.9302 | 0.9302 | 0.9302 | 86 | 0.9081 | 0.9438 | 0.9256 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9323 | 0.9490 | 0.9406 | 0.9857 |
| 0.0292 | 27.0 | 2592 | 0.0561 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9171 | 0.9326 | 0.9248 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9277 | 0.9490 | 0.9382 | 0.9846 |
| 0.0271 | 28.0 | 2688 | 0.0544 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9121 | 0.9326 | 0.9222 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9275 | 0.9464 | 0.9369 | 0.9849 |
| 0.0271 | 29.0 | 2784 | 0.0503 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9022 | 0.9326 | 0.9171 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9229 | 0.9464 | 0.9345 | 0.9862 |
| 0.025 | 30.0 | 2880 | 0.0566 | 0.8367 | 0.9535 | 0.8913 | 86 | 0.9281 | 0.8708 | 0.8986 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9211 | 0.9235 | 0.9223 | 0.9849 |
| 0.0244 | 31.0 | 2976 | 0.0534 | 0.8617 | 0.9419 | 0.9000 | 86 | 0.9191 | 0.8933 | 0.9060 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9241 | 0.9311 | 0.9276 | 0.9857 |
| 0.0241 | 32.0 | 3072 | 0.0611 | 0.82 | 0.9535 | 0.8817 | 86 | 0.9294 | 0.8876 | 0.9080 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9146 | 0.9286 | 0.9215 | 0.9841 |
| 0.0229 | 33.0 | 3168 | 0.0556 | 0.8265 | 0.9419 | 0.8804 | 86 | 0.9294 | 0.8876 | 0.9080 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9165 | 0.9235 | 0.9199 | 0.9843 |
| 0.0227 | 34.0 | 3264 | 0.0545 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9360 | 0.9045 | 0.9200 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9335 | 0.9311 | 0.9323 | 0.9860 |
| 0.0223 | 35.0 | 3360 | 0.0530 | 0.8817 | 0.9535 | 0.9162 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9342 | 0.9413 | 0.9377 | 0.9868 |
| 0.022 | 36.0 | 3456 | 0.0579 | 0.8438 | 0.9419 | 0.8901 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.92 | 0.9388 | 0.9293 | 0.9857 |
| 0.0221 | 37.0 | 3552 | 0.0526 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9322 | 0.9270 | 0.9296 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9865 |
| 0.0213 | 38.0 | 3648 | 0.0572 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.92 | 0.9045 | 0.9122 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9268 | 0.9362 | 0.9315 | 0.9857 |
| 0.0198 | 39.0 | 3744 | 0.0481 | 0.9302 | 0.9302 | 0.9302 | 86 | 0.9032 | 0.9438 | 0.9231 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.93 | 0.9490 | 0.9394 | 0.9873 |
| 0.02 | 40.0 | 3840 | 0.0521 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9338 | 0.9362 | 0.9350 | 0.9868 |
| 0.0182 | 41.0 | 3936 | 0.0510 | 0.8723 | 0.9535 | 0.9111 | 86 | 0.9419 | 0.9101 | 0.9257 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9876 |
| 0.0194 | 42.0 | 4032 | 0.0546 | 0.8791 | 0.9302 | 0.9040 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9388 | 0.9388 | 0.9388 | 0.9860 |
| 0.0164 | 43.0 | 4128 | 0.0566 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9419 | 0.9101 | 0.9257 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9436 | 0.9388 | 0.9412 | 0.9876 |
| 0.0182 | 44.0 | 4224 | 0.0642 | 0.8438 | 0.9419 | 0.8901 | 86 | 0.9345 | 0.8820 | 0.9075 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9235 | 0.9235 | 0.9235 | 0.9849 |
| 0.0171 | 45.0 | 4320 | 0.0596 | 0.8438 | 0.9419 | 0.8901 | 86 | 0.9422 | 0.9157 | 0.9288 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9270 | 0.9388 | 0.9328 | 0.9857 |
| 0.0171 | 46.0 | 4416 | 0.0552 | 0.8283 | 0.9535 | 0.8865 | 86 | 0.9405 | 0.8876 | 0.9133 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9215 | 0.9286 | 0.9250 | 0.9860 |
| 0.0171 | 47.0 | 4512 | 0.0555 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9375 | 0.9270 | 0.9322 | 178 | 0.984 | 0.9609 | 0.9723 | 128 | 0.9413 | 0.9413 | 0.9413 | 0.9879 |
| 0.0143 | 48.0 | 4608 | 0.0627 | 0.8710 | 0.9419 | 0.9050 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9762 | 0.9609 | 0.9685 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9868 |
| 0.016 | 49.0 | 4704 | 0.0568 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9486 | 0.9326 | 0.9405 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9418 | 0.9490 | 0.9454 | 0.9870 |
| 0.0133 | 50.0 | 4800 | 0.0609 | 0.8804 | 0.9419 | 0.9101 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9316 | 0.9388 | 0.9352 | 0.9860 |
| 0.0149 | 51.0 | 4896 | 0.0576 | 0.8696 | 0.9302 | 0.8989 | 86 | 0.9419 | 0.9101 | 0.9257 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9338 | 0.9362 | 0.9350 | 0.9860 |
| 0.0146 | 52.0 | 4992 | 0.0545 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9444 | 0.9541 | 0.9492 | 0.9879 |
| 0.0149 | 53.0 | 5088 | 0.0607 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9389 | 0.9494 | 0.9441 | 178 | 0.9609 | 0.9609 | 0.9609 | 128 | 0.9370 | 0.9490 | 0.9430 | 0.9868 |
| 0.0131 | 54.0 | 5184 | 0.0642 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9535 | 0.9213 | 0.9371 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9439 | 0.9439 | 0.9439 | 0.9862 |
| 0.013 | 55.0 | 5280 | 0.0613 | 0.8587 | 0.9186 | 0.8876 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9313 | 0.9337 | 0.9325 | 0.9854 |
| 0.0138 | 56.0 | 5376 | 0.0549 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9270 | 0.9270 | 0.9270 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9365 | 0.9413 | 0.9389 | 0.9857 |
| 0.013 | 57.0 | 5472 | 0.0631 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9371 | 0.9213 | 0.9292 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9860 |
| 0.0122 | 58.0 | 5568 | 0.0628 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9477 | 0.9157 | 0.9314 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9415 | 0.9439 | 0.9427 | 0.9870 |
| 0.012 | 59.0 | 5664 | 0.0651 | 0.9205 | 0.9419 | 0.9310 | 86 | 0.9432 | 0.9326 | 0.9379 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9464 | 0.9464 | 0.9464 | 0.9868 |
| 0.0122 | 60.0 | 5760 | 0.0651 | 0.8737 | 0.9651 | 0.9171 | 86 | 0.9464 | 0.8933 | 0.9191 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9361 | 0.9337 | 0.9349 | 0.9857 |
| 0.0115 | 61.0 | 5856 | 0.0647 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9492 | 0.9438 | 0.9465 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9444 | 0.9541 | 0.9492 | 0.9876 |
| 0.0121 | 62.0 | 5952 | 0.0635 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9371 | 0.9213 | 0.9292 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9415 | 0.9439 | 0.9427 | 0.9865 |
| 0.011 | 63.0 | 6048 | 0.0638 | 0.9091 | 0.9302 | 0.9195 | 86 | 0.9486 | 0.9326 | 0.9405 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9464 | 0.9464 | 0.9464 | 0.9868 |
| 0.0113 | 64.0 | 6144 | 0.0583 | 0.8913 | 0.9535 | 0.9213 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9389 | 0.9413 | 0.9401 | 0.9868 |
| 0.0107 | 65.0 | 6240 | 0.0674 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9375 | 0.9270 | 0.9322 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9857 |
| 0.0106 | 66.0 | 6336 | 0.0687 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9474 | 0.9101 | 0.9284 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9437 | 0.9413 | 0.9425 | 0.9857 |
| 0.0102 | 67.0 | 6432 | 0.0663 | 0.8901 | 0.9419 | 0.9153 | 86 | 0.9422 | 0.9157 | 0.9288 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9413 | 0.9413 | 0.9413 | 0.9860 |
| 0.0099 | 68.0 | 6528 | 0.0665 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9540 | 0.9326 | 0.9432 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9464 | 0.9464 | 0.9464 | 0.9865 |
| 0.0108 | 69.0 | 6624 | 0.0600 | 0.9195 | 0.9302 | 0.9249 | 86 | 0.9432 | 0.9326 | 0.9379 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9488 | 0.9464 | 0.9476 | 0.9881 |
| 0.0096 | 70.0 | 6720 | 0.0623 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9330 | 0.9382 | 0.9356 | 178 | 0.9685 | 0.9609 | 0.9647 | 128 | 0.9392 | 0.9464 | 0.9428 | 0.9876 |
| 0.0105 | 71.0 | 6816 | 0.0666 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9416 | 0.9464 | 0.9440 | 0.9868 |
| 0.0088 | 72.0 | 6912 | 0.0666 | 0.9111 | 0.9535 | 0.9318 | 86 | 0.9353 | 0.8933 | 0.9138 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9433 | 0.9337 | 0.9385 | 0.9868 |
| 0.0104 | 73.0 | 7008 | 0.0699 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9360 | 0.9045 | 0.9200 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9385 | 0.9337 | 0.9361 | 0.9857 |
| 0.0087 | 74.0 | 7104 | 0.0669 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9371 | 0.9213 | 0.9292 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9389 | 0.9413 | 0.9401 | 0.9865 |
| 0.0093 | 75.0 | 7200 | 0.0710 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9388 | 0.9388 | 0.9388 | 0.9862 |
| 0.0098 | 76.0 | 7296 | 0.0709 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9364 | 0.9101 | 0.9231 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9386 | 0.9362 | 0.9374 | 0.9862 |
| 0.0091 | 77.0 | 7392 | 0.0709 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9532 | 0.9157 | 0.9341 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9463 | 0.9439 | 0.9451 | 0.9868 |
| 0.0082 | 78.0 | 7488 | 0.0711 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9437 | 0.9413 | 0.9425 | 0.9868 |
| 0.0099 | 79.0 | 7584 | 0.0680 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9379 | 0.9326 | 0.9352 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9392 | 0.9464 | 0.9428 | 0.9868 |
| 0.008 | 80.0 | 7680 | 0.0717 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9413 | 0.9413 | 0.9413 | 0.9865 |
| 0.0094 | 81.0 | 7776 | 0.0729 | 0.9111 | 0.9535 | 0.9318 | 86 | 0.9483 | 0.9270 | 0.9375 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9466 | 0.9490 | 0.9478 | 0.9870 |
| 0.0081 | 82.0 | 7872 | 0.0714 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9314 | 0.9157 | 0.9235 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9860 |
| 0.0087 | 83.0 | 7968 | 0.0696 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9368 | 0.9157 | 0.9261 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9415 | 0.9439 | 0.9427 | 0.9868 |
| 0.0072 | 84.0 | 8064 | 0.0713 | 0.9121 | 0.9651 | 0.9379 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9440 | 0.9464 | 0.9452 | 0.9870 |
| 0.0072 | 85.0 | 8160 | 0.0726 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9422 | 0.9157 | 0.9288 | 178 | 0.9764 | 0.9688 | 0.9725 | 128 | 0.9437 | 0.9413 | 0.9425 | 0.9862 |
| 0.0069 | 86.0 | 8256 | 0.0700 | 0.9011 | 0.9535 | 0.9266 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.9340 | 0.9388 | 0.9364 | 0.9860 |
| 0.0075 | 87.0 | 8352 | 0.0742 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9310 | 0.9101 | 0.9205 | 178 | 0.9612 | 0.9688 | 0.9650 | 128 | 0.9362 | 0.9362 | 0.9362 | 0.9857 |
| 0.0081 | 88.0 | 8448 | 0.0733 | 0.8989 | 0.9302 | 0.9143 | 86 | 0.9375 | 0.9270 | 0.9322 | 178 | 0.9690 | 0.9766 | 0.9728 | 128 | 0.9391 | 0.9439 | 0.9415 | 0.9865 |
| 0.0076 | 89.0 | 8544 | 0.0715 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9371 | 0.9213 | 0.9292 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9413 | 0.9413 | 0.9413 | 0.9865 |
| 0.007 | 90.0 | 8640 | 0.0731 | 0.9101 | 0.9419 | 0.9257 | 86 | 0.9429 | 0.9270 | 0.9348 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9439 | 0.9439 | 0.9439 | 0.9868 |
| 0.0067 | 91.0 | 8736 | 0.0735 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9413 | 0.9413 | 0.9413 | 0.9865 |
| 0.0063 | 92.0 | 8832 | 0.0729 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9371 | 0.9213 | 0.9292 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9364 | 0.9388 | 0.9376 | 0.9862 |
| 0.0071 | 93.0 | 8928 | 0.0748 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9413 | 0.9413 | 0.9413 | 0.9865 |
| 0.0069 | 94.0 | 9024 | 0.0724 | 0.8889 | 0.9302 | 0.9091 | 86 | 0.9425 | 0.9213 | 0.9318 | 178 | 0.9688 | 0.9688 | 0.9688 | 128 | 0.9388 | 0.9388 | 0.9388 | 0.9865 |
| 0.0079 | 95.0 | 9120 | 0.0744 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9535 | 0.9213 | 0.9371 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9487 | 0.9439 | 0.9463 | 0.9870 |
| 0.0066 | 96.0 | 9216 | 0.0743 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9463 | 0.9439 | 0.9451 | 0.9868 |
| 0.0077 | 97.0 | 9312 | 0.0732 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9463 | 0.9439 | 0.9451 | 0.9868 |
| 0.0063 | 98.0 | 9408 | 0.0736 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9463 | 0.9439 | 0.9451 | 0.9868 |
| 0.0068 | 99.0 | 9504 | 0.0737 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9463 | 0.9439 | 0.9451 | 0.9868 |
| 0.0077 | 100.0 | 9600 | 0.0738 | 0.9 | 0.9419 | 0.9205 | 86 | 0.9480 | 0.9213 | 0.9345 | 178 | 0.9766 | 0.9766 | 0.9766 | 128 | 0.9463 | 0.9439 | 0.9451 | 0.9868 |

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
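To reproduce this environment, the versions above can be pinned with pip (a sketch; `2.3.0+cu121` denotes a CUDA 12.1 build of PyTorch distributed via the PyTorch wheel index, so the generic `torch==2.3.0` is shown here):

```shell
# Pin the framework versions listed above (generic CPU/CUDA build of torch).
pip install "transformers==4.40.2" "torch==2.3.0" "datasets==2.19.1" "tokenizers==0.19.1"
```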

Model tree for apwic/nerui-pt-pl50-3

Fine-tuned from indolem/indobert-base-uncased.