nerui-pt-pl10-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0714
  • Location Precision: 0.91
  • Location Recall: 0.9681
  • Location F1: 0.9381
  • Location Number: 94
  • Organization Precision: 0.9255
  • Organization Recall: 0.8922
  • Organization F1: 0.9085
  • Organization Number: 167
  • Person Precision: 0.9781
  • Person Recall: 0.9781
  • Person F1: 0.9781
  • Person Number: 137
  • Overall Precision: 0.9397
  • Overall Recall: 0.9397
  • Overall F1: 0.9397
  • Overall Accuracy: 0.9876
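
A usage snippet is not included in this card. The sketch below shows one way to load the checkpoint for inference with the transformers token-classification pipeline; the repository id apwic/nerui-pt-pl10-0 is taken from the model tree section at the end of this card, and the example sentence is illustrative only (the card reports Location, Organization, and Person classes but does not list the exact label names).

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "apwic/nerui-pt-pl10-0"  # repo id from the model tree section (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group sub-word predictions into whole entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Illustrative Indonesian sentence; expected entities: a person, an organization, a location.
print(ner("Joko Widodo mengunjungi kantor Pertamina di Jakarta."))
```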

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
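
The training script itself is not part of this card. As a rough sketch, the hyperparameters above map onto a standard Trainer setup roughly as follows; dataset loading, tokenization, and label alignment are omitted, and the label count of 7 is an assumption for a BIO scheme over the three entity types.

```python
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          DataCollatorForTokenClassification, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "indolem/indobert-base-uncased",
    num_labels=7,  # assumption: O plus B-/I- tags for Location, Organization, Person
)

train_dataset = ...  # tokenized, label-aligned NER split (not included in this card)
eval_dataset = ...   # evaluation split reported in the table below

args = TrainingArguments(
    output_dir="nerui-pt-pl10-0",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=100.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # the table below reports metrics once per epoch
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```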

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.847 | 1.0 | 96 | 0.4058 | 0.1667 | 0.0106 | 0.02 | 94 | 0.2152 | 0.3054 | 0.2525 | 167 | 0.2315 | 0.3650 | 0.2833 | 137 | 0.2222 | 0.2563 | 0.2380 | 0.8622 |
| 0.3603 | 2.0 | 192 | 0.2120 | 0.4038 | 0.4468 | 0.4242 | 94 | 0.5450 | 0.6886 | 0.6085 | 167 | 0.8038 | 0.9270 | 0.8610 | 137 | 0.6004 | 0.7136 | 0.6521 | 0.9398 |
| 0.1884 | 3.0 | 288 | 0.0928 | 0.7938 | 0.8191 | 0.8063 | 94 | 0.7765 | 0.8323 | 0.8035 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8413 | 0.8794 | 0.8600 | 0.9707 |
| 0.1293 | 4.0 | 384 | 0.0661 | 0.8367 | 0.8723 | 0.8542 | 94 | 0.8448 | 0.8802 | 0.8622 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.8897 | 0.9121 | 0.9007 | 0.9796 |
| 0.1088 | 5.0 | 480 | 0.0556 | 0.8660 | 0.8936 | 0.8796 | 94 | 0.8538 | 0.8743 | 0.8639 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9820 |
| 0.0895 | 6.0 | 576 | 0.0537 | 0.7965 | 0.9574 | 0.8696 | 94 | 0.8772 | 0.8982 | 0.8876 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8950 | 0.9422 | 0.9180 | 0.9831 |
| 0.0849 | 7.0 | 672 | 0.0548 | 0.8037 | 0.9149 | 0.8557 | 94 | 0.8882 | 0.9042 | 0.8961 | 167 | 0.9925 | 0.9708 | 0.9815 | 137 | 0.9002 | 0.9296 | 0.9147 | 0.9812 |
| 0.0802 | 8.0 | 768 | 0.0499 | 0.8165 | 0.9468 | 0.8768 | 94 | 0.8563 | 0.8563 | 0.8563 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8929 | 0.9221 | 0.9073 | 0.9829 |
| 0.0711 | 9.0 | 864 | 0.0462 | 0.8396 | 0.9468 | 0.89 | 94 | 0.8982 | 0.8982 | 0.8982 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9167 | 0.9397 | 0.9280 | 0.9848 |
| 0.0656 | 10.0 | 960 | 0.0529 | 0.8182 | 0.9574 | 0.8824 | 94 | 0.8728 | 0.9042 | 0.8882 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.8929 | 0.9422 | 0.9169 | 0.9826 |
| 0.0637 | 11.0 | 1056 | 0.0466 | 0.8257 | 0.9574 | 0.8867 | 94 | 0.8957 | 0.8743 | 0.8848 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9091 | 0.9296 | 0.9193 | 0.9843 |
| 0.0544 | 12.0 | 1152 | 0.0485 | 0.8241 | 0.9468 | 0.8812 | 94 | 0.9193 | 0.8862 | 0.9024 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9160 | 0.9322 | 0.9240 | 0.9845 |
| 0.0521 | 13.0 | 1248 | 0.0439 | 0.8476 | 0.9468 | 0.8945 | 94 | 0.9325 | 0.9102 | 0.9212 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9305 | 0.9422 | 0.9363 | 0.9870 |
| 0.0535 | 14.0 | 1344 | 0.0429 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.8902 | 0.9222 | 0.9059 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9148 | 0.9447 | 0.9295 | 0.9856 |
| 0.0502 | 15.0 | 1440 | 0.0447 | 0.8889 | 0.9362 | 0.9119 | 94 | 0.9264 | 0.9042 | 0.9152 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9372 | 0.9372 | 0.9372 | 0.9876 |
| 0.0473 | 16.0 | 1536 | 0.0394 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9157 | 0.9102 | 0.9129 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9398 | 0.9422 | 0.9410 | 0.9881 |
| 0.0459 | 17.0 | 1632 | 0.0450 | 0.8411 | 0.9574 | 0.8955 | 94 | 0.9207 | 0.9042 | 0.9124 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9261 | 0.9447 | 0.9353 | 0.9848 |
| 0.046 | 18.0 | 1728 | 0.0511 | 0.8571 | 0.9574 | 0.9045 | 94 | 0.9434 | 0.8982 | 0.9202 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9327 | 0.9397 | 0.9362 | 0.9862 |
| 0.042 | 19.0 | 1824 | 0.0516 | 0.8654 | 0.9574 | 0.9091 | 94 | 0.9136 | 0.8862 | 0.8997 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9231 | 0.9347 | 0.9288 | 0.9851 |
| 0.0423 | 20.0 | 1920 | 0.0431 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9375 | 0.9422 | 0.9398 | 0.9867 |
| 0.0406 | 21.0 | 2016 | 0.0439 | 0.8846 | 0.9787 | 0.9293 | 94 | 0.9212 | 0.9102 | 0.9157 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9381 | 0.9523 | 0.9451 | 0.9867 |
| 0.0362 | 22.0 | 2112 | 0.0427 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9030 | 0.8922 | 0.8976 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9279 | 0.9372 | 0.9325 | 0.9867 |
| 0.0356 | 23.0 | 2208 | 0.0527 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9313 | 0.8922 | 0.9113 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9348 | 0.9372 | 0.9360 | 0.9878 |
| 0.0334 | 24.0 | 2304 | 0.0499 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9563 | 0.9162 | 0.9358 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9520 | 0.9472 | 0.9496 | 0.9884 |
| 0.0357 | 25.0 | 2400 | 0.0487 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9030 | 0.8922 | 0.8976 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9277 | 0.9347 | 0.9312 | 0.9851 |
| 0.035 | 26.0 | 2496 | 0.0482 | 0.8889 | 0.9362 | 0.9119 | 94 | 0.9096 | 0.9042 | 0.9069 | 167 | 0.9640 | 0.9781 | 0.9710 | 137 | 0.9233 | 0.9372 | 0.9302 | 0.9859 |
| 0.0322 | 27.0 | 2592 | 0.0473 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9212 | 0.9102 | 0.9157 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9373 | 0.9397 | 0.9385 | 0.9870 |
| 0.0313 | 28.0 | 2688 | 0.0500 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9873 |
| 0.029 | 29.0 | 2784 | 0.0520 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9560 | 0.9102 | 0.9325 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9521 | 0.9497 | 0.9509 | 0.9878 |
| 0.0282 | 30.0 | 2880 | 0.0583 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.8947 | 0.9162 | 0.9053 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9218 | 0.9472 | 0.9343 | 0.9848 |
| 0.0298 | 31.0 | 2976 | 0.0511 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9367 | 0.8862 | 0.9108 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9466 | 0.9347 | 0.9406 | 0.9870 |
| 0.0288 | 32.0 | 3072 | 0.0567 | 0.8762 | 0.9787 | 0.9246 | 94 | 0.9375 | 0.8982 | 0.9174 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9353 | 0.9447 | 0.94 | 0.9862 |
| 0.0266 | 33.0 | 3168 | 0.0538 | 0.8980 | 0.9362 | 0.9167 | 94 | 0.9317 | 0.8982 | 0.9146 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9394 | 0.9347 | 0.9370 | 0.9876 |
| 0.0257 | 34.0 | 3264 | 0.0523 | 0.9271 | 0.9468 | 0.9368 | 94 | 0.9012 | 0.9281 | 0.9145 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9356 | 0.9497 | 0.9426 | 0.9873 |
| 0.026 | 35.0 | 3360 | 0.0527 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9068 | 0.8743 | 0.8902 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9296 | 0.9296 | 0.9296 | 0.9856 |
| 0.0256 | 36.0 | 3456 | 0.0504 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9329 | 0.9162 | 0.9245 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9449 | 0.9472 | 0.9460 | 0.9881 |
| 0.023 | 37.0 | 3552 | 0.0554 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9277 | 0.9222 | 0.9249 | 167 | 0.9710 | 0.9781 | 0.9745 | 137 | 0.9332 | 0.9472 | 0.9401 | 0.9870 |
| 0.0225 | 38.0 | 3648 | 0.0492 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9394 | 0.9281 | 0.9337 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9476 | 0.9548 | 0.9512 | 0.9884 |
| 0.0229 | 39.0 | 3744 | 0.0517 | 0.8738 | 0.9574 | 0.9137 | 94 | 0.9255 | 0.8922 | 0.9085 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9348 | 0.9372 | 0.9360 | 0.9873 |
| 0.0207 | 40.0 | 3840 | 0.0522 | 0.8812 | 0.9468 | 0.9128 | 94 | 0.9162 | 0.9162 | 0.9162 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9284 | 0.9447 | 0.9365 | 0.9867 |
| 0.0235 | 41.0 | 3936 | 0.0560 | 0.875 | 0.9681 | 0.9192 | 94 | 0.9042 | 0.9042 | 0.9042 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9240 | 0.9472 | 0.9355 | 0.9862 |
| 0.0199 | 42.0 | 4032 | 0.0530 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9375 | 0.8982 | 0.9174 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9447 | 0.9447 | 0.9447 | 0.9881 |
| 0.0204 | 43.0 | 4128 | 0.0575 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9 | 0.9162 | 0.9080 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9286 | 0.9472 | 0.9378 | 0.9859 |
| 0.0201 | 44.0 | 4224 | 0.0620 | 0.8585 | 0.9681 | 0.91 | 94 | 0.9231 | 0.8623 | 0.8916 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9295 | 0.9271 | 0.9283 | 0.9856 |
| 0.0202 | 45.0 | 4320 | 0.0497 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9152 | 0.9042 | 0.9096 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9377 | 0.9447 | 0.9412 | 0.9873 |
| 0.0199 | 46.0 | 4416 | 0.0473 | 0.8990 | 0.9468 | 0.9223 | 94 | 0.9268 | 0.9102 | 0.9184 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.94 | 0.9447 | 0.9424 | 0.9881 |
| 0.0194 | 47.0 | 4512 | 0.0551 | 0.8679 | 0.9787 | 0.9200 | 94 | 0.9207 | 0.9042 | 0.9124 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9310 | 0.9497 | 0.9403 | 0.9862 |
| 0.0177 | 48.0 | 4608 | 0.0576 | 0.9010 | 0.9681 | 0.9333 | 94 | 0.9202 | 0.8982 | 0.9091 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9377 | 0.9447 | 0.9412 | 0.9867 |
| 0.0171 | 49.0 | 4704 | 0.0667 | 0.875 | 0.9681 | 0.9192 | 94 | 0.9255 | 0.8922 | 0.9085 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9328 | 0.9422 | 0.9375 | 0.9859 |
| 0.0197 | 50.0 | 4800 | 0.0641 | 0.8846 | 0.9787 | 0.9293 | 94 | 0.9198 | 0.8922 | 0.9058 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9330 | 0.9447 | 0.9388 | 0.9851 |
| 0.0166 | 51.0 | 4896 | 0.0642 | 0.875 | 0.9681 | 0.9192 | 94 | 0.9313 | 0.8922 | 0.9113 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9352 | 0.9422 | 0.9387 | 0.9870 |
| 0.0178 | 52.0 | 4992 | 0.0590 | 0.8969 | 0.9255 | 0.9110 | 94 | 0.9048 | 0.9102 | 0.9075 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9279 | 0.9372 | 0.9325 | 0.9867 |
| 0.0175 | 53.0 | 5088 | 0.0673 | 0.8835 | 0.9681 | 0.9239 | 94 | 0.9308 | 0.8862 | 0.9080 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9348 | 0.9372 | 0.9360 | 0.9870 |
| 0.0161 | 54.0 | 5184 | 0.0588 | 0.8738 | 0.9574 | 0.9137 | 94 | 0.9136 | 0.8862 | 0.8997 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9279 | 0.9372 | 0.9325 | 0.9867 |
| 0.0155 | 55.0 | 5280 | 0.0625 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.8916 | 0.8862 | 0.8889 | 167 | 0.9854 | 0.9854 | 0.9854 | 137 | 0.9233 | 0.9372 | 0.9302 | 0.9865 |
| 0.017 | 56.0 | 5376 | 0.0595 | 0.8725 | 0.9468 | 0.9082 | 94 | 0.9091 | 0.8982 | 0.9036 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9280 | 0.9397 | 0.9338 | 0.9865 |
| 0.016 | 57.0 | 5472 | 0.0590 | 0.8835 | 0.9681 | 0.9239 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.94 | 0.9447 | 0.9424 | 0.9881 |
| 0.0148 | 58.0 | 5568 | 0.0595 | 0.9082 | 0.9468 | 0.9271 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9444 | 0.9397 | 0.9421 | 0.9881 |
| 0.0148 | 59.0 | 5664 | 0.0575 | 0.9184 | 0.9574 | 0.9375 | 94 | 0.9268 | 0.9102 | 0.9184 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9424 | 0.9447 | 0.9435 | 0.9881 |
| 0.0121 | 60.0 | 5760 | 0.0620 | 0.8969 | 0.9255 | 0.9110 | 94 | 0.9167 | 0.9222 | 0.9194 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9328 | 0.9422 | 0.9375 | 0.9865 |
| 0.0159 | 61.0 | 5856 | 0.0629 | 0.9072 | 0.9362 | 0.9215 | 94 | 0.9162 | 0.9162 | 0.9162 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9352 | 0.9422 | 0.9387 | 0.9870 |
| 0.0155 | 62.0 | 5952 | 0.0692 | 0.9010 | 0.9681 | 0.9333 | 94 | 0.9317 | 0.8982 | 0.9146 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9398 | 0.9422 | 0.9410 | 0.9870 |
| 0.0143 | 63.0 | 6048 | 0.0656 | 0.8776 | 0.9149 | 0.8958 | 94 | 0.9048 | 0.9102 | 0.9075 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9231 | 0.9347 | 0.9288 | 0.9859 |
| 0.0137 | 64.0 | 6144 | 0.0657 | 0.9278 | 0.9574 | 0.9424 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9444 | 0.9397 | 0.9421 | 0.9870 |
| 0.0135 | 65.0 | 6240 | 0.0683 | 0.8969 | 0.9255 | 0.9110 | 94 | 0.9268 | 0.9102 | 0.9184 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9372 | 0.9372 | 0.9372 | 0.9859 |
| 0.0128 | 66.0 | 6336 | 0.0657 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.9373 | 0.9397 | 0.9385 | 0.9876 |
| 0.0142 | 67.0 | 6432 | 0.0651 | 0.9278 | 0.9574 | 0.9424 | 94 | 0.9329 | 0.9162 | 0.9245 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9472 | 0.9472 | 0.9472 | 0.9881 |
| 0.0127 | 68.0 | 6528 | 0.0668 | 0.9062 | 0.9255 | 0.9158 | 94 | 0.9157 | 0.9102 | 0.9129 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9372 | 0.9372 | 0.9372 | 0.9867 |
| 0.0129 | 69.0 | 6624 | 0.0631 | 0.9010 | 0.9681 | 0.9333 | 94 | 0.925 | 0.8862 | 0.9052 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9370 | 0.9347 | 0.9358 | 0.9878 |
| 0.0125 | 70.0 | 6720 | 0.0636 | 0.89 | 0.9468 | 0.9175 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9348 | 0.9372 | 0.9360 | 0.9873 |
| 0.0119 | 71.0 | 6816 | 0.0631 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9390 | 0.9222 | 0.9305 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9475 | 0.9523 | 0.9499 | 0.9887 |
| 0.0111 | 72.0 | 6912 | 0.0621 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9222 | 0.9222 | 0.9222 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9403 | 0.9497 | 0.9450 | 0.9881 |
| 0.0121 | 73.0 | 7008 | 0.0617 | 0.9271 | 0.9468 | 0.9368 | 94 | 0.9333 | 0.9222 | 0.9277 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9890 |
| 0.0114 | 74.0 | 7104 | 0.0641 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9212 | 0.9102 | 0.9157 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.94 | 0.9447 | 0.9424 | 0.9884 |
| 0.0109 | 75.0 | 7200 | 0.0660 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9273 | 0.9162 | 0.9217 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9403 | 0.9497 | 0.9450 | 0.9878 |
| 0.0105 | 76.0 | 7296 | 0.0663 | 0.9368 | 0.9468 | 0.9418 | 94 | 0.9217 | 0.9162 | 0.9189 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9878 |
| 0.01 | 77.0 | 7392 | 0.0678 | 0.9091 | 0.9574 | 0.9326 | 94 | 0.9146 | 0.8982 | 0.9063 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.935 | 0.9397 | 0.9373 | 0.9865 |
| 0.0102 | 78.0 | 7488 | 0.0671 | 0.9175 | 0.9468 | 0.9319 | 94 | 0.9096 | 0.9042 | 0.9069 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9876 |
| 0.0095 | 79.0 | 7584 | 0.0655 | 0.9278 | 0.9574 | 0.9424 | 94 | 0.9222 | 0.9222 | 0.9222 | 167 | 0.9852 | 0.9708 | 0.9779 | 137 | 0.9449 | 0.9472 | 0.9460 | 0.9890 |
| 0.0104 | 80.0 | 7680 | 0.0713 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.9207 | 0.9042 | 0.9124 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9231 | 0.9347 | 0.9288 | 0.9859 |
| 0.011 | 81.0 | 7776 | 0.0678 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9152 | 0.9042 | 0.9096 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9375 | 0.9422 | 0.9398 | 0.9876 |
| 0.0102 | 82.0 | 7872 | 0.0693 | 0.9 | 0.9574 | 0.9278 | 94 | 0.9212 | 0.9102 | 0.9157 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9377 | 0.9447 | 0.9412 | 0.9873 |
| 0.0092 | 83.0 | 7968 | 0.0683 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9096 | 0.9042 | 0.9069 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9280 | 0.9397 | 0.9338 | 0.9867 |
| 0.009 | 84.0 | 8064 | 0.0700 | 0.8911 | 0.9574 | 0.9231 | 94 | 0.9152 | 0.9042 | 0.9096 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9328 | 0.9422 | 0.9375 | 0.9867 |
| 0.0087 | 85.0 | 8160 | 0.0667 | 0.9278 | 0.9574 | 0.9424 | 94 | 0.9387 | 0.9162 | 0.9273 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9520 | 0.9472 | 0.9496 | 0.9887 |
| 0.0091 | 86.0 | 8256 | 0.0672 | 0.9192 | 0.9681 | 0.9430 | 94 | 0.9379 | 0.9042 | 0.9207 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9471 | 0.9447 | 0.9459 | 0.9887 |
| 0.0101 | 87.0 | 8352 | 0.0689 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9152 | 0.9042 | 0.9096 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9280 | 0.9397 | 0.9338 | 0.9867 |
| 0.009 | 88.0 | 8448 | 0.0680 | 0.9192 | 0.9681 | 0.9430 | 94 | 0.9264 | 0.9042 | 0.9152 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9422 | 0.9422 | 0.9422 | 0.9881 |
| 0.0102 | 89.0 | 8544 | 0.0705 | 0.8922 | 0.9681 | 0.9286 | 94 | 0.9198 | 0.8922 | 0.9058 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9327 | 0.9397 | 0.9362 | 0.9870 |
| 0.0082 | 90.0 | 8640 | 0.0693 | 0.9010 | 0.9681 | 0.9333 | 94 | 0.9146 | 0.8982 | 0.9063 | 167 | 0.9708 | 0.9708 | 0.9708 | 137 | 0.9303 | 0.9397 | 0.9350 | 0.9870 |
| 0.01 | 91.0 | 8736 | 0.0710 | 0.9010 | 0.9681 | 0.9333 | 94 | 0.9259 | 0.8982 | 0.9119 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9375 | 0.9422 | 0.9398 | 0.9876 |
| 0.0072 | 92.0 | 8832 | 0.0715 | 0.8824 | 0.9574 | 0.9184 | 94 | 0.9264 | 0.9042 | 0.9152 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9328 | 0.9422 | 0.9375 | 0.9873 |
| 0.007 | 93.0 | 8928 | 0.0720 | 0.9010 | 0.9681 | 0.9333 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.94 | 0.9447 | 0.9424 | 0.9876 |
| 0.0083 | 94.0 | 9024 | 0.0720 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9375 | 0.8982 | 0.9174 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9446 | 0.9422 | 0.9434 | 0.9878 |
| 0.0085 | 95.0 | 9120 | 0.0705 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9317 | 0.8982 | 0.9146 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9422 | 0.9422 | 0.9422 | 0.9876 |
| 0.0087 | 96.0 | 9216 | 0.0711 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9321 | 0.9042 | 0.9179 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9424 | 0.9447 | 0.9435 | 0.9878 |
| 0.0079 | 97.0 | 9312 | 0.0725 | 0.91 | 0.9681 | 0.9381 | 94 | 0.925 | 0.8862 | 0.9052 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9395 | 0.9372 | 0.9384 | 0.9873 |
| 0.0072 | 98.0 | 9408 | 0.0718 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9255 | 0.8922 | 0.9085 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9876 |
| 0.0087 | 99.0 | 9504 | 0.0717 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9255 | 0.8922 | 0.9085 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9876 |
| 0.0086 | 100.0 | 9600 | 0.0714 | 0.91 | 0.9681 | 0.9381 | 94 | 0.9255 | 0.8922 | 0.9085 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.9397 | 0.9397 | 0.9397 | 0.9876 |
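
The precision, recall, and F1 columns above are entity-level (span-level) scores, and the Number columns give the count of gold entities per class in the evaluation set. The card does not state which metric implementation was used; below is a minimal sketch with the seqeval package, which is commonly used for this kind of report (the BIO labels shown are illustrative, not taken from the actual evaluation data).

```python
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

# Hypothetical BIO-tagged reference and prediction sequences.
y_true = [["B-PERSON", "I-PERSON", "O", "B-ORGANIZATION", "O", "B-LOCATION"]]
y_pred = [["B-PERSON", "I-PERSON", "O", "B-ORGANIZATION", "O", "O"]]

print(precision_score(y_true, y_pred))        # overall entity-level precision
print(recall_score(y_true, y_pred))           # overall entity-level recall
print(f1_score(y_true, y_pred))               # overall entity-level F1
print(classification_report(y_true, y_pred))  # per-class breakdown (LOCATION, ORGANIZATION, PERSON)
```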

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
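
To approximate the environment used for training, the versions above can be installed and verified; a small sanity-check sketch is below (the +cu121 suffix on the PyTorch version refers to the CUDA 12.1 build).

```python
import datasets
import tokenizers
import torch
import transformers

# Versions listed in this card.
expected = {
    "transformers": "4.39.3",
    "torch": "2.3.0+cu121",
    "datasets": "2.19.1",
    "tokenizers": "0.15.2",
}

for name, module in [("transformers", transformers), ("torch", torch),
                     ("datasets", datasets), ("tokenizers", tokenizers)]:
    print(f"{name}: installed {module.__version__}, card used {expected[name]}")
```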

Model tree for apwic/nerui-pt-pl10-0

This model is fine-tuned from indolem/indobert-base-uncased.