# nerui-lora-r8-1

This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0379
- Location Precision: 0.9153
- Location Recall: 0.9310
- Location F1: 0.9231
- Location Number: 116
- Organization Precision: 0.9012
- Organization Recall: 0.9241
- Organization F1: 0.9125
- Organization Number: 158
- Person Precision: 0.984
- Person Recall: 0.9919
- Person F1: 0.9880
- Person Number: 124
- Overall Precision: 0.9309
- Overall Recall: 0.9472
- Overall F1: 0.9390
- Overall Accuracy: 0.9868
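
For a quick check of the adapter, the sketch below shows one way to load it for inference. It assumes this repo hosts a PEFT LoRA adapter that also bundles the trained token-classification head, and the BIO label list is a guess based on the three entity types reported above (location, organization, person); check the adapter's config for the actual mapping.

```python
# Minimal inference sketch. Assumptions: the repo is a PEFT LoRA adapter that
# includes the trained classification head, and the BIO label order below is
# a guess -- verify it against the adapter config before relying on it.
import torch
from peft import PeftModel
from transformers import AutoModelForTokenClassification, AutoTokenizer

BASE = "indolem/indobert-base-uncased"
ADAPTER = "apwic/nerui-lora-r8-1"

labels = ["O", "B-LOC", "I-LOC", "B-ORG", "I-ORG", "B-PER", "I-PER"]  # assumed
tokenizer = AutoTokenizer.from_pretrained(BASE)
base_model = AutoModelForTokenClassification.from_pretrained(
    BASE,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
model = PeftModel.from_pretrained(base_model, ADAPTER)
model.eval()

text = "Joko Widodo mengunjungi Jakarta."  # arbitrary Indonesian example
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Print the predicted tag for each wordpiece token.
for token, pred in zip(
    tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]),
    logits.argmax(dim=-1)[0].tolist(),
):
    print(f"{token}\t{labels[pred]}")
```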

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100.0
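
As a rough illustration, the sketch below shows how these hyperparameters could map onto a PEFT + `Trainer` setup. Everything LoRA-specific is an assumption: the rank r=8 is inferred from the model name, while alpha, dropout, label count, and the training data are placeholders not stated in this card.

```python
# Hypothetical reconstruction of the training setup. Only the hyperparameter
# values listed above come from the card; every LoRA-specific setting is an
# assumption, and the datasets are unknown placeholders.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForTokenClassification, Trainer, TrainingArguments

peft_config = LoraConfig(
    task_type=TaskType.TOKEN_CLS,
    r=8,                # inferred from the model name "nerui-lora-r8-1"
    lora_alpha=16,      # assumption: not stated in this card
    lora_dropout=0.1,   # assumption: not stated in this card
    # target_modules left to PEFT's BERT defaults (query/value projections)
)
base_model = AutoModelForTokenClassification.from_pretrained(
    "indolem/indobert-base-uncased",
    num_labels=7,  # assumption: BIO tags over LOC/ORG/PER plus "O"
)
model = get_peft_model(base_model, peft_config)

args = TrainingArguments(
    output_dir="nerui-lora-r8-1",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: the results table is per-epoch
)
# The listed Adam settings (betas=(0.9, 0.999), epsilon=1e-08) are the
# Trainer's defaults, so no optimizer override is needed.
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # data unknown
# trainer.train()
```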

### Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.1502 | 1.0 | 96 | 0.6915 | 0.0 | 0.0 | 0.0 | 116 | 0.0 | 0.0 | 0.0 | 158 | 0.0 | 0.0 | 0.0 | 124 | 0.0 | 0.0 | 0.0 | 0.8394 |
| 0.6681 | 2.0 | 192 | 0.5641 | 0.0 | 0.0 | 0.0 | 116 | 0.5 | 0.0063 | 0.0125 | 158 | 0.0 | 0.0 | 0.0 | 124 | 0.5 | 0.0025 | 0.005 | 0.8397 |
| 0.5591 | 3.0 | 288 | 0.4474 | 0.0 | 0.0 | 0.0 | 116 | 0.4286 | 0.0570 | 0.1006 | 158 | 0.2727 | 0.0484 | 0.0822 | 124 | 0.3333 | 0.0377 | 0.0677 | 0.8471 |
| 0.4414 | 4.0 | 384 | 0.3290 | 0.2692 | 0.0603 | 0.0986 | 116 | 0.3592 | 0.2342 | 0.2835 | 158 | 0.4071 | 0.4597 | 0.4318 | 124 | 0.3755 | 0.2538 | 0.3028 | 0.8847 |
| 0.3301 | 5.0 | 480 | 0.2424 | 0.4459 | 0.2845 | 0.3474 | 116 | 0.4874 | 0.6139 | 0.5434 | 158 | 0.5669 | 0.7177 | 0.6335 | 124 | 0.5093 | 0.5503 | 0.5290 | 0.9256 |
| 0.2536 | 6.0 | 576 | 0.1846 | 0.6372 | 0.6207 | 0.6288 | 116 | 0.6264 | 0.7215 | 0.6706 | 158 | 0.7347 | 0.8710 | 0.7970 | 124 | 0.6652 | 0.7387 | 0.7 | 0.9525 |
| 0.2029 | 7.0 | 672 | 0.1468 | 0.7328 | 0.7328 | 0.7328 | 116 | 0.6778 | 0.7722 | 0.7219 | 158 | 0.8676 | 0.9516 | 0.9077 | 124 | 0.7523 | 0.8166 | 0.7831 | 0.9629 |
| 0.1712 | 8.0 | 768 | 0.1217 | 0.7949 | 0.8017 | 0.7983 | 116 | 0.7356 | 0.8101 | 0.7711 | 158 | 0.9104 | 0.9839 | 0.9457 | 124 | 0.8071 | 0.8618 | 0.8335 | 0.9679 |
| 0.1504 | 9.0 | 864 | 0.1066 | 0.8220 | 0.8362 | 0.8291 | 116 | 0.7630 | 0.8354 | 0.7976 | 158 | 0.9173 | 0.9839 | 0.9494 | 124 | 0.8278 | 0.8819 | 0.8540 | 0.9717 |
| 0.1356 | 10.0 | 960 | 0.0944 | 0.8305 | 0.8448 | 0.8376 | 116 | 0.7917 | 0.8418 | 0.8160 | 158 | 0.9173 | 0.9839 | 0.9494 | 124 | 0.8425 | 0.8869 | 0.8641 | 0.9734 |
| 0.1276 | 11.0 | 1056 | 0.0848 | 0.8305 | 0.8448 | 0.8376 | 116 | 0.8084 | 0.8544 | 0.8308 | 158 | 0.9173 | 0.9839 | 0.9494 | 124 | 0.8493 | 0.8920 | 0.8701 | 0.9745 |
| 0.1202 | 12.0 | 1152 | 0.0797 | 0.8739 | 0.8966 | 0.8851 | 116 | 0.8313 | 0.8734 | 0.8519 | 158 | 0.9173 | 0.9839 | 0.9494 | 124 | 0.8708 | 0.9146 | 0.8922 | 0.9769 |
| 0.1131 | 13.0 | 1248 | 0.0725 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.8274 | 0.8797 | 0.8528 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.8819 | 0.9196 | 0.9004 | 0.9786 |
| 0.1074 | 14.0 | 1344 | 0.0678 | 0.8983 | 0.9138 | 0.9060 | 116 | 0.8625 | 0.8734 | 0.8679 | 158 | 0.9457 | 0.9839 | 0.9644 | 124 | 0.8993 | 0.9196 | 0.9093 | 0.9797 |
| 0.1046 | 15.0 | 1440 | 0.0671 | 0.8618 | 0.9138 | 0.8870 | 116 | 0.8383 | 0.8861 | 0.8615 | 158 | 0.9462 | 0.9919 | 0.9685 | 124 | 0.8786 | 0.9271 | 0.9022 | 0.9786 |
| 0.0992 | 16.0 | 1536 | 0.0648 | 0.8833 | 0.9138 | 0.8983 | 116 | 0.8393 | 0.8924 | 0.8650 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.8873 | 0.9296 | 0.9080 | 0.9800 |
| 0.0972 | 17.0 | 1632 | 0.0611 | 0.9052 | 0.9052 | 0.9052 | 116 | 0.8805 | 0.8861 | 0.8833 | 158 | 0.9531 | 0.9839 | 0.9683 | 124 | 0.9107 | 0.9221 | 0.9164 | 0.9822 |
| 0.0908 | 18.0 | 1728 | 0.0583 | 0.8678 | 0.9052 | 0.8861 | 116 | 0.8720 | 0.9051 | 0.8882 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9002 | 0.9296 | 0.9147 | 0.9822 |
| 0.089 | 19.0 | 1824 | 0.0568 | 0.8678 | 0.9052 | 0.8861 | 116 | 0.8805 | 0.8861 | 0.8833 | 158 | 0.9606 | 0.9839 | 0.9721 | 124 | 0.9017 | 0.9221 | 0.9118 | 0.9816 |
| 0.0872 | 20.0 | 1920 | 0.0591 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.8462 | 0.9051 | 0.8746 | 158 | 0.9535 | 0.9919 | 0.9723 | 124 | 0.8897 | 0.9322 | 0.9104 | 0.9805 |
| 0.0863 | 21.0 | 2016 | 0.0565 | 0.8770 | 0.9224 | 0.8992 | 116 | 0.8421 | 0.9114 | 0.8754 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.8926 | 0.9397 | 0.9155 | 0.9822 |
| 0.0834 | 22.0 | 2112 | 0.0545 | 0.8833 | 0.9138 | 0.8983 | 116 | 0.8471 | 0.9114 | 0.8780 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.8966 | 0.9372 | 0.9165 | 0.9822 |
| 0.0795 | 23.0 | 2208 | 0.0511 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.8667 | 0.9051 | 0.8854 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9024 | 0.9296 | 0.9158 | 0.9835 |
| 0.0815 | 24.0 | 2304 | 0.0501 | 0.8898 | 0.9052 | 0.8974 | 116 | 0.8861 | 0.8861 | 0.8861 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9154 | 0.9246 | 0.92 | 0.9824 |
| 0.0764 | 25.0 | 2400 | 0.0491 | 0.8974 | 0.9052 | 0.9013 | 116 | 0.8727 | 0.9114 | 0.8916 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9118 | 0.9347 | 0.9231 | 0.9844 |
| 0.077 | 26.0 | 2496 | 0.0477 | 0.9052 | 0.9052 | 0.9052 | 116 | 0.8788 | 0.9177 | 0.8978 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9187 | 0.9372 | 0.9279 | 0.9849 |
| 0.0749 | 27.0 | 2592 | 0.0504 | 0.9060 | 0.9138 | 0.9099 | 116 | 0.8855 | 0.9304 | 0.9074 | 158 | 0.9762 | 0.9919 | 0.9840 | 124 | 0.9193 | 0.9447 | 0.9318 | 0.9844 |
| 0.0728 | 28.0 | 2688 | 0.0490 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.8667 | 0.9051 | 0.8854 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9071 | 0.9322 | 0.9195 | 0.9841 |
| 0.0698 | 29.0 | 2784 | 0.0478 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.8727 | 0.9114 | 0.8916 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9095 | 0.9347 | 0.9219 | 0.9841 |
| 0.0694 | 30.0 | 2880 | 0.0466 | 0.9052 | 0.9052 | 0.9052 | 116 | 0.8683 | 0.9177 | 0.8923 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9095 | 0.9347 | 0.9219 | 0.9846 |
| 0.0661 | 31.0 | 2976 | 0.0459 | 0.9060 | 0.9138 | 0.9099 | 116 | 0.8848 | 0.9241 | 0.9040 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9214 | 0.9422 | 0.9317 | 0.9855 |
| 0.0672 | 32.0 | 3072 | 0.0454 | 0.8974 | 0.9052 | 0.9013 | 116 | 0.8659 | 0.8987 | 0.8820 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9113 | 0.9296 | 0.9204 | 0.9849 |
| 0.0663 | 33.0 | 3168 | 0.0459 | 0.8974 | 0.9052 | 0.9013 | 116 | 0.8606 | 0.8987 | 0.8793 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9091 | 0.9296 | 0.9193 | 0.9846 |
| 0.067 | 34.0 | 3264 | 0.0461 | 0.8824 | 0.9052 | 0.8936 | 116 | 0.8667 | 0.9051 | 0.8854 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9071 | 0.9322 | 0.9195 | 0.9841 |
| 0.0628 | 35.0 | 3360 | 0.0449 | 0.9052 | 0.9052 | 0.9052 | 116 | 0.8675 | 0.9114 | 0.8889 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9140 | 0.9347 | 0.9242 | 0.9852 |
| 0.0617 | 36.0 | 3456 | 0.0461 | 0.8992 | 0.9224 | 0.9106 | 116 | 0.8780 | 0.9114 | 0.8944 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9167 | 0.9397 | 0.9280 | 0.9852 |
| 0.0617 | 37.0 | 3552 | 0.0432 | 0.8974 | 0.9052 | 0.9013 | 116 | 0.8788 | 0.9177 | 0.8978 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9165 | 0.9372 | 0.9267 | 0.9855 |
| 0.0603 | 38.0 | 3648 | 0.0430 | 0.8992 | 0.9224 | 0.9106 | 116 | 0.8944 | 0.9114 | 0.9028 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9187 | 0.9372 | 0.9279 | 0.9860 |
| 0.0617 | 39.0 | 3744 | 0.0413 | 0.8974 | 0.9052 | 0.9013 | 116 | 0.8727 | 0.9114 | 0.8916 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9140 | 0.9347 | 0.9242 | 0.9852 |
| 0.0563 | 40.0 | 3840 | 0.0410 | 0.8983 | 0.9138 | 0.9060 | 116 | 0.8827 | 0.9051 | 0.8938 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9185 | 0.9347 | 0.9265 | 0.9855 |
| 0.0579 | 41.0 | 3936 | 0.0427 | 0.9008 | 0.9397 | 0.9198 | 116 | 0.8938 | 0.9051 | 0.8994 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9236 | 0.9422 | 0.9328 | 0.9857 |
| 0.0566 | 42.0 | 4032 | 0.0413 | 0.8926 | 0.9310 | 0.9114 | 116 | 0.8875 | 0.8987 | 0.8931 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9140 | 0.9347 | 0.9242 | 0.9855 |
| 0.0578 | 43.0 | 4128 | 0.0422 | 0.9060 | 0.9138 | 0.9099 | 116 | 0.8944 | 0.9114 | 0.9028 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9208 | 0.9347 | 0.9277 | 0.9860 |
| 0.0567 | 44.0 | 4224 | 0.0414 | 0.9068 | 0.9224 | 0.9145 | 116 | 0.9125 | 0.9241 | 0.9182 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9330 | 0.9447 | 0.9388 | 0.9871 |
| 0.0568 | 45.0 | 4320 | 0.0400 | 0.8926 | 0.9310 | 0.9114 | 116 | 0.8994 | 0.9051 | 0.9022 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9187 | 0.9372 | 0.9279 | 0.9860 |
| 0.053 | 46.0 | 4416 | 0.0409 | 0.9076 | 0.9310 | 0.9191 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9865 |
| 0.0536 | 47.0 | 4512 | 0.0408 | 0.9068 | 0.9224 | 0.9145 | 116 | 0.9018 | 0.9304 | 0.9159 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9865 |
| 0.0519 | 48.0 | 4608 | 0.0401 | 0.8917 | 0.9224 | 0.9068 | 116 | 0.8951 | 0.9177 | 0.9062 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9214 | 0.9422 | 0.9317 | 0.9865 |
| 0.0539 | 49.0 | 4704 | 0.0401 | 0.9068 | 0.9224 | 0.9145 | 116 | 0.9125 | 0.9241 | 0.9182 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9330 | 0.9447 | 0.9388 | 0.9865 |
| 0.0522 | 50.0 | 4800 | 0.0418 | 0.9008 | 0.9397 | 0.9198 | 116 | 0.9 | 0.9114 | 0.9057 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9261 | 0.9447 | 0.9353 | 0.9865 |
| 0.0518 | 51.0 | 4896 | 0.0404 | 0.8992 | 0.9224 | 0.9106 | 116 | 0.9062 | 0.9177 | 0.9119 | 158 | 0.9683 | 0.9839 | 0.976 | 124 | 0.9235 | 0.9397 | 0.9315 | 0.9863 |
| 0.0503 | 52.0 | 4992 | 0.0393 | 0.9138 | 0.9138 | 0.9138 | 116 | 0.8916 | 0.9367 | 0.9136 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9263 | 0.9472 | 0.9366 | 0.9868 |
| 0.0499 | 53.0 | 5088 | 0.0392 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.8963 | 0.9304 | 0.9130 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9312 | 0.9523 | 0.9416 | 0.9876 |
| 0.0498 | 54.0 | 5184 | 0.0393 | 0.9068 | 0.9224 | 0.9145 | 116 | 0.9018 | 0.9304 | 0.9159 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9871 |
| 0.0492 | 55.0 | 5280 | 0.0390 | 0.9068 | 0.9224 | 0.9145 | 116 | 0.9018 | 0.9304 | 0.9159 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9874 |
| 0.0503 | 56.0 | 5376 | 0.0399 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9024 | 0.9367 | 0.9193 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9337 | 0.9548 | 0.9441 | 0.9876 |
| 0.0491 | 57.0 | 5472 | 0.0408 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9024 | 0.9367 | 0.9193 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9337 | 0.9548 | 0.9441 | 0.9876 |
| 0.0492 | 58.0 | 5568 | 0.0387 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9024 | 0.9367 | 0.9193 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9337 | 0.9548 | 0.9441 | 0.9882 |
| 0.0477 | 59.0 | 5664 | 0.0390 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9136 | 0.9367 | 0.9250 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9383 | 0.9548 | 0.9465 | 0.9882 |
| 0.0489 | 60.0 | 5760 | 0.0385 | 0.9244 | 0.9483 | 0.9362 | 116 | 0.9074 | 0.9304 | 0.9187 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9360 | 0.9548 | 0.9453 | 0.9879 |
| 0.0446 | 61.0 | 5856 | 0.0391 | 0.9160 | 0.9397 | 0.9277 | 116 | 0.9177 | 0.9177 | 0.9177 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9378 | 0.9472 | 0.9425 | 0.9865 |
| 0.0463 | 62.0 | 5952 | 0.0402 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9245 | 0.9304 | 0.9274 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9428 | 0.9523 | 0.9475 | 0.9871 |
| 0.0482 | 63.0 | 6048 | 0.0401 | 0.9316 | 0.9397 | 0.9356 | 116 | 0.9130 | 0.9304 | 0.9216 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9404 | 0.9523 | 0.9463 | 0.9868 |
| 0.0455 | 64.0 | 6144 | 0.0387 | 0.9068 | 0.9224 | 0.9145 | 116 | 0.9231 | 0.9114 | 0.9172 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9373 | 0.9397 | 0.9385 | 0.9863 |
| 0.0432 | 65.0 | 6240 | 0.0392 | 0.9231 | 0.9310 | 0.9270 | 116 | 0.9125 | 0.9241 | 0.9182 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9378 | 0.9472 | 0.9425 | 0.9865 |
| 0.0484 | 66.0 | 6336 | 0.0392 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9068 | 0.9241 | 0.9154 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9332 | 0.9472 | 0.9401 | 0.9865 |
| 0.044 | 67.0 | 6432 | 0.0385 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9068 | 0.9241 | 0.9154 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9332 | 0.9472 | 0.9401 | 0.9865 |
| 0.0425 | 68.0 | 6528 | 0.0386 | 0.9237 | 0.9397 | 0.9316 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9333 | 0.9497 | 0.9415 | 0.9871 |
| 0.044 | 69.0 | 6624 | 0.0381 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9309 | 0.9472 | 0.9390 | 0.9865 |
| 0.0447 | 70.0 | 6720 | 0.0381 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9074 | 0.9304 | 0.9187 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9333 | 0.9497 | 0.9415 | 0.9868 |
| 0.0439 | 71.0 | 6816 | 0.0389 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9074 | 0.9304 | 0.9187 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9333 | 0.9497 | 0.9415 | 0.9871 |
| 0.0426 | 72.0 | 6912 | 0.0383 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9006 | 0.9177 | 0.9091 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9307 | 0.9447 | 0.9377 | 0.9860 |
| 0.0423 | 73.0 | 7008 | 0.0387 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9074 | 0.9304 | 0.9187 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9333 | 0.9497 | 0.9415 | 0.9871 |
| 0.0427 | 74.0 | 7104 | 0.0385 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9309 | 0.9472 | 0.9390 | 0.9871 |
| 0.044 | 75.0 | 7200 | 0.0387 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9018 | 0.9304 | 0.9159 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9310 | 0.9497 | 0.9403 | 0.9871 |
| 0.0415 | 76.0 | 7296 | 0.0386 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9018 | 0.9304 | 0.9159 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9310 | 0.9497 | 0.9403 | 0.9874 |
| 0.0421 | 77.0 | 7392 | 0.0385 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9068 | 0.9241 | 0.9154 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9332 | 0.9472 | 0.9401 | 0.9874 |
| 0.0428 | 78.0 | 7488 | 0.0380 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9125 | 0.9241 | 0.9182 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9355 | 0.9472 | 0.9413 | 0.9871 |
| 0.0414 | 79.0 | 7584 | 0.0385 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8896 | 0.9177 | 0.9034 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9261 | 0.9447 | 0.9353 | 0.9865 |
| 0.0394 | 80.0 | 7680 | 0.0380 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8841 | 0.9177 | 0.9006 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9238 | 0.9447 | 0.9342 | 0.9868 |
| 0.0402 | 81.0 | 7776 | 0.0385 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8963 | 0.9304 | 0.9130 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9287 | 0.9497 | 0.9391 | 0.9876 |
| 0.0404 | 82.0 | 7872 | 0.0377 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8957 | 0.9241 | 0.9097 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9874 |
| 0.0407 | 83.0 | 7968 | 0.0381 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8957 | 0.9241 | 0.9097 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9871 |
| 0.0406 | 84.0 | 8064 | 0.0380 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8902 | 0.9241 | 0.9068 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9263 | 0.9472 | 0.9366 | 0.9874 |
| 0.0425 | 85.0 | 8160 | 0.0381 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8963 | 0.9304 | 0.9130 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9287 | 0.9497 | 0.9391 | 0.9876 |
| 0.0402 | 86.0 | 8256 | 0.0374 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9062 | 0.9177 | 0.9119 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9330 | 0.9447 | 0.9388 | 0.9868 |
| 0.0402 | 87.0 | 8352 | 0.0379 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9309 | 0.9472 | 0.9390 | 0.9871 |
| 0.0407 | 88.0 | 8448 | 0.0380 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8902 | 0.9241 | 0.9068 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9263 | 0.9472 | 0.9366 | 0.9874 |
| 0.0385 | 89.0 | 8544 | 0.0379 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8957 | 0.9241 | 0.9097 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9871 |
| 0.0388 | 90.0 | 8640 | 0.0377 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8957 | 0.9241 | 0.9097 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9871 |
| 0.0406 | 91.0 | 8736 | 0.0380 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8957 | 0.9241 | 0.9097 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9871 |
| 0.0404 | 92.0 | 8832 | 0.0377 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8957 | 0.9241 | 0.9097 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9871 |
| 0.0409 | 93.0 | 8928 | 0.0377 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8951 | 0.9177 | 0.9062 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9284 | 0.9447 | 0.9365 | 0.9865 |
| 0.0382 | 94.0 | 9024 | 0.0380 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8957 | 0.9241 | 0.9097 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9286 | 0.9472 | 0.9378 | 0.9871 |
| 0.0409 | 95.0 | 9120 | 0.0379 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9309 | 0.9472 | 0.9390 | 0.9868 |
| 0.0406 | 96.0 | 9216 | 0.0379 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9309 | 0.9472 | 0.9390 | 0.9868 |
| 0.0413 | 97.0 | 9312 | 0.0378 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.8951 | 0.9177 | 0.9062 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9284 | 0.9447 | 0.9365 | 0.9865 |
| 0.0384 | 98.0 | 9408 | 0.0379 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9309 | 0.9472 | 0.9390 | 0.9868 |
| 0.0394 | 99.0 | 9504 | 0.0379 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9309 | 0.9472 | 0.9390 | 0.9868 |
| 0.0386 | 100.0 | 9600 | 0.0379 | 0.9153 | 0.9310 | 0.9231 | 116 | 0.9012 | 0.9241 | 0.9125 | 158 | 0.984 | 0.9919 | 0.9880 | 124 | 0.9309 | 0.9472 | 0.9390 | 0.9868 |
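
For context on how to read the table, per-entity precision, recall, and F1 for NER are conventionally computed at the entity level, for example with seqeval: a predicted span counts as correct only when both its boundaries and its type match the reference. A toy example (not this model's outputs):

```python
# Toy seqeval example showing how entity-level precision/recall/F1 are scored;
# the sequences here are illustrative, not outputs of this model.
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

y_true = [["B-PER", "I-PER", "O", "B-LOC"], ["B-ORG", "O"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC"], ["B-ORG", "B-LOC"]]

# 3 of the 4 predicted entities match a gold span exactly -> precision 0.75;
# all 3 gold entities are recovered -> recall 1.0.
print(precision_score(y_true, y_pred))        # 0.75
print(recall_score(y_true, y_pred))           # 1.0
print(f1_score(y_true, y_pred))               # ~0.857
print(classification_report(y_true, y_pred))  # per-type LOC/ORG/PER breakdown
```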

### Framework versions

- Transformers 4.39.3
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.15.2