# bert-base-NER-finetuned-ner

This model is a fine-tuned version of [dslim/bert-base-NER](https://huggingface.co/dslim/bert-base-NER) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.9704
- Class 0 precision: 0.9706
- Class 0 recall: 0.9413
- Class 0 F1-score: 0.9558
- Class 1 precision: 0.8027
- Class 1 recall: 0.9205
- Class 1 F1-score: 0.8575
- Class 2 precision: 0.7853
- Class 2 recall: 0.8165
- Class 2 F1-score: 0.8006
- Class 3 precision: 0.7817
- Class 3 recall: 0.8603
- Class 3 F1-score: 0.8191
- Accuracy: 0.9272
- Macro avg precision: 0.8351
- Macro avg recall: 0.8847
- Macro avg F1-score: 0.8583
- Weighted avg precision: 0.9313
- Weighted avg recall: 0.9272
- Weighted avg F1-score: 0.9285
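
For context, the macro averages are unweighted means of the four per-class scores, while the weighted averages weight each class by its support (per-class support is not reported on this card). A quick check in Python against the numbers above:

```python
# Per-class scores copied from the evaluation list above (classes 0-3).
precision = [0.9706, 0.8027, 0.7853, 0.7817]
recall    = [0.9413, 0.9205, 0.8165, 0.8603]
f1        = [0.9558, 0.8575, 0.8006, 0.8191]

def macro(scores):
    """Macro average: unweighted mean over classes."""
    return sum(scores) / len(scores)

print(f"macro precision: {macro(precision):.4f}")  # reported: 0.8351
print(f"macro recall:    {macro(recall):.4f}")     # reported: 0.8847
print(f"macro F1:        {macro(f1):.4f}")         # reported: 0.8583
```

The recomputed means match the reported macro averages up to rounding of the displayed per-class scores.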

## Model description

More information needed

## Intended uses & limitations

More information needed
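
In lieu of documented usage, the sketch below shows one plausible way to load the checkpoint with the standard transformers token-classification pipeline. The repository id is taken from this model's page; the example sentence is arbitrary, and the mapping of the four label ids to entity types is not documented here.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "antoineedy/bert-base-NER-finetuned-ner"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Token-level predictions; what label ids 0-3 stand for is not documented
# on this card, so inspect the model's id2label mapping first.
print(model.config.id2label)
ner = pipeline("token-classification", model=model, tokenizer=tokenizer)
print(ner("My name is Wolfgang and I live in Berlin."))
```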

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
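
These values map directly onto Hugging Face `TrainingArguments`. A minimal sketch, assuming the standard `Trainer` setup (the training script itself is not published; `output_dir` and the per-epoch evaluation strategy are assumptions inferred from the results table below):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-base-NER-finetuned-ner",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=60,
    lr_scheduler_type="linear",
    adam_beta1=0.9,      # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,  # and epsilon=1e-08
    evaluation_strategy="epoch",  # assumed: the results table reports per-epoch metrics
)
```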

### Training results

Per-epoch validation metrics (P = precision, R = recall; classes 0-3 as above):

| Training Loss | Epoch | Step | Validation Loss | Class 0 P | Class 0 R | Class 0 F1 | Class 1 P | Class 1 R | Class 1 F1 | Class 2 P | Class 2 R | Class 2 F1 | Class 3 P | Class 3 R | Class 3 F1 | Accuracy | Macro P | Macro R | Macro F1 | Weighted P | Weighted R | Weighted F1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 67 | 0.3241 | 0.9901 | 0.8116 | 0.8920 | 0.5586 | 0.9694 | 0.7088 | 0.4424 | 0.8807 | 0.5890 | 0.6615 | 0.8696 | 0.7514 | 0.8343 | 0.6631 | 0.8829 | 0.7353 | 0.8979 | 0.8343 | 0.8495 |
| No log | 2.0 | 134 | 0.3219 | 0.9882 | 0.8544 | 0.9164 | 0.6282 | 0.9480 | 0.7556 | 0.5375 | 0.8318 | 0.6531 | 0.6460 | 0.9106 | 0.7558 | 0.8665 | 0.7000 | 0.8862 | 0.7702 | 0.9064 | 0.8665 | 0.8763 |
| No log | 3.0 | 201 | 0.3126 | 0.9927 | 0.8353 | 0.9072 | 0.5873 | 0.9725 | 0.7323 | 0.5213 | 0.8624 | 0.6498 | 0.6578 | 0.9199 | 0.7671 | 0.8561 | 0.6898 | 0.8975 | 0.7641 | 0.9062 | 0.8561 | 0.8677 |
| No log | 4.0 | 268 | 0.3805 | 0.9851 | 0.8936 | 0.9371 | 0.7105 | 0.9419 | 0.8100 | 0.6166 | 0.8410 | 0.7115 | 0.7001 | 0.9218 | 0.7958 | 0.8979 | 0.7531 | 0.8996 | 0.8136 | 0.9196 | 0.8979 | 0.9035 |
| No log | 5.0 | 335 | 0.4058 | 0.9839 | 0.9028 | 0.9416 | 0.6786 | 0.9587 | 0.7947 | 0.6887 | 0.8593 | 0.7646 | 0.7718 | 0.9069 | 0.8339 | 0.9064 | 0.7807 | 0.9069 | 0.8337 | 0.9246 | 0.9064 | 0.9110 |
| No log | 6.0 | 402 | 0.4349 | 0.9833 | 0.9130 | 0.9468 | 0.7246 | 0.9373 | 0.8173 | 0.6786 | 0.8716 | 0.7631 | 0.7649 | 0.9088 | 0.8306 | 0.9130 | 0.7878 | 0.9077 | 0.8395 | 0.9275 | 0.9130 | 0.9169 |
| No log | 7.0 | 469 | 0.4379 | 0.9839 | 0.9184 | 0.9500 | 0.7308 | 0.9465 | 0.8248 | 0.7072 | 0.8716 | 0.7808 | 0.7755 | 0.9069 | 0.8361 | 0.9179 | 0.7994 | 0.9108 | 0.8479 | 0.9308 | 0.9179 | 0.9214 |
| 0.2085 | 8.0 | 536 | 0.4750 | 0.9862 | 0.8964 | 0.9391 | 0.6702 | 0.9694 | 0.7925 | 0.7038 | 0.8502 | 0.7701 | 0.7369 | 0.9181 | 0.8176 | 0.9028 | 0.7743 | 0.9085 | 0.8298 | 0.9236 | 0.9028 | 0.9079 |
| 0.2085 | 9.0 | 603 | 0.5353 | 0.9817 | 0.9225 | 0.9512 | 0.7443 | 0.9526 | 0.8357 | 0.7342 | 0.8532 | 0.7893 | 0.7751 | 0.9050 | 0.8351 | 0.9207 | 0.8088 | 0.9083 | 0.8528 | 0.9315 | 0.9207 | 0.9236 |
| 0.2085 | 10.0 | 670 | 0.5730 | 0.9786 | 0.9325 | 0.9550 | 0.7920 | 0.9434 | 0.8611 | 0.7413 | 0.8502 | 0.7920 | 0.7722 | 0.8901 | 0.8270 | 0.9263 | 0.8211 | 0.9040 | 0.8588 | 0.9338 | 0.9263 | 0.9285 |
| 0.2085 | 11.0 | 737 | 0.5801 | 0.9787 | 0.9199 | 0.9484 | 0.7404 | 0.9419 | 0.8291 | 0.7199 | 0.8410 | 0.7757 | 0.7679 | 0.8994 | 0.8285 | 0.9166 | 0.8017 | 0.9005 | 0.8454 | 0.9276 | 0.9166 | 0.9197 |
| 0.2085 | 12.0 | 804 | 0.7227 | 0.9675 | 0.9526 | 0.96 | 0.8496 | 0.8807 | 0.8649 | 0.8170 | 0.7920 | 0.8043 | 0.7921 | 0.8939 | 0.8399 | 0.9337 | 0.8565 | 0.8798 | 0.8673 | 0.9356 | 0.9337 | 0.9343 |
| 0.2085 | 13.0 | 871 | 0.6296 | 0.9744 | 0.9421 | 0.9579 | 0.8331 | 0.9159 | 0.8725 | 0.7568 | 0.8471 | 0.7994 | 0.7794 | 0.8883 | 0.8303 | 0.9309 | 0.8359 | 0.8983 | 0.8650 | 0.9356 | 0.9309 | 0.9325 |
| 0.2085 | 14.0 | 938 | 0.7074 | 0.9728 | 0.9397 | 0.9559 | 0.8070 | 0.9205 | 0.8600 | 0.7690 | 0.8349 | 0.8006 | 0.7804 | 0.8734 | 0.8243 | 0.9278 | 0.8323 | 0.8921 | 0.8602 | 0.9326 | 0.9278 | 0.9293 |
| 0.0385 | 15.0 | 1005 | 0.7392 | 0.9714 | 0.9441 | 0.9576 | 0.8425 | 0.8914 | 0.8663 | 0.7634 | 0.8287 | 0.7947 | 0.7721 | 0.8957 | 0.8293 | 0.9299 | 0.8373 | 0.8900 | 0.8620 | 0.9340 | 0.9299 | 0.9313 |
| 0.0385 | 16.0 | 1072 | 0.7589 | 0.9741 | 0.9399 | 0.9567 | 0.8003 | 0.9190 | 0.8555 | 0.7604 | 0.8349 | 0.7959 | 0.7876 | 0.8771 | 0.8300 | 0.9281 | 0.8306 | 0.8927 | 0.8595 | 0.9331 | 0.9281 | 0.9297 |
| 0.0385 | 17.0 | 1139 | 0.7045 | 0.9724 | 0.9380 | 0.9549 | 0.7847 | 0.9251 | 0.8491 | 0.7624 | 0.8440 | 0.8012 | 0.8056 | 0.8641 | 0.8338 | 0.9266 | 0.8313 | 0.8928 | 0.8597 | 0.9318 | 0.9266 | 0.9282 |
| 0.0385 | 18.0 | 1206 | 0.7735 | 0.9698 | 0.9437 | 0.9566 | 0.8043 | 0.9174 | 0.8571 | 0.7721 | 0.8287 | 0.7994 | 0.8074 | 0.8510 | 0.8286 | 0.9286 | 0.8384 | 0.8852 | 0.8604 | 0.9322 | 0.9286 | 0.9298 |
| 0.0385 | 19.0 | 1273 | 0.7184 | 0.9735 | 0.9399 | 0.9564 | 0.8150 | 0.9159 | 0.8625 | 0.7439 | 0.8440 | 0.7908 | 0.7863 | 0.8771 | 0.8292 | 0.9282 | 0.8297 | 0.8942 | 0.8597 | 0.9332 | 0.9282 | 0.9298 |
| 0.0385 | 20.0 | 1340 | 0.7814 | 0.9741 | 0.9341 | 0.9537 | 0.7875 | 0.9235 | 0.8501 | 0.7535 | 0.8226 | 0.7865 | 0.7581 | 0.8696 | 0.8101 | 0.9229 | 0.8183 | 0.8875 | 0.8501 | 0.9293 | 0.9229 | 0.9249 |
| 0.0385 | 21.0 | 1407 | 0.8279 | 0.9696 | 0.9445 | 0.9569 | 0.8201 | 0.9128 | 0.8640 | 0.7768 | 0.8196 | 0.7976 | 0.7880 | 0.8585 | 0.8217 | 0.9289 | 0.8386 | 0.8838 | 0.8601 | 0.9323 | 0.9289 | 0.9301 |
| 0.0385 | 22.0 | 1474 | 0.7268 | 0.9724 | 0.9332 | 0.9524 | 0.7704 | 0.9388 | 0.8463 | 0.7647 | 0.8349 | 0.7982 | 0.7818 | 0.8473 | 0.8132 | 0.9224 | 0.8223 | 0.8885 | 0.8525 | 0.9287 | 0.9224 | 0.9243 |
| 0.0127 | 23.0 | 1541 | 0.8197 | 0.9698 | 0.9445 | 0.9570 | 0.8078 | 0.9190 | 0.8598 | 0.7928 | 0.8073 | 0.8 | 0.7973 | 0.8641 | 0.8293 | 0.9294 | 0.8419 | 0.8837 | 0.8615 | 0.9327 | 0.9294 | 0.9305 |
| 0.0127 | 24.0 | 1608 | 0.8221 | 0.9722 | 0.9447 | 0.9582 | 0.8197 | 0.9037 | 0.8596 | 0.7718 | 0.8379 | 0.8035 | 0.7933 | 0.8790 | 0.8339 | 0.9307 | 0.8392 | 0.8913 | 0.8638 | 0.9344 | 0.9307 | 0.9320 |
| 0.0127 | 25.0 | 1675 | 0.8098 | 0.9735 | 0.9373 | 0.9550 | 0.7766 | 0.9358 | 0.8488 | 0.7928 | 0.8073 | 0.8 | 0.7809 | 0.8696 | 0.8229 | 0.9257 | 0.8310 | 0.8875 | 0.8567 | 0.9314 | 0.9257 | 0.9274 |
| 0.0127 | 26.0 | 1742 | 0.8023 | 0.9710 | 0.9404 | 0.9554 | 0.7897 | 0.9358 | 0.8565 | 0.7813 | 0.8196 | 0.8 | 0.8035 | 0.8529 | 0.8275 | 0.9275 | 0.8364 | 0.8872 | 0.8599 | 0.9319 | 0.9275 | 0.9288 |
| 0.0127 | 27.0 | 1809 | 0.7750 | 0.9748 | 0.9373 | 0.9557 | 0.7897 | 0.9358 | 0.8565 | 0.7591 | 0.8287 | 0.7924 | 0.7963 | 0.8808 | 0.8364 | 0.9276 | 0.8300 | 0.8957 | 0.8603 | 0.9333 | 0.9276 | 0.9293 |
| 0.0127 | 28.0 | 1876 | 0.9205 | 0.9673 | 0.9465 | 0.9568 | 0.8220 | 0.9037 | 0.8609 | 0.7861 | 0.7982 | 0.7921 | 0.7925 | 0.8603 | 0.8250 | 0.9288 | 0.8420 | 0.8772 | 0.8587 | 0.9314 | 0.9288 | 0.9297 |
| 0.0127 | 29.0 | 1943 | 0.7887 | 0.9726 | 0.9376 | 0.9548 | 0.7695 | 0.9343 | 0.8439 | 0.7756 | 0.8349 | 0.8041 | 0.8057 | 0.8492 | 0.8268 | 0.9256 | 0.8308 | 0.8890 | 0.8574 | 0.9311 | 0.9256 | 0.9273 |
| 0.0052 | 30.0 | 2010 | 0.8106 | 0.9778 | 0.9371 | 0.9570 | 0.7861 | 0.9327 | 0.8531 | 0.7658 | 0.8502 | 0.8058 | 0.7897 | 0.8883 | 0.8361 | 0.9288 | 0.8299 | 0.9021 | 0.8630 | 0.9351 | 0.9288 | 0.9307 |
| 0.0052 | 31.0 | 2077 | 0.8659 | 0.9699 | 0.9421 | 0.9558 | 0.8022 | 0.9113 | 0.8533 | 0.7929 | 0.8196 | 0.8060 | 0.7922 | 0.8734 | 0.8308 | 0.9281 | 0.8393 | 0.8866 | 0.8615 | 0.9319 | 0.9281 | 0.9293 |
| 0.0052 | 32.0 | 2144 | 0.8154 | 0.9722 | 0.9389 | 0.9553 | 0.7878 | 0.9251 | 0.8509 | 0.7768 | 0.8410 | 0.8076 | 0.7986 | 0.8641 | 0.8301 | 0.9272 | 0.8339 | 0.8923 | 0.8610 | 0.9321 | 0.9272 | 0.9287 |
| 0.0052 | 33.0 | 2211 | 0.8569 | 0.9727 | 0.9432 | 0.9577 | 0.8086 | 0.9174 | 0.8596 | 0.7878 | 0.8287 | 0.8077 | 0.7953 | 0.8827 | 0.8367 | 0.9307 | 0.8411 | 0.8930 | 0.8654 | 0.9347 | 0.9307 | 0.9320 |
| 0.0052 | 34.0 | 2278 | 0.8868 | 0.9705 | 0.9432 | 0.9566 | 0.8011 | 0.9113 | 0.8526 | 0.7843 | 0.8226 | 0.8030 | 0.7976 | 0.8659 | 0.8304 | 0.9285 | 0.8384 | 0.8858 | 0.8607 | 0.9323 | 0.9285 | 0.9298 |
| 0.0052 | 35.0 | 2345 | 0.8586 | 0.9745 | 0.9412 | 0.9575 | 0.8021 | 0.9235 | 0.8586 | 0.7771 | 0.8318 | 0.8035 | 0.79 | 0.8827 | 0.8338 | 0.9298 | 0.8359 | 0.8948 | 0.8634 | 0.9346 | 0.9298 | 0.9313 |
| 0.0052 | 36.0 | 2412 | 0.9288 | 0.9698 | 0.9449 | 0.9572 | 0.8157 | 0.9067 | 0.8588 | 0.7864 | 0.8104 | 0.7982 | 0.7825 | 0.8641 | 0.8212 | 0.9286 | 0.8386 | 0.8815 | 0.8588 | 0.9320 | 0.9286 | 0.9298 |
| 0.0052 | 37.0 | 2479 | 0.9396 | 0.9684 | 0.9460 | 0.9570 | 0.8186 | 0.9037 | 0.8590 | 0.7824 | 0.8135 | 0.7976 | 0.7917 | 0.8566 | 0.8229 | 0.9288 | 0.8403 | 0.8799 | 0.8591 | 0.9317 | 0.9288 | 0.9298 |
| 0.0032 | 38.0 | 2546 | 0.9108 | 0.9706 | 0.9408 | 0.9555 | 0.8014 | 0.9067 | 0.8508 | 0.7743 | 0.8287 | 0.8006 | 0.7862 | 0.8696 | 0.8258 | 0.9268 | 0.8331 | 0.8865 | 0.8582 | 0.9310 | 0.9268 | 0.9282 |
| 0.0032 | 39.0 | 2613 | 0.8132 | 0.9757 | 0.9306 | 0.9526 | 0.7853 | 0.9174 | 0.8463 | 0.7249 | 0.8379 | 0.7773 | 0.7700 | 0.8976 | 0.8289 | 0.9224 | 0.8140 | 0.8959 | 0.8513 | 0.9299 | 0.9224 | 0.9247 |
| 0.0032 | 40.0 | 2680 | 0.9634 | 0.9692 | 0.9421 | 0.9554 | 0.8033 | 0.9052 | 0.8512 | 0.7876 | 0.8165 | 0.8018 | 0.7825 | 0.8641 | 0.8212 | 0.9266 | 0.8356 | 0.8820 | 0.8574 | 0.9304 | 0.9266 | 0.9279 |
| 0.0032 | 41.0 | 2747 | 0.9024 | 0.9711 | 0.9387 | 0.9546 | 0.7937 | 0.9174 | 0.8511 | 0.7655 | 0.8287 | 0.7959 | 0.7840 | 0.8585 | 0.8196 | 0.9253 | 0.8286 | 0.8858 | 0.8553 | 0.9301 | 0.9253 | 0.9269 |
| 0.0032 | 42.0 | 2814 | 0.9623 | 0.9682 | 0.9456 | 0.9567 | 0.8217 | 0.9021 | 0.8601 | 0.7922 | 0.8043 | 0.7982 | 0.7795 | 0.8622 | 0.8187 | 0.9283 | 0.8404 | 0.8786 | 0.8584 | 0.9314 | 0.9283 | 0.9294 |
| 0.0032 | 43.0 | 2881 | 0.9335 | 0.9692 | 0.9441 | 0.9565 | 0.8148 | 0.9083 | 0.8590 | 0.7811 | 0.8073 | 0.7940 | 0.7817 | 0.8603 | 0.8191 | 0.9278 | 0.8367 | 0.8800 | 0.8572 | 0.9312 | 0.9278 | 0.9290 |
| 0.0032 | 44.0 | 2948 | 0.8909 | 0.9714 | 0.9380 | 0.9544 | 0.7924 | 0.9220 | 0.8523 | 0.7642 | 0.8226 | 0.7923 | 0.7817 | 0.8603 | 0.8191 | 0.9250 | 0.8274 | 0.8857 | 0.8546 | 0.9300 | 0.9250 | 0.9266 |
| 0.0026 | 45.0 | 3015 | 0.9011 | 0.9711 | 0.9393 | 0.9549 | 0.7900 | 0.9205 | 0.8503 | 0.7876 | 0.8165 | 0.8018 | 0.7811 | 0.8641 | 0.8205 | 0.9259 | 0.8325 | 0.8851 | 0.8569 | 0.9306 | 0.9259 | 0.9274 |
| 0.0026 | 46.0 | 3082 | 0.9105 | 0.9709 | 0.9387 | 0.9546 | 0.7921 | 0.9205 | 0.8515 | 0.7801 | 0.8135 | 0.7964 | 0.7785 | 0.8641 | 0.8191 | 0.9253 | 0.8304 | 0.8842 | 0.8554 | 0.9301 | 0.9253 | 0.9268 |
| 0.0026 | 47.0 | 3149 | 0.9380 | 0.9698 | 0.9404 | 0.9549 | 0.7936 | 0.9113 | 0.8484 | 0.7811 | 0.8073 | 0.7940 | 0.7808 | 0.8622 | 0.8195 | 0.9253 | 0.8313 | 0.8803 | 0.8542 | 0.9296 | 0.9253 | 0.9267 |
| 0.0026 | 48.0 | 3216 | 0.9258 | 0.9702 | 0.9393 | 0.9545 | 0.7846 | 0.9190 | 0.8465 | 0.7843 | 0.8226 | 0.8030 | 0.7849 | 0.8492 | 0.8157 | 0.9249 | 0.8310 | 0.8825 | 0.8549 | 0.9295 | 0.9249 | 0.9264 |
| 0.0026 | 49.0 | 3283 | 0.9463 | 0.9697 | 0.9404 | 0.9548 | 0.7918 | 0.9128 | 0.8480 | 0.7836 | 0.8196 | 0.8012 | 0.7880 | 0.8585 | 0.8217 | 0.9257 | 0.8333 | 0.8828 | 0.8564 | 0.9300 | 0.9257 | 0.9271 |
| 0.0026 | 50.0 | 3350 | 0.9205 | 0.9708 | 0.9406 | 0.9555 | 0.7939 | 0.9190 | 0.8519 | 0.7895 | 0.8257 | 0.8072 | 0.7836 | 0.8566 | 0.8185 | 0.9266 | 0.8345 | 0.8855 | 0.8583 | 0.9310 | 0.9266 | 0.9280 |
| 0.0026 | 51.0 | 3417 | 0.9339 | 0.9702 | 0.9412 | 0.9555 | 0.8024 | 0.9128 | 0.8541 | 0.7872 | 0.8257 | 0.8060 | 0.7808 | 0.8622 | 0.8195 | 0.9269 | 0.8352 | 0.8855 | 0.8587 | 0.9310 | 0.9269 | 0.9283 |
| 0.0026 | 52.0 | 3484 | 0.9439 | 0.9712 | 0.9413 | 0.9560 | 0.7995 | 0.9205 | 0.8557 | 0.7959 | 0.8226 | 0.8090 | 0.7808 | 0.8622 | 0.8195 | 0.9276 | 0.8368 | 0.8867 | 0.8601 | 0.9319 | 0.9276 | 0.9290 |
| 0.0013 | 53.0 | 3551 | 0.9354 | 0.9715 | 0.9406 | 0.9558 | 0.7974 | 0.9266 | 0.8571 | 0.7855 | 0.8287 | 0.8065 | 0.7863 | 0.8566 | 0.8200 | 0.9275 | 0.8352 | 0.8881 | 0.8599 | 0.9319 | 0.9275 | 0.9289 |
| 0.0013 | 54.0 | 3618 | 0.9541 | 0.9715 | 0.9404 | 0.9557 | 0.7992 | 0.9251 | 0.8575 | 0.7832 | 0.8287 | 0.8053 | 0.7840 | 0.8585 | 0.8196 | 0.9273 | 0.8345 | 0.8882 | 0.8595 | 0.9318 | 0.9273 | 0.9288 |
| 0.0013 | 55.0 | 3685 | 0.9586 | 0.9715 | 0.9402 | 0.9556 | 0.7984 | 0.9266 | 0.8577 | 0.7820 | 0.8226 | 0.8018 | 0.7810 | 0.8566 | 0.8171 | 0.9269 | 0.8332 | 0.8865 | 0.8581 | 0.9314 | 0.9269 | 0.9284 |
| 0.0013 | 56.0 | 3752 | 0.9737 | 0.9690 | 0.9413 | 0.9549 | 0.8005 | 0.9083 | 0.8510 | 0.7853 | 0.8165 | 0.8006 | 0.7814 | 0.8585 | 0.8181 | 0.9259 | 0.8340 | 0.8811 | 0.8562 | 0.9298 | 0.9259 | 0.9272 |
| 0.0013 | 57.0 | 3819 | 0.9620 | 0.9695 | 0.9404 | 0.9547 | 0.7997 | 0.9098 | 0.8512 | 0.7807 | 0.8165 | 0.7982 | 0.7795 | 0.8622 | 0.8187 | 0.9256 | 0.8323 | 0.8822 | 0.8557 | 0.9298 | 0.9256 | 0.9270 |
| 0.0013 | 58.0 | 3886 | 0.9616 | 0.9697 | 0.9404 | 0.9548 | 0.7997 | 0.9159 | 0.8539 | 0.7853 | 0.8165 | 0.8006 | 0.7787 | 0.8585 | 0.8167 | 0.9259 | 0.8334 | 0.8828 | 0.8565 | 0.9301 | 0.9259 | 0.9273 |
| 0.0013 | 59.0 | 3953 | 0.9692 | 0.9701 | 0.9412 | 0.9554 | 0.8021 | 0.9174 | 0.8559 | 0.7830 | 0.8165 | 0.7994 | 0.7814 | 0.8585 | 0.8181 | 0.9266 | 0.8341 | 0.8834 | 0.8572 | 0.9307 | 0.9266 | 0.9280 |
| 0.001 | 60.0 | 4020 | 0.9704 | 0.9706 | 0.9413 | 0.9558 | 0.8027 | 0.9205 | 0.8575 | 0.7853 | 0.8165 | 0.8006 | 0.7817 | 0.8603 | 0.8191 | 0.9272 | 0.8351 | 0.8847 | 0.8583 | 0.9313 | 0.9272 | 0.9285 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2