---
license: mit
base_model: dslim/bert-base-NER
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-finetuned-ner-ontonotes5
results: []
---
# bert-finetuned-ner-ontonotes5

This model is a fine-tuned version of [dslim/bert-base-NER](https://huggingface.co/dslim/bert-base-NER) trained on the OntoNotes 5 dataset. It identifies and classifies named entities in English text, including persons, organizations, locations, dates, and more.
It achieves the following results on the evaluation set:
- Loss: 0.1634
- Precision: 0.8620
- Recall: 0.8849
- F1: 0.8733
- Accuracy: 0.9758
## Intended uses & limitations
The model is intended for use in applications requiring NER, such as information extraction, text classification, and enhancing search capabilities by identifying key entities within the text. It can be used to identify entities in any English text, including news articles, social media posts, and legal documents.
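A minimal inference sketch using the `transformers` pipeline API (the repository id used as the default `model_id` below is an assumption; substitute the id this model is actually published under):

```python
def extract_entities(text, model_id="mnkweke/bert-finetuned-ner-ontonotes5"):
    """Run NER over `text`, merging wordpiece predictions into whole entities."""
    # Imported lazily so the helper can be defined without loading transformers.
    from transformers import pipeline

    ner = pipeline(
        "token-classification",
        model=model_id,
        aggregation_strategy="simple",  # merge B-/I- sub-tokens into entity spans
    )
    return ner(text)
```

With `aggregation_strategy="simple"`, each returned dict carries `entity_group`, `score`, `word`, `start`, and `end` fields.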
## Training and evaluation data
The model was fine-tuned on the OntoNotes 5 dataset, which is widely used for NER tasks and is annotated with the following entity types:
- `CARDINAL`: Numerical values
- `DATE`: References to dates and periods
- `PERSON`: Names of people
- `NORP`: Nationalities, religious groups, political groups
- `GPE`: Countries, cities, states
- `LAW`: Named documents and legal entities
- `ORG`: Organizations
- `PERCENT`: Percentage values
- `ORDINAL`: Ordinal numbers
- `MONEY`: Monetary values
- `WORK_OF_ART`: Titles of creative works
- `FAC`: Facilities
- `TIME`: Times smaller than a day
- `LOC`: Non-GPE locations, mountain ranges, bodies of water
- `QUANTITY`: Measurements, as of weight or distance
- `PRODUCT`: Objects, vehicles, foods, etc. (not services)
- `EVENT`: Named events
- `LANGUAGE`: Named languages
## Model Configuration
- Base model: [dslim/bert-base-NER](https://huggingface.co/dslim/bert-base-NER)
- Number of labels: 37 (a `B-`/`I-` pair for each of the 18 entity types above, plus the `O` tag for tokens outside any named entity)
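The 37 labels follow the BIO tagging scheme: `B-` marks the first token of an entity, `I-` a continuation, and `O` any token outside an entity. A sketch of how that label set is derived (the ordering here is illustrative and not necessarily the order stored in the model's config):

```python
# The 18 OntoNotes 5 entity types listed above.
ENTITY_TYPES = [
    "CARDINAL", "DATE", "PERSON", "NORP", "GPE", "LAW", "ORG", "PERCENT",
    "ORDINAL", "MONEY", "WORK_OF_ART", "FAC", "TIME", "LOC", "QUANTITY",
    "PRODUCT", "EVENT", "LANGUAGE",
]

# BIO scheme: one B-/I- pair per entity type, plus the single "O" tag.
labels = ["O"] + [f"{prefix}-{t}" for t in ENTITY_TYPES for prefix in ("B", "I")]
print(len(labels))  # 18 entity types * 2 prefixes + "O" = 37
```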
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
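For reproduction, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows; the output directory and evaluation strategy are illustrative assumptions, not taken from the card:

```python
from transformers import TrainingArguments

# Sketch only: output_dir and evaluation_strategy are assumptions.
# Adam betas=(0.9, 0.999) and epsilon=1e-8 are the TrainingArguments defaults.
args = TrainingArguments(
    output_dir="bert-finetuned-ner-ontonotes5",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="epoch",  # assumption: evaluate once per epoch
)
```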
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0372 | 1.0 | 7491 | 0.1188 | 0.8392 | 0.8799 | 0.8591 | 0.9738 |
| 0.04 | 2.0 | 14982 | 0.1182 | 0.8562 | 0.8824 | 0.8691 | 0.9754 |
| 0.0164 | 3.0 | 22473 | 0.1380 | 0.8561 | 0.8835 | 0.8696 | 0.9752 |
| 0.0117 | 4.0 | 29964 | 0.1531 | 0.8618 | 0.8833 | 0.8724 | 0.9758 |
| 0.0054 | 5.0 | 37455 | 0.1634 | 0.8620 | 0.8849 | 0.8733 | 0.9758 |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
## Contact Information
For questions, comments, or issues with the model, please contact:
- Name: Irechukwu Nkweke
- Email: [email protected]
- GitHub: [https://github.com/mnkweke](https://github.com/mnkweke)
## Acknowledgments
This model was trained using the Hugging Face transformers library and the OntoNotes 5 dataset.