## Model Description
This model is a fine-tuned version of dslim/bert-base-NER on the OntoNotes 5 dataset. It is designed to identify and classify various types of entities in text, including persons, organizations, locations, dates, and more. It achieves the following results on the evaluation set:
- Loss: 0.1634
- Precision: 0.8620
- Recall: 0.8849
- F1: 0.8733
- Accuracy: 0.9758
## Intended uses & limitations
The model is intended for applications requiring named entity recognition (NER), such as information extraction, text classification, and enhancing search capabilities by identifying key entities within text. It can be used to identify entities in English text across domains, including news articles, social media posts, and legal documents.
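A minimal inference sketch with the transformers pipeline API, assuming the checkpoint is loaded from the IreNkweke/bert-finetuned-ner-ontonotes5 repository this card describes:

```python
from transformers import pipeline

# Load the fine-tuned model; aggregation_strategy="simple" merges
# B-/I- subword tags into whole entity spans.
ner = pipeline(
    "token-classification",
    model="IreNkweke/bert-finetuned-ner-ontonotes5",
    aggregation_strategy="simple",
)

text = "Apple opened a new office in Berlin on 12 March 2021."
for entity in ner(text):
    print(entity["word"], entity["entity_group"], round(entity["score"], 3))
```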
## Training and evaluation data

The model was fine-tuned on the OntoNotes 5 dataset, which covers multiple types of named entities and is widely used for NER tasks. The dataset is annotated with the following entity types:
- CARDINAL: Numerical values
- DATE: References to dates and periods
- PERSON: Names of people
- NORP: Nationalities, religious groups, political groups
- GPE: Countries, cities, states
- LAW: Named documents and legal entities
- ORG: Organizations
- PERCENT: Percentage values
- ORDINAL: Ordinal numbers
- MONEY: Monetary values
- WORK_OF_ART: Titles of creative works
- FAC: Facilities
- TIME: Times smaller than a day
- LOC: Non-GPE locations, mountain ranges, bodies of water
- QUANTITY: Measurements, as of weight or distance
- PRODUCT: Objects, vehicles, foods, etc. (not services)
- EVENT: Named events
- LANGUAGE: Named languages
## Model Configuration
- Base Model: dslim/bert-base-NER
- Number of Labels: 37 (each of the 18 entity types above receives a B- and an I- tag under the BIO scheme, plus the "O" tag for tokens outside any named entity: 18 × 2 + 1 = 37)
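As a sketch of how that 37-label set is laid out under the BIO tagging scheme (the exact id ordering below is an assumption; the authoritative mapping is the id2label entry in the model's config.json):

```python
# "O" plus a B- (begin) and I- (inside) tag for each of the 18
# OntoNotes entity types: 18 * 2 + 1 = 37 labels in total.
ENTITY_TYPES = [
    "CARDINAL", "DATE", "PERSON", "NORP", "GPE", "LAW", "ORG",
    "PERCENT", "ORDINAL", "MONEY", "WORK_OF_ART", "FAC", "TIME",
    "LOC", "QUANTITY", "PRODUCT", "EVENT", "LANGUAGE",
]
labels = ["O"] + [f"{p}-{t}" for t in ENTITY_TYPES for p in ("B", "I")]
id2label = dict(enumerate(labels))
assert len(labels) == 37
```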
## Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
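As a sketch, these hyperparameters map onto the transformers Trainer API roughly as follows. Dataset preparation and token-label alignment are elided; output_dir and the placeholder dataset variables are illustrative assumptions, and ignore_mismatched_sizes is needed because the base checkpoint ships a smaller classification head:

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Mirrors the hyperparameter list above; transformers' default Adam
# settings already match betas=(0.9, 0.999) and epsilon=1e-08.
args = TrainingArguments(
    output_dir="bert-finetuned-ner-ontonotes5",  # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",
)

tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")
# The base checkpoint has a 9-label head, so the 37-label head is
# re-initialized via ignore_mismatched_sizes=True.
model = AutoModelForTokenClassification.from_pretrained(
    "dslim/bert-base-NER",
    num_labels=37,
    ignore_mismatched_sizes=True,
)

# Placeholders: supply tokenized OntoNotes 5 splits before training.
train_dataset = eval_dataset = None

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
)
trainer.train()  # runs once real datasets replace the placeholders
```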
## Training results
| Training Loss | Epoch | Step  | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0372        | 1.0   | 7491  | 0.1188          | 0.8392    | 0.8799 | 0.8591 | 0.9738   |
| 0.0400        | 2.0   | 14982 | 0.1182          | 0.8562    | 0.8824 | 0.8691 | 0.9754   |
| 0.0164        | 3.0   | 22473 | 0.1380          | 0.8561    | 0.8835 | 0.8696 | 0.9752   |
| 0.0117        | 4.0   | 29964 | 0.1531          | 0.8618    | 0.8833 | 0.8724 | 0.9758   |
| 0.0054        | 5.0   | 37455 | 0.1634          | 0.8620    | 0.8849 | 0.8733 | 0.9758   |
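Precision, recall, and F1 in this table are span-level while accuracy is token-level, which is why F1 sits well below accuracy. Below is a sketch of the conventional way to compute such NER metrics with the seqeval metric through the evaluate library; whether this exact tooling was used for this card is an assumption:

```python
import evaluate  # pip install evaluate seqeval

# seqeval scores whole entity spans (a span counts only if both its
# boundaries and its type match), while accuracy is per token.
seqeval = evaluate.load("seqeval")

references  = [["B-PERSON", "I-PERSON", "O", "B-GPE"]]
predictions = [["B-PERSON", "O",        "O", "B-GPE"]]  # PERSON span truncated

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"],  # 0.5: one of two predicted spans is exact
      results["overall_recall"],     # 0.5: one of two gold spans is found
      results["overall_f1"],
      results["overall_accuracy"])   # 0.75: 3 of 4 tokens are correct
```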
## Framework versions
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
## Contact Information
For questions, comments, or issues with the model, please contact:
- Name: Irechukwu Nkweke
- Email: [email protected]
- GitHub: https://github.com/mnkweke
## Acknowledgments
This model was trained using the Hugging Face transformers library and the OntoNotes 5 dataset.