---
language:
- es
license: apache-2.0
datasets:
- eriktks/conll2002
metrics:
- precision
- recall
- f1
- accuracy
pipeline_tag: token-classification
---

# Model Name: bert-finetuned-ner-1

This is a BERT model fine-tuned for Named Entity Recognition (NER).

# Model Description

This is a BERT model fine-tuned for the Named Entity Recognition (NER) task on the CoNLL-2002 dataset.

First, the dataset is pre-processed so it can be fed to the model, using the šŸ¤— Transformers library and the BERT tokenizer. The model is then fine-tuned from *[bert-base-cased](https://huggingface.co/google-bert/bert-base-cased)* with šŸ¤— *AutoModelForTokenClassification*. Finally, the model is trained and the metrics needed to evaluate its performance (Precision, Recall, F1 and Accuracy) are computed. Illustrative sketches of the pre-processing, fine-tuning and inference steps are included at the end of this card.

# Training

## Training Details

- Epochs: 5
- Learning Rate: 2e-05
- Weight Decay: 0.01
- Batch Size (Train): 16
- Batch Size (Eval): 8

## Training Metrics

| Epoch | Training Loss | Validation Loss | Precision | Recall | F1 Score | Accuracy |
|:-----:|:-------------:|:---------------:|:---------:|:------:|:--------:|:--------:|
| 1 | 0.1735 | 0.1508 | 0.6577 | 0.7323 | 0.6930 | 0.9586 |
| 2 | 0.0770 | 0.1421 | 0.6876 | 0.7702 | 0.7266 | 0.9629 |
| 3 | 0.0504 | 0.1373 | 0.7353 | 0.7845 | 0.7591 | 0.9663 |
| 4 | 0.0358 | 0.1442 | 0.7453 | 0.7902 | 0.7671 | 0.9664 |
| 5 | 0.0272 | 0.1536 | 0.7527 | 0.7946 | 0.7731 | 0.9667 |

# Authors

Made by:

- Paul Rodrigo Rojas Guerrero
- Jose Luis Hincapie Bucheli
- SebastiĆ”n Idrobo Avirama

With help from:

- [RaĆŗl Ernesto GutiĆ©rrez](https://huggingface.co/raulgdp)
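
# Example: Pre-processing Sketch

The pre-processing step described above aligns word-level NER tags with BERT subword tokens. The snippet below is a minimal sketch of that alignment using the šŸ¤— Transformers tokenizer and the `eriktks/conll2002` Spanish split; the function name and details are illustrative assumptions, not the authors' original code.

```python
# Hypothetical sketch: align word-level NER labels with subword tokens.
from datasets import load_dataset
from transformers import AutoTokenizer

dataset = load_dataset("eriktks/conll2002", "es")          # Spanish CoNLL-2002 (assumed split)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize_and_align_labels(examples):
    tokenized = tokenizer(
        examples["tokens"], truncation=True, is_split_into_words=True
    )
    all_labels = []
    for i, labels in enumerate(examples["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        previous_word = None
        aligned = []
        for word_id in word_ids:
            if word_id is None:
                aligned.append(-100)             # special tokens: ignored by the loss
            elif word_id != previous_word:
                aligned.append(labels[word_id])  # label only the first subword of a word
            else:
                aligned.append(-100)             # remaining subwords: ignored by the loss
            previous_word = word_id
        all_labels.append(aligned)
    tokenized["labels"] = all_labels
    return tokenized

tokenized_datasets = dataset.map(tokenize_and_align_labels, batched=True)
```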
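
# Example: Fine-tuning Sketch

Continuing from the pre-processing sketch, the following is a hedged sketch of the fine-tuning setup using `AutoModelForTokenClassification` and the hyperparameters listed under "Training Details". The label list and the use of `seqeval` for Precision/Recall/F1/Accuracy are assumptions consistent with the CoNLL-2002 tag set, not code taken from this card.

```python
# Hypothetical sketch: fine-tune bert-base-cased for token classification.
# Requires: pip install transformers datasets evaluate seqeval
import numpy as np
import evaluate
from transformers import (
    AutoModelForTokenClassification,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

# Assumed CoNLL-2002 tag order from the dataset's ClassLabel feature.
label_names = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]
id2label = {i: l for i, l in enumerate(label_names)}
label2id = {l: i for i, l in enumerate(label_names)}

model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(label_names), id2label=id2label, label2id=label2id
)

metric = evaluate.load("seqeval")

def compute_metrics(eval_preds):
    logits, labels = eval_preds
    predictions = np.argmax(logits, axis=-1)
    # Drop positions labeled -100 (special tokens / non-first subwords).
    true_labels = [[label_names[l] for l in row if l != -100] for row in labels]
    true_preds = [
        [label_names[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]
    results = metric.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }

args = TrainingArguments(
    "bert-finetuned-ner-1",
    learning_rate=2e-5,
    num_train_epochs=5,
    weight_decay=0.01,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized_datasets["train"],
    eval_dataset=tokenized_datasets["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer=tokenizer),
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
```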
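
# Example: Inference Sketch

A minimal usage sketch with the šŸ¤— token-classification pipeline. The model identifier below is assumed from the card title; replace it with the actual Hub repository or local path.

```python
# Hypothetical usage: run the fine-tuned model on a Spanish sentence.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="bert-finetuned-ner-1",          # assumed model path
    aggregation_strategy="simple",         # group subwords into whole entities
)
print(ner("El Real Madrid ganĆ³ el partido en Barcelona."))
```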