---
license: mit
---

# bertimbau-NER

This model card aims to simplify the use of the Portuguese BERT, a.k.a. BERTimbau, for the Named Entity Recognition (NER) task.

For this model card we used the BERT-CRF (selective scenario, 5 classes) model available in the ner_evaluation folder of the original BERTimbau repo.

## Usage

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("marquesafonso/bertimbau-large-ner")
model = AutoModelForTokenClassification.from_pretrained("marquesafonso/bertimbau-large-ner")
```
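
If you want to see which entity tags the checkpoint predicts before running it, you can inspect the label map stored in the model config. This is a minimal sketch, assuming the checkpoint exposes the standard `id2label` mapping:

```python
# Inspect the tag set shipped with the checkpoint (assumes a standard id2label mapping)
print(model.config.id2label)
# Expected: a dict of class indices to IOB2 tags, e.g. {0: 'O', 1: 'B-PESSOA', ...} (illustrative)
```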

## Example

```python
from transformers import pipeline

pipe = pipeline("token-classification", model="marquesafonso/bertimbau-large-ner")

sentence = "Acima de Ederson, abaixo de Rúben Dias. É entre os dois jogadores do Manchester City que se vai colocar Gonçalo Ramos no ranking de vendas mais avultadas do Benfica."

result = pipe([sentence])

print(f"{sentence}\n{result}")

# Acima de Ederson, abaixo de Rúben Dias. É entre os dois jogadores do Manchester City que se vai colocar Gonçalo Ramos no ranking de vendas mais avultadas do Benfica.
# [[{'entity': 'B-PESSOA', 'score': 0.99976975, 'index': 4, 'word': 'Ed', 'start': 9, 'end': 11}, {'entity': 'I-PESSOA', 'score': 0.9941182, 'index': 5, 'word': '##erson', 'start': 11, 'end': 16}, {'entity': 'B-PESSOA', 'score': 0.9998306, 'index': 9, 'word': 'R', 'start': 28, 'end': 29}, {'entity': 'I-PESSOA', 'score': 0.9737293, 'index': 10, 'word': '##ú', 'start': 29, 'end': 30}, {'entity': 'I-PESSOA', 'score': 0.9944133, 'index': 11, 'word': '##ben', 'start': 30, 'end': 33}, {'entity': 'I-PESSOA', 'score': 0.9994117, 'index': 12, 'word': 'Dias', 'start': 34, 'end': 38}, {'entity': 'B-ORGANIZACAO', 'score': 0.94043595, 'index': 20, 'word': 'Manchester', 'start': 69, 'end': 79}, {'entity': 'I-ORGANIZACAO', 'score': 0.9870952, 'index': 21, 'word': 'City', 'start': 80, 'end': 84}, {'entity': 'B-PESSOA', 'score': 0.9997652, 'index': 26, 'word': 'Gonçalo', 'start': 104, 'end': 111}, {'entity': 'I-PESSOA', 'score': 0.9989994, 'index': 27, 'word': 'Ramos', 'start': 112, 'end': 117}, {'entity': 'B-ORGANIZACAO', 'score': 0.9033079, 'index': 37, 'word': 'Benfica', 'start': 157, 'end': 164}]]
```
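
The raw pipeline output above is per WordPiece token (note the split `'Ed'` / `'##erson'`). If you prefer whole entity spans, the token-classification pipeline accepts an `aggregation_strategy` argument that merges subword pieces. A minimal sketch, not part of the original card:

```python
from transformers import pipeline

# aggregation_strategy="simple" merges subword pieces into whole entity spans
pipe = pipeline(
    "token-classification",
    model="marquesafonso/bertimbau-large-ner",
    aggregation_strategy="simple",
)

for entity in pipe("Acima de Ederson, abaixo de Rúben Dias."):
    # Each item carries an entity_group, a confidence score, and the character span
    print(entity["entity_group"], entity["word"], entity["start"], entity["end"])
```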

## Acknowledgements

This work is an adaptation of the Portuguese BERT, a.k.a. BERTimbau. You may check and/or cite their work:

```bibtex
@InProceedings{souza2020bertimbau,
    author="Souza, F{\'a}bio and Nogueira, Rodrigo and Lotufo, Roberto",
    editor="Cerri, Ricardo and Prati, Ronaldo C.",
    title="BERTimbau: Pretrained BERT Models for Brazilian Portuguese",
    booktitle="Intelligent Systems",
    year="2020",
    publisher="Springer International Publishing",
    address="Cham",
    pages="403--417",
    isbn="978-3-030-61377-8"
}
```


```bibtex
@article{souza2019portuguese,
    title={Portuguese Named Entity Recognition using BERT-CRF},
    author={Souza, F{\'a}bio and Nogueira, Rodrigo and Lotufo, Roberto},
    journal={arXiv preprint arXiv:1909.10649},
    url={http://arxiv.org/abs/1909.10649},
    year={2019}
}
```

Note that the authors (Fábio Capuano de Souza, Rodrigo Nogueira, and Roberto de Alencar Lotufo) released their work under an MIT license.