---
license: apache-2.0
tags:
  - generated_from_keras_callback
  - biology
  - medical
model-index:
  - name: jjglilleberg/bert-finetuned-ner-nbci-disease
    results: []
datasets:
  - ncbi_disease
language:
  - en
metrics:
  - seqeval
library_name: keras
pipeline_tag: token-classification
---

# jjglilleberg/bert-finetuned-ner-nbci-disease

This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the [NCBI Disease](https://huggingface.co/datasets/ncbi_disease) dataset. It achieves the following results on the evaluation set:

- Precision: 0.7591
- Recall: 0.8488
- F1: 0.8014
- Number of gold entities: 787
- Overall precision: 0.7591
- Overall recall: 0.8488
- Overall F1: 0.8014
- Overall accuracy: 0.9825
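These are seqeval-style entity-level scores: a predicted disease span only counts as correct if both its boundaries and its label exactly match a gold span. A minimal pure-Python sketch of that computation (toy helpers for illustration, not the actual evaluation code):

```python
def extract_spans(tags):
    """Collect (start, end, label) spans from a BIO tag sequence."""
    spans, start = [], None
    for i, tag in enumerate(tags + ["O"]):  # sentinel flushes the last span
        if tag.startswith("B-") or tag == "O":
            if start is not None:
                spans.append((start, i, tags[start][2:]))
                start = None
        if tag.startswith("B-"):
            start = i
    return set(spans)

def entity_f1(gold, pred):
    """Entity-level precision/recall/F1, as seqeval reports them."""
    g, p = extract_spans(gold), extract_spans(pred)
    tp = len(g & p)  # spans matching exactly in boundaries and label
    precision = tp / len(p) if p else 0.0
    recall = tp / len(g) if g else 0.0
    f1 = 2 * precision * recall / (precision + recall) if tp else 0.0
    return precision, recall, f1
```

In practice this is what `seqeval.metrics.classification_report` computes over the full evaluation set; with a single entity type (Disease), per-entity and overall scores coincide, which is why the values above repeat.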

## Model description

More information needed

## Intended uses & limitations

This model is intended for disease name recognition and concept normalization in biomedical text.
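As an illustration, the commented lines below show the usual transformers pipeline call (which downloads the weights), and `group_mentions` is a hypothetical helper that merges the pipeline's per-token B-/I- predictions into disease mention strings. The dict keys (`entity`, `word`) follow the token-classification pipeline's default output format:

```python
# from transformers import pipeline  # typical loading path; downloads the model
# ner = pipeline("token-classification",
#                model="jjglilleberg/bert-finetuned-ner-nbci-disease")
# predictions = ner("The patient was diagnosed with breast cancer.")

def group_mentions(predictions):
    """Merge token-level B-/I- predictions into mention strings."""
    mentions, current = [], []
    for p in predictions:
        tag = p["entity"]
        if tag.startswith("B-"):          # a new mention starts here
            if current:
                mentions.append(" ".join(current))
            current = [p["word"]]
        elif tag.startswith("I-") and current:
            current.append(p["word"])     # continue the open mention
        else:                             # "O" (or stray "I-") closes it
            if current:
                mentions.append(" ".join(current))
            current = []
    if current:
        mentions.append(" ".join(current))
    return mentions
```

Note that the pipeline's built-in `aggregation_strategy="simple"` option performs a similar grouping, including merging WordPiece subwords, which this sketch does not handle.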

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer:
  - name: AdamWeightDecay
  - learning_rate (PolynomialDecay schedule):
    - initial_learning_rate: 2e-05
    - decay_steps: 1020
    - end_learning_rate: 0.0
    - power: 1.0
    - cycle: False
    - name: None
  - decay: 0.0
  - beta_1: 0.9
  - beta_2: 0.999
  - epsilon: 1e-08
  - amsgrad: False
  - weight_decay_rate: 0.01
- training_precision: mixed_float16
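With `power: 1.0` and `cycle: False`, the schedule above is a plain linear decay from 2e-5 to 0 over 1,020 optimizer steps. A plain-Python sketch of the formula Keras' `PolynomialDecay` applies (function name is illustrative):

```python
def polynomial_decay(step, initial_lr=2e-05, decay_steps=1020,
                     end_lr=0.0, power=1.0):
    """Learning rate at `step` under a PolynomialDecay schedule (cycle=False)."""
    step = min(step, decay_steps)  # with cycle=False the rate stays at end_lr
    frac = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * frac ** power + end_lr
```

The same schedule is what `transformers.create_optimizer(init_lr=2e-5, num_train_steps=1020, num_warmup_steps=0, weight_decay_rate=0.01)` builds alongside the AdamWeightDecay optimizer.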

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.1281     | 0.0561          | 0     |
| 0.0372     | 0.0596          | 1     |
| 0.0211     | 0.0645          | 2     |

### Framework versions

- Transformers 4.28.0
- TensorFlow 2.12.0
- Datasets 2.11.0
- Tokenizers 0.13.3