---
library_name: transformers
tags:
- BC2GM
- NER
license: apache-2.0
language:
- en
metrics:
- seqeval
base_model:
- distilbert/distilbert-base-uncased
---

# Model Card: DistilBERT Fine-Tuned on BC2GM

A DistilBERT model fine-tuned for named-entity recognition of gene mentions. It was trained on the training split of the BC2GM dataset as distributed by [BLURB](https://microsoft.github.io/BLURB/tasks.html).

## Model Details

### Model Sources

- **Repository:** https://github.com/kbulutozler/medical-llm-benchmark

## Training Details

### Training Data

Training split of the BC2GM dataset.

### Training Procedure

Classical fine-tuning.

#### Training Hyperparameters

- learning_rate: 5e-5
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- num_train_epochs: 3
- weight_decay: 0.01

## Evaluation

### Testing Data

Test split of the BC2GM dataset.

### Results

- Precision: 0.76
- Recall: 0.79
- Micro-F1: 0.77

## Environmental Impact

- **Hardware Type:** 1x RTX A4000
- **Hours used:** 00:10:00 (10 minutes)
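The scores reported above are entity-level (seqeval-style) metrics: a predicted gene mention counts as correct only if its span and entity type exactly match a gold mention. A minimal pure-Python sketch of this scoring scheme over BIO tag sequences (function names are illustrative, not the seqeval API):

```python
def extract_entities(tags):
    """Extract (start, end, type) spans from a BIO tag sequence."""
    entities, start, etype = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            # A new entity begins; close any open one first.
            if start is not None:
                entities.append((start, i, etype))
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and start is not None and tag[2:] == etype:
            # Continuation of the current entity.
            continue
        else:
            # "O" tag (or inconsistent "I-") ends the current entity.
            if start is not None:
                entities.append((start, i, etype))
            start, etype = None, None
    if start is not None:
        entities.append((start, len(tags), etype))
    return set(entities)


def entity_f1(true_tags, pred_tags):
    """Entity-level precision, recall, and F1 for one tag sequence."""
    gold = extract_entities(true_tags)
    pred = extract_entities(pred_tags)
    tp = len(gold & pred)  # exact span-and-type matches only
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    denom = precision + recall
    f1 = 2 * precision * recall / denom if denom else 0.0
    return precision, recall, f1
```

For a micro-averaged corpus score, the true/false positive counts would be pooled across all sentences before computing the ratios; the 0.77 micro-F1 above is consistent with the reported precision/recall pair (2 · 0.76 · 0.79 / (0.76 + 0.79) ≈ 0.77).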