---
library_name: transformers
tags:
- BC5CDR-chem
- NER
license: apache-2.0
language:
- en
metrics:
- seqeval
base_model:
- distilbert/distilbert-base-uncased
---

# Model Card

Fine-tuned DistilBERT model for chemical named-entity recognition (NER). Trained on the train set of the BC5CDR-chem dataset taken from [BLURB](https://microsoft.github.io/BLURB/tasks.html).

## Model Details

### Model Sources

- **Repository:** https://github.com/kbulutozler/medical-llm-benchmark

## Training Details

### Training Data

Train set of the BC5CDR-chem dataset.

### Training Procedure

Classical fine-tuning of all model parameters.

#### Training Hyperparameters

- learning_rate: 5e-5
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- num_train_epochs: 3
- weight_decay: 0.01

## Evaluation

### Testing Data

Test set of the BC5CDR-chem dataset.

### Results

- Precision: 0.89
- Recall: 0.87
- Micro-F1: 0.88

## Environmental Impact

- **Hardware Type:** 1x RTX A4000
- **Hours used:** 00:07:00
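A minimal inference sketch using the `transformers` token-classification pipeline. The checkpoint path below is a placeholder, not a confirmed Hub ID; substitute the actual location of this fine-tuned model.

```python
from transformers import pipeline

# Placeholder path: replace with the actual checkpoint location of this model.
ner = pipeline(
    "token-classification",
    model="path/to/bc5cdr-chem-distilbert",
    aggregation_strategy="simple",  # merge sub-word tokens into whole entity spans
)

# Each result contains the entity group, confidence score, and character offsets.
print(ner("Treatment with tamoxifen reduced tumor growth."))
```

With `aggregation_strategy="simple"`, WordPiece fragments belonging to one chemical mention are merged into a single span, which matches how BC5CDR-chem entities are annotated.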
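As a quick sanity check, the reported micro-F1 is consistent with the reported precision and recall, since F1 is their harmonic mean:

```python
# Reported test-set metrics on BC5CDR-chem.
precision, recall = 0.89, 0.87

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # → 0.88
```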