Model Card for distilbert-base-uncased-FT-ner-BC5CDR-chem
A DistilBERT model fine-tuned for chemical named entity recognition (NER). It was trained on the training set of the BC5CDR-chem dataset, as distributed in the BLURB benchmark.
Model Details
Model Sources
- Base model: distilbert/distilbert-base-uncased
Training Details
Training Data
The training set of the BC5CDR-chem dataset, which contains chemical entity annotations from the BioCreative V CDR corpus.
Training Procedure
Classical (full) fine-tuning: all model parameters are updated for the token-classification objective. A sketch of the corresponding training setup follows the hyperparameter list below.
Training Hyperparameters
- learning_rate: 5e-5
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- num_train_epochs: 3
- weight_decay: 0.01
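A minimal sketch of a Trainer setup matching the listed hyperparameters. The dataset load path and label list are assumptions (BC5CDR-chem is BIO-tagged with a single Chemical entity type); this is illustrative, not the exact script used for this checkpoint.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

label_list = ["O", "B-Chemical", "I-Chemical"]  # assumed BIO tag set

tokenizer = AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "distilbert/distilbert-base-uncased", num_labels=len(label_list)
)

def tokenize_and_align_labels(examples):
    # Tokenize pre-split words; label only the first sub-word of each
    # word and mask the rest with -100 so they are ignored by the loss.
    tokenized = tokenizer(
        examples["tokens"], truncation=True, is_split_into_words=True
    )
    all_labels = []
    for i, tags in enumerate(examples["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        labels, previous = [], None
        for word_id in word_ids:
            if word_id is None or word_id == previous:
                labels.append(-100)
            else:
                labels.append(tags[word_id])
            previous = word_id
        all_labels.append(labels)
    tokenized["labels"] = all_labels
    return tokenized

dataset = load_dataset("bc5cdr-chem")  # hypothetical hub path
tokenized_dataset = dataset.map(tokenize_and_align_labels, batched=True)

args = TrainingArguments(
    output_dir="distilbert-FT-ner-BC5CDR-chem",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized_dataset["train"],
    eval_dataset=tokenized_dataset["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```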
Evaluation
Testing Data
The test set of the BC5CDR-chem dataset.
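Entity-level precision, recall, and micro-F1 of the kind reported below are conventionally computed with seqeval. A minimal sketch, with illustrative tag sequences rather than actual model outputs:

```python
from seqeval.metrics import f1_score, precision_score, recall_score

# Toy gold and predicted BIO tag sequences (illustrative only).
y_true = [["O", "B-Chemical", "I-Chemical", "O"]]
y_pred = [["O", "B-Chemical", "I-Chemical", "O"]]

print("Precision:", precision_score(y_true, y_pred))
print("Recall:   ", recall_score(y_true, y_pred))
print("Micro-F1: ", f1_score(y_true, y_pred))  # seqeval averages micro by default
```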
Results
- Precision: 0.89
- Recall: 0.87
- Micro-F1: 0.88
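A minimal usage sketch for running chemical NER with this checkpoint via the transformers pipeline; the example sentence is illustrative.

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="kbulutozler/distilbert-base-uncased-FT-ner-BC5CDR-chem",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

print(ner("Treatment with naloxone reversed the effects of morphine."))
```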
Environmental Impact
- Hardware Type: 1xRTX A4000
- Hours used: 00:07:00 (roughly 7 minutes)