# rttl-ai/BIOptimus v.0.4
## Model Details
Model Description: BIOptimus v.0.4 is a BERT-like biomedical language model pre-trained on PubMed abstracts using contextualized weight distillation and curriculum learning. It achieves state-of-the-art performance on several biomedical NER datasets from the BLURB benchmark.
- Developed by: rttl-ai
- Model Type: Language model
- Language(s): English
- License: Apache-2.0
- Resources for more information:
  - The model is introduced in the paper *BIOptimus: Pre-training an Optimal Biomedical Language Model with Curriculum Learning for Named Entity Recognition* (BioNLP workshop @ ACL 2023).
  - arXiv preprint of the paper
  - More information is available in this repository.
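
## How to Use

The snippet below is a minimal usage sketch with the Hugging Face `transformers` library. The repository id `rttl-ai/BIOptimus-0.4` and the masked-language-modeling head are assumptions inferred from the page title and the BERT-like architecture described above; check the Hub page for the exact identifier and recommended head.

```python
# Minimal sketch: load the pre-trained checkpoint and run masked-token prediction.
# The repository id below is an assumption; verify it on the Hugging Face Hub.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "rttl-ai/BIOptimus-0.4"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Predict the masked token in a biomedical sentence.
text = f"Aspirin inhibits platelet {tokenizer.mask_token} in vivo."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_positions].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```

For the NER results reported above, the checkpoint would typically be fine-tuned with a token-classification head (e.g. `AutoModelForTokenClassification`) on the respective BLURB NER datasets.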