Update README.md
README.md (CHANGED)
@@ -18,11 +18,20 @@ SapBERT [(Liu et al. 2020)](https://arxiv.org/pdf/2010.11784.pdf) trained with [
 ### Citation
 
 ```bibtex
-@
-
-
-
-
+@inproceedings{liu-etal-2021-self,
+    title = "Self-Alignment Pretraining for Biomedical Entity Representations",
+    author = "Liu, Fangyu and
+      Shareghi, Ehsan and
+      Meng, Zaiqiao and
+      Basaldella, Marco and
+      Collier, Nigel",
+    booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
+    month = jun,
+    year = "2021",
+    address = "Online",
+    publisher = "Association for Computational Linguistics",
+    url = "https://www.aclweb.org/anthology/2021.naacl-main.334",
+    pages = "4228--4238",
+    abstract = "Despite the widespread success of self-supervised learning via masked language models (MLM), accurately capturing fine-grained semantic relationships in the biomedical domain remains a challenge. This is of paramount importance for entity-level tasks such as entity linking where the ability to model entity relations (especially synonymy) is pivotal. To address this challenge, we propose SapBERT, a pretraining scheme that self-aligns the representation space of biomedical entities. We design a scalable metric learning framework that can leverage UMLS, a massive collection of biomedical ontologies with 4M+ concepts. In contrast with previous pipeline-based hybrid systems, SapBERT offers an elegant one-model-for-all solution to the problem of medical entity linking (MEL), achieving a new state-of-the-art (SOTA) on six MEL benchmarking datasets. In the scientific domain, we achieve SOTA even without task-specific supervision. With substantial improvement over various domain-specific pretrained MLMs such as BioBERT, SciBERT and PubMedBERT, our pretraining scheme proves to be both effective and robust.",
 }
-
 ```
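The abstract added in this commit describes SapBERT as an encoder whose representation space is self-aligned over UMLS synonyms, so that different surface forms of the same biomedical concept embed close together. A minimal sketch of that idea in use, assuming the publicly released `cambridgeltl/SapBERT-from-PubMedBERT-fulltext` checkpoint on the Hugging Face Hub and the standard `transformers`/`torch` APIs; the [CLS] pooling below is an illustrative assumption, not necessarily the repo's exact recipe:

```python
# Sketch: embed biomedical entity names with a SapBERT checkpoint and
# compare them by cosine similarity. Assumes the public
# cambridgeltl/SapBERT-from-PubMedBERT-fulltext model on the HF Hub.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

# Two synonyms for one concept, plus an unrelated name.
names = ["covid-19", "coronavirus infection", "high fever"]
inputs = tokenizer(names, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Use the [CLS] token vector as the entity representation (assumption).
embeddings = outputs.last_hidden_state[:, 0, :]

# Synonyms should score higher than the unrelated pair.
syn_sim = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
other_sim = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[2], dim=0)
print(f"synonym similarity:   {syn_sim:.3f}")
print(f"unrelated similarity: {other_sim:.3f}")
```

If the self-alignment described in the cited abstract holds, the synonym pair should score noticeably higher than the unrelated pair; that gap is what the paper's metric-learning objective optimizes for.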