fl399 committed
Commit 0e85bb7
1 Parent(s): 1265ee8

Update README.md

Files changed (1)
  1. README.md +6 -15
README.md CHANGED
@@ -18,20 +18,11 @@ SapBERT [(Liu et al. 2020)](https://arxiv.org/pdf/2010.11784.pdf) trained with [
  ### Citation

  ```bibtex
- @inproceedings{liu-etal-2021-self,
-     title = "Self-Alignment Pretraining for Biomedical Entity Representations",
-     author = "Liu, Fangyu and
-       Shareghi, Ehsan and
-       Meng, Zaiqiao and
-       Basaldella, Marco and
-       Collier, Nigel",
-     booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
-     month = jun,
-     year = "2021",
-     address = "Online",
-     publisher = "Association for Computational Linguistics",
-     url = "https://www.aclweb.org/anthology/2021.naacl-main.334",
-     pages = "4228--4238",
-     abstract = "Despite the widespread success of self-supervised learning via masked language models (MLM), accurately capturing fine-grained semantic relationships in the biomedical domain remains a challenge. This is of paramount importance for entity-level tasks such as entity linking where the ability to model entity relations (especially synonymy) is pivotal. To address this challenge, we propose SapBERT, a pretraining scheme that self-aligns the representation space of biomedical entities. We design a scalable metric learning framework that can leverage UMLS, a massive collection of biomedical ontologies with 4M+ concepts. In contrast with previous pipeline-based hybrid systems, SapBERT offers an elegant one-model-for-all solution to the problem of medical entity linking (MEL), achieving a new state-of-the-art (SOTA) on six MEL benchmarking datasets. In the scientific domain, we achieve SOTA even without task-specific supervision. With substantial improvement over various domain-specific pretrained MLMs such as BioBERT, SciBERT and PubMedBERT, our pretraining scheme proves to be both effective and robust.",
+ @inproceedings{liu2021learning,
+     title={Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking},
+     author={Liu, Fangyu and Vuli{\'c}, Ivan and Korhonen, Anna and Collier, Nigel},
+     booktitle={Proceedings of ACL-IJCNLP 2021},
+     month = aug,
+     year={2021}
  }
  ```
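
The README edited by this commit documents a SapBERT biomedical entity encoder. As a minimal usage sketch (not part of the commit itself), the snippet below shows one way to obtain entity-name embeddings with the `transformers` library; the model ID is an assumed placeholder standing in for whichever SapBERT checkpoint this repository hosts, and the [CLS] vector is taken as the entity representation, as described in the SapBERT papers.

```python
# Minimal sketch: embedding biomedical entity names with a SapBERT checkpoint.
# The model ID below is an assumption; substitute the ID of this repository.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

names = ["covid-19", "coronavirus infection", "high blood pressure"]
inputs = tokenizer(names, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# SapBERT uses the [CLS] token representation as the entity embedding.
embeddings = outputs.last_hidden_state[:, 0, :]
print(embeddings.shape)  # (3, hidden_size)
```

Nearest-neighbour search over such embeddings against a dictionary of concept names (e.g. from UMLS) is the typical entity-linking setup described in the cited papers.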