Collections
Collections including paper arxiv:2311.16079
- Attention Is All You Need
  Paper • 1706.03762 • Published • 41
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Paper • 1810.04805 • Published • 14
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
  Paper • 1907.11692 • Published • 7
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
  Paper • 1910.01108 • Published • 14

- DILA: Dictionary Label Attention for Mechanistic Interpretability in High-dimensional Multi-label Medical Coding Prediction
  Paper • 2409.10504 • Published • 1
- MEDIC: Towards a Comprehensive Framework for Evaluating LLMs in Clinical Applications
  Paper • 2409.07314 • Published • 50
- MultiMed: Massively Multimodal and Multitask Medical Understanding
  Paper • 2408.12682 • Published
- Towards Evaluating and Building Versatile Large Language Models for Medicine
  Paper • 2408.12547 • Published