- Self-Rewarding Language Models
  Paper • 2401.10020 • Published • 143
- Orion-14B: Open-source Multilingual Large Language Models
  Paper • 2401.12246 • Published • 11
- MambaByte: Token-free Selective State Space Model
  Paper • 2401.13660 • Published • 50
- MM-LLMs: Recent Advances in MultiModal Large Language Models
  Paper • 2401.13601 • Published • 44
Collections

Collections including paper arxiv:2407.09450
- What You Say = What You Want? Teaching Humans to Articulate Requirements for LLMs
  Paper • 2409.08775 • Published
- OmniQuery: Contextually Augmenting Captured Multimodal Memory to Enable Personal Question Answering
  Paper • 2409.08250 • Published • 1
- Synthetic continued pretraining
  Paper • 2409.07431 • Published • 1
- WonderWorld: Interactive 3D Scene Generation from a Single Image
  Paper • 2406.09394 • Published • 3
- SpreadsheetLLM: Encoding Spreadsheets for Large Language Models
  Paper • 2407.09025 • Published • 128
- Human-like Episodic Memory for Infinite Context LLMs
  Paper • 2407.09450 • Published • 60
- RULE: Reliable Multimodal RAG for Factuality in Medical Vision Language Models
  Paper • 2407.05131 • Published • 24
- We-Math: Does Your Large Multimodal Model Achieve Human-like Mathematical Reasoning?
  Paper • 2407.01284 • Published • 75
- Human-like Episodic Memory for Infinite Context LLMs
  Paper • 2407.09450 • Published • 60
- MUSCLE: A Model Update Strategy for Compatible LLM Evolution
  Paper • 2407.09435 • Published • 20
- Refuse Whenever You Feel Unsafe: Improving Safety in LLMs via Decoupled Refusal Training
  Paper • 2407.09121 • Published • 5
- ChatQA 2: Bridging the Gap to Proprietary LLMs in Long Context and RAG Capabilities
  Paper • 2407.14482 • Published • 24
- MInference 1.0: Accelerating Pre-filling for Long-Context LLMs via Dynamic Sparse Attention
  Paper • 2407.02490 • Published • 23
- Can Few-shot Work in Long-Context? Recycling the Context to Generate Demonstrations
  Paper • 2406.13632 • Published • 5
- LongRAG: Enhancing Retrieval-Augmented Generation with Long-context LLMs
  Paper • 2406.15319 • Published • 61
- Make Your LLM Fully Utilize the Context
  Paper • 2404.16811 • Published • 52