A Survey on Hallucination in Large Language Models: Principles, Taxonomy, Challenges, and Open Questions
Paper • 2311.05232
Lynx: An Open Source Hallucination Evaluation Model
Paper • 2407.08488
RAGTruth: A Hallucination Corpus for Developing Trustworthy Retrieval-Augmented Language Models
Paper • 2401.00396
MiniCheck: Efficient Fact-Checking of LLMs on Grounding Documents
Paper • 2404.10774
rin2401