RARe: Retrieval Augmented Retrieval with In-Context Examples
Abstract
We investigate whether in-context examples, widely used in decoder-only language models (LLMs), can improve embedding model performance in retrieval tasks. Unlike in LLMs, naively prepending in-context examples (query-document pairs) to the target query at inference time does not work out of the box. We introduce a simple approach to enable retrievers to use in-context examples. Our approach, RARe, finetunes a pre-trained model with in-context examples whose queries are semantically similar to the target query. This can be applied to adapt various base architectures (e.g., decoder-only language models, retriever models) and consistently achieves performance gains of up to +2.72% nDCG across various open-domain retrieval datasets (BeIR, RAR-b). In particular, we find RARe exhibits stronger out-of-domain generalization than models that use queries without in-context examples, similar to what is seen with in-context learning in LLMs. We further analyze the design choices of in-context example augmentation and lay a foundation for future work in this space.
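To make the mechanism concrete, here is a minimal sketch of RARe-style query augmentation. It assumes a sentence-transformers bi-encoder as a stand-in retriever; the prompt template and the names (example_pool, find_similar_examples, augment_query) are illustrative assumptions, not the paper's exact format, and the crucial step of finetuning the retriever on such augmented queries is omitted.

```python
# Sketch: prepend (query, document) pairs whose queries are semantically
# similar to the target query, then encode the augmented query.
# Assumptions: sentence-transformers bi-encoder; template is illustrative.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in base retriever

# Pool of (query, relevant document) pairs, e.g. drawn from training data.
example_pool = [
    ("what causes tides",
     "Tides are caused by the gravitational pull of the moon and sun."),
    ("how do vaccines work",
     "Vaccines train the immune system by presenting a harmless antigen."),
]

def find_similar_examples(target_query: str, k: int = 1):
    """Pick the k pool examples whose queries are most similar to the target."""
    pool_queries = [q for q, _ in example_pool]
    scores = util.cos_sim(
        encoder.encode(target_query, convert_to_tensor=True),
        encoder.encode(pool_queries, convert_to_tensor=True),
    )[0]
    top = scores.topk(min(k, len(example_pool))).indices.tolist()
    return [example_pool[i] for i in top]

def augment_query(target_query: str, k: int = 1) -> str:
    """Prepend in-context examples to the target query (hypothetical template)."""
    parts = [f"Example query: {q}\nExample document: {d}"
             for q, d in find_similar_examples(target_query, k)]
    return "\n\n".join(parts + [f"Query: {target_query}"])

# The finetuned retriever would encode this augmented query in place of
# the raw query; documents are encoded as usual.
query_embedding = encoder.encode(augment_query("why does the ocean have waves"))
```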
Community
We introduce RARe: Retrieval Augmented Retrieval with In-Context Examples -- an approach that finetunes models with semantically similar in-context examples to boost retrieval performance.
This is an automated message from Librarian Bot. The following papers, similar to this one, were recommended by the Semantic Scholar API:
- Making Text Embedders Few-Shot Learners (2024)
- Zero-Shot Dense Retrieval with Embeddings from Relevance Feedback (2024)
- Disentangling Questions from Query Generation for Task-Adaptive Retrieval (2024)
- jina-embeddings-v3: Multilingual Embeddings With Task LoRA (2024)
- VLM2Vec: Training Vision-Language Models for Massive Multimodal Embedding Tasks (2024)
If you want recommendations for any paper on Hugging Face, check out this Space.
You can ask Librarian Bot for paper recommendations directly by tagging it in a comment: @librarian-bot recommend