CoT-MAE MS-Marco Passage Reranker
CoT-MAE is a transformers-based Mask Auto-Encoder pre-training architecture designed for Dense Passage Retrieval. CoT-MAE MS-Marco Passage Reranker is a reranker trained on MS-MARCO hard negatives mined by the CoT-MAE retriever, using the Tevatron toolkit.
Details can be found in our paper and code.
Paper: ConTextual Mask Auto-Encoder for Dense Passage Retrieval.
Code: caskcsg/ir/cotmae
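The reranker is a cross-encoder that scores a concatenated query-passage pair. Below is a minimal usage sketch with Hugging Face transformers; the checkpoint id `caskcsg/cotmae_base_msmarco_reranker` and the single-logit classification head are assumptions, not confirmed by this card.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical checkpoint id; substitute the actual repo name of this model.
MODEL_ID = "caskcsg/cotmae_base_msmarco_reranker"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID).eval()

query = "what is dense passage retrieval"
passage = ("Dense passage retrieval encodes queries and passages into dense "
           "vectors and matches them by vector similarity.")

# Cross-encoder input: query and passage are encoded together as one pair.
inputs = tokenizer(query, passage, truncation=True, max_length=512,
                   return_tensors="pt")
with torch.no_grad():
    # Tevatron-style rerankers typically expose a single relevance logit.
    score = model(**inputs).logits[0, 0].item()
print(f"relevance score: {score:.4f}")
```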
Scores
MS-Marco Passage full-ranking + top-200 rerank
We first retrieve with the CoT-MAE MS-Marco Passage Retriever (cotmae_base_msmarco_retriever), then use this reranker to re-score the top-200 retrieval results. Performance is as follows.
| MRR@10 | Recall@1 | Recall@50 | Recall@200 | Queries Ranked |
|---|---|---|---|---|
| 0.43884 | 0.304871 | 0.903582 | 0.956734 | 6980 |
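As a sketch of the rerank step itself (reusing the same hypothetical checkpoint id as above): score every (query, passage) pair in the retriever's top-200 list with the cross-encoder and sort by score; the metrics above are computed over the reordered lists.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "caskcsg/cotmae_base_msmarco_reranker"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID).eval()

def rerank(query, passages, batch_size=32):
    """Score each (query, passage) pair and return passages sorted by score."""
    scores = []
    for start in range(0, len(passages), batch_size):
        batch = passages[start:start + batch_size]
        inputs = tokenizer([query] * len(batch), batch, truncation=True,
                           padding=True, max_length=512, return_tensors="pt")
        with torch.no_grad():
            logits = model(**inputs).logits
        scores.extend(logits[:, 0].tolist())  # one relevance logit per pair
    return sorted(zip(scores, passages), reverse=True)

# top200 would come from cotmae_base_msmarco_retriever; placeholders here.
top200 = ["first candidate passage ...", "second candidate passage ..."]
for score, passage in rerank("example query", top200)[:10]:
    print(f"{score:.4f}\t{passage[:60]}")
```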
Citations
If you find our work useful, please cite our paper.
@misc{wu2022contextual,
  doi = {10.48550/ARXIV.2208.07670},
  url = {https://arxiv.org/abs/2208.07670},
  author = {Wu, Xing and Ma, Guangyuan and Lin, Meng and Lin, Zijia and Wang, Zhongyuan and Hu, Songlin},
  keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences},
  title = {ConTextual Mask Auto-Encoder for Dense Passage Retrieval},
  publisher = {arXiv},
  year = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}