---
datasets:
- BeIR/msmarco
language:
- en
---

This model consists of two BERT encoders fine-tuned with a contrastive learning objective.
One encoder is responsible for short queries, and the other for the longer documents that contain the answer to a query.
After encoding a collection of documents, the most relevant documents for a given query can be retrieved by running a
nearest-neighbor search over the document embeddings using the query embedding.
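
Below is a minimal usage sketch, assuming the two encoders are published as separate checkpoints (the checkpoint names are placeholders to be replaced with the actual model IDs) and that sentence embeddings are obtained by mean pooling the last hidden state; CLS pooling would be an equally common alternative.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Placeholder checkpoint names -- substitute the actual query and
# document encoders released with this model.
QUERY_ENCODER = "path/to/query-encoder"
DOC_ENCODER = "path/to/document-encoder"

query_tokenizer = AutoTokenizer.from_pretrained(QUERY_ENCODER)
query_encoder = AutoModel.from_pretrained(QUERY_ENCODER)
doc_tokenizer = AutoTokenizer.from_pretrained(DOC_ENCODER)
doc_encoder = AutoModel.from_pretrained(DOC_ENCODER)


def encode(texts, tokenizer, model):
    # Tokenize a batch of texts and mean-pool the last hidden state
    # into one embedding per text (pooling strategy is an assumption).
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    mask = inputs["attention_mask"].unsqueeze(-1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)


documents = [
    "The capital of France is Paris.",
    "BERT is a transformer-based language model.",
]
doc_embeddings = encode(documents, doc_tokenizer, doc_encoder)

query_embedding = encode(["what is the capital of france"], query_tokenizer, query_encoder)

# Nearest-neighbor search: rank documents by cosine similarity to the query.
scores = torch.nn.functional.cosine_similarity(query_embedding, doc_embeddings)
best = scores.argmax().item()
print(documents[best])
```

For larger collections, the document embeddings would typically be computed once and stored in an approximate nearest-neighbor index (e.g. FAISS) so that queries can be answered without re-encoding the corpus.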