---
datasets:
- BeIR/msmarco
language:
- en
---
This model pairs two BERT encoders fine-tuned with a contrastive learning objective:
one is responsible for encoding short queries, and the other for the longer documents that contain the answer to the query.
After encoding a collection of documents, you can run a nearest-neighbor search against the query embedding
to fetch the most relevant documents.
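
The retrieval step can be sketched as follows. This is a minimal illustration, assuming the query and document embeddings have already been produced by the two encoders; the toy vectors, the `top_k_documents` helper, and the use of cosine similarity as the scoring function are illustrative assumptions, not part of this model's released code.

```python
import numpy as np

def top_k_documents(query_emb, doc_embs, k=2):
    # Normalize so the dot product equals cosine similarity
    # (an assumed scoring choice for this sketch).
    q = query_emb / np.linalg.norm(query_emb)
    d = doc_embs / np.linalg.norm(doc_embs, axis=1, keepdims=True)
    scores = d @ q
    # Indices of the k highest-scoring documents, best first.
    order = np.argsort(-scores)[:k]
    return order, scores[order]

# Toy vectors standing in for the query-encoder and document-encoder outputs.
query = np.array([1.0, 0.0, 1.0])
docs = np.array([
    [0.9, 0.1, 1.1],   # close to the query
    [0.0, 1.0, 0.0],   # orthogonal to the query
    [1.0, 0.0, 0.0],   # partially similar
])

idx, scores = top_k_documents(query, docs, k=2)
print(idx)  # most relevant document indices, best first
```

In practice you would encode the whole corpus once with the document encoder, store the embeddings in an index (e.g. FAISS), and only run the query encoder at search time.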