---
license: mit
datasets:
  - philipphager/baidu-ultr-pretrain
  - philipphager/baidu-ultr_uva-mlm-ctr
metrics:
  - dcg@1
  - dcg@3
  - dcg@5
  - dcg@10
  - ndcg@10
  - mrr@10
---

# Naive Listwise MonoBERT trained on Baidu-ULTR

A Flax-based MonoBERT cross encoder trained on the Baidu-ULTR dataset with a listwise softmax cross-entropy loss on clicks. The loss is called "naive" because we use user clicks directly as a relevance signal, without any position-bias correction. For more details, see our paper.
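
For illustration, here is a minimal JAX sketch of such a listwise loss. The helper `naive_listwise_loss` is hypothetical and not the training code from our repository; it only shows the idea of treating clicks as softmax targets over a query's candidate list:

```python
import jax.numpy as jnp
from jax import nn


def naive_listwise_loss(scores: jnp.ndarray, clicks: jnp.ndarray) -> jnp.ndarray:
    """Listwise softmax cross-entropy over one query's candidate documents.

    Hypothetical sketch: `scores` are the cross encoder's relevance logits
    for the documents of a single query, `clicks` are binary click labels
    used directly as relevance targets ("naive": no position-bias correction).
    """
    log_probs = nn.log_softmax(scores, axis=-1)
    # Each clicked document contributes its negative log-probability.
    return -jnp.sum(clicks * log_probs, axis=-1)


# Example: three candidate documents, the user clicked the first one.
scores = jnp.array([2.0, 0.5, -1.0])
clicks = jnp.array([1.0, 0.0, 0.0])
loss = naive_listwise_loss(scores, clicks)
```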

## Usage

```python
# `src.model` refers to the model class in the accompanying code repository.
from src.model import ListwiseCrossEncoder

model = ListwiseCrossEncoder.from_pretrained(
    "philipphager/baidu-ultr_uva-bert_naive-listwise",
)

# `batch` contains tokenized query-document pairs; the model returns
# relevance scores for ranking.
model(batch)
```

## Test Results on Baidu-ULTR Expert Annotations

| Model           | Log-likelihood | DCG@1  | DCG@3  | DCG@5  | DCG@10 | nDCG@10 | MRR@10 |
|-----------------|----------------|--------|--------|--------|--------|---------|--------|
| Naive Pointwise | 0.2272         | 1.6836 | 3.5616 | 4.8822 | 7.4244 | 0.3640  | 0.6096 |
| Naive Listwise  | -              | 1.9738 | 4.1609 | 5.6861 | 8.5432 | 0.4091  | 0.6436 |