philipphager committed
Commit
783bf92
1 Parent(s): c2b9491

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -14,7 +14,7 @@ metrics:
 ---
 
 # Listwise MonoBERT trained on Baidu-ULTR using the Dual Learning Algorithm (DLA)
-A flax-based MonoBERT cross encoder trained on the [Baidu-ULTR](https://arxiv.org/abs/2207.03051) dataset with a **listwise DLA objective on clicks**. Following, [Ai et al.](https://arxiv.org/abs/1804.05938), the dual learning algorithm jointly infers item relevance (using a BERT model) and position bias (in our case, a single embedding parameter per rank), both by optimizing a **listwise softmax cross-entropy loss**. For more info, [read our paper](https://arxiv.org/abs/2404.02543) and [find the code for this model here](https://github.com/philipphager/baidu-bert-model).
+A flax-based MonoBERT cross encoder trained on the [Baidu-ULTR](https://arxiv.org/abs/2207.03051) dataset with a **listwise DLA objective on clicks**. Following [Ai et al.](https://arxiv.org/abs/1804.05938), the dual learning algorithm jointly infers item relevance (using a BERT model) and position bias (in our case, a single embedding parameter per rank), both by optimizing a **listwise softmax cross-entropy loss**. For more info, [read our paper](https://arxiv.org/abs/2404.02543) and [find the code for this model here](https://github.com/philipphager/baidu-bert-model).
 
 ## Test Results on Baidu-ULTR Expert Annotations
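
The README text above describes the dual learning objective in one sentence; as a rough illustration, here is a minimal NumPy sketch of the two listwise softmax cross-entropy losses from Ai et al.'s DLA, where a relevance model and a per-rank position-bias parameter debias each other via inverse-propensity weights. All function and variable names are our own, and treating the weights as constants (a stop-gradient) is a simplification, not the repository's actual flax implementation.

```python
import numpy as np

def log_softmax(x):
    # Numerically stable log-softmax over a single ranked list.
    x = x - x.max()
    return x - np.log(np.exp(x).sum())

def dla_losses(relevance_logits, position_logits, clicks):
    """Listwise DLA losses for one query (sketch, names are illustrative).

    relevance_logits: scores from the relevance model (e.g. BERT), shape (n,)
    position_logits:  one learned scalar per rank (position bias), shape (n,)
    clicks:           observed clicks for the ranked list, shape (n,)
    """
    log_rel = log_softmax(relevance_logits)
    log_exam = log_softmax(position_logits)
    # Inverse-propensity weights, normalized so rank 1 has weight 1;
    # in DLA these are treated as constants when differentiating.
    ipw_exam = np.exp(log_exam[0] - log_exam)
    ipw_rel = np.exp(log_rel[0] - log_rel)
    # Each model's listwise softmax cross-entropy on clicks is
    # debiased by the other model's inverse-propensity weights.
    loss_ranker = -(clicks * ipw_exam * log_rel).sum()
    loss_propensity = -(clicks * ipw_rel * log_exam).sum()
    return loss_ranker, loss_propensity
```

Minimizing both losses jointly is what lets the model infer relevance and position bias from clicks alone, without expert labels.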