philipphager committed
Commit 44584ab
Parent: ed519da

Update README.md

Files changed (1)
  1. README.md +9 -8
README.md CHANGED
@@ -18,14 +18,15 @@ A flax-based MonoBERT cross encoder trained on the [Baidu-ULTR](https://arxiv.or
 
  ## Test Results on Baidu-ULTR Expert Annotations
 
- | Model                | log-likelihood | DCG@1    | DCG@3    | DCG@5    | DCG@10   | nDCG@10  | MRR@10  |
- |----------------------|----------------|----------|----------|----------|----------|----------|---------|
- | Pointwise Naive      | 0.2268         | 1.6411   | 3.4624   | 4.7520   | 7.2506   | 0.3567   | 0.6089  |
- | Pointwise IPS        | 0.2216         | 1.2948   | 2.8106   | 3.9767   | 6.2961   | 0.3075   | 0.5343  |
- | Pointwise Two Tower  | 0.2176         | 1.6288   | 3.4712   | 4.8220   | 7.4556   | 0.3668   | 0.6071  |
- | **Listwise Naive**   | -              | **1.9738** | **4.1609** | **5.6861** | **8.5432** | **0.4091** | **0.6436** |
- | Listwise IPS         | -              | 1.7466   | 3.6378   | 4.9797   | 7.5790   | 0.3665   | 0.6112  |
- | Listwise DLA         | -              | 1.7954   | 3.8054   | 5.2083   | 7.9342   | 0.3848   | 0.6261  |
+ | Model               | Log-likelihood | DCG@1 | DCG@3 | DCG@5 | DCG@10 | nDCG@10 | MRR@10 |
+ |---------------------|----------------|-------|-------|-------|--------|---------|--------|
+ | Pointwise Naive     | 0.227          | 1.641 | 3.462 | 4.752 | 7.251  | 0.357   | 0.609  |
+ | Pointwise Two-Tower | 0.218          | 1.629 | 3.471 | 4.822 | 7.456  | 0.367   | 0.607  |
+ | Pointwise IPS       | 0.222          | 1.295 | 2.811 | 3.977 | 6.296  | 0.307   | 0.534  |
+ | Listwise Naive      | -              | 1.947 | 4.108 | 5.614 | 8.478  | 0.405   | 0.639  |
+ | Listwise IPS        | -              | 1.671 | 3.530 | 4.873 | 7.450  | 0.361   | 0.603  |
+ | Listwise DLA        | -              | 1.796 | 3.730 | 5.125 | 7.802  | 0.377   | 0.615  |
+
 
 
  ## Usage
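
The table above reports standard ranking metrics over graded expert relevance labels. As a reference, the sketch below shows one common way to compute DCG@k, nDCG@k, and MRR@k for a single ranked list; the gain/discount convention (2^rel - 1 gain with a log2 discount) and the relevance threshold used for MRR are assumptions, and the helper names are illustrative rather than the repository's own evaluation code.

```python
import numpy as np


def dcg_at_k(relevance, k):
    """DCG@k using the 2^rel - 1 gain and 1/log2(rank + 1) discount (assumed convention)."""
    relevance = np.asarray(relevance, dtype=float)[:k]
    gains = 2.0 ** relevance - 1.0
    discounts = 1.0 / np.log2(np.arange(2, relevance.size + 2))
    return float(np.sum(gains * discounts))


def ndcg_at_k(relevance, k):
    """DCG@k normalized by the DCG of the ideal (descending-relevance) ordering."""
    ideal = dcg_at_k(sorted(relevance, reverse=True), k)
    return dcg_at_k(relevance, k) / ideal if ideal > 0 else 0.0


def mrr_at_k(relevance, k, threshold=1):
    """Reciprocal rank of the first document with relevance >= threshold (threshold is an assumption)."""
    for rank, rel in enumerate(relevance[:k], start=1):
        if rel >= threshold:
            return 1.0 / rank
    return 0.0


# Toy example: expert labels of the top documents, ordered by the model's scores.
labels_ranked_by_model = [3, 0, 2, 4, 1, 0, 0, 2, 0, 1]
print("DCG@10 :", dcg_at_k(labels_ranked_by_model, 10))
print("nDCG@10:", ndcg_at_k(labels_ranked_by_model, 10))
print("MRR@10 :", mrr_at_k(labels_ranked_by_model, 10))
```

The toy example prints the three metrics for one query; the figures in the table would correspond to averaging such per-query values over the annotated test queries.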