philipphager committed
Commit 4d90931
1 Parent(s): deb2616

Update README.md

Files changed (1):
  1. README.md +1 -3

README.md CHANGED
@@ -23,7 +23,7 @@ A flax-based MonoBERT cross encoder trained on the [Baidu-ULTR](https://arxiv.or
 | **Naive Listwise** | - | 1.9738 | 4.1609 | 5.6861 | 8.5432 | 0.4091 | 0.6436 |
 
 ## Usage
-Here is an example with a mock input batch for how to download and call the model:
+Here is an example of downloading the model and calling it for inference on a mock batch of input data. For more details on how to use the model on the Baidu-ULTR dataset, take a look at our [training](https://github.com/philipphager/baidu-bert-model/blob/main/main.py) and [evaluation scripts](https://github.com/philipphager/baidu-bert-model/blob/main/eval.py) in our code repository.
 
 ```Python
 import jax.numpy as jnp
@@ -67,8 +67,6 @@ outputs = model(batch, train=False)
 print(outputs)
 ```
 
-For more details on how to use the model with real data from Baidu-ULTR, take a look at the [evaluation script of our model repository](https://github.com/philipphager/baidu-bert-model/blob/main/eval.py).
-
 ## Reference
 ```
 @inproceedings{Hager2024BaiduULTR,
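The usage sentence added in this diff describes downloading the model and calling it for inference on a mock batch. A minimal sketch of that call pattern is below; note the batch field names and shapes are assumptions for illustration (the diff only confirms `import jax.numpy as jnp` and `outputs = model(batch, train=False)`), and the model-loading step is elided because it depends on the repository's own model class.

```python
import jax.numpy as jnp

# Hypothetical mock batch: the field names and token IDs here are
# assumptions for illustration, not confirmed by the model card.
batch = {
    "tokens": jnp.array([[101, 2054, 2003, 102, 7592, 102]]),   # (1, 6) token IDs
    "attention_mask": jnp.ones((1, 6), dtype=jnp.int32),        # all positions attended
    "token_types": jnp.array([[0, 0, 0, 0, 1, 1]]),             # query vs. document segment
}

# The diff confirms the call pattern `model(batch, train=False)`;
# obtaining `model` requires the repository's custom model class,
# so that step is omitted from this sketch:
# outputs = model(batch, train=False)
# print(outputs)

print(batch["tokens"].shape)
```

For real inputs, the tokenized query/document pairs from Baidu-ULTR would replace the mock arrays above, as shown in the repository's training and evaluation scripts.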