philipphager committed
Commit 40c0ec6
1 Parent(s): 7a1433c

Update README.md

Files changed (1)
  1. README.md +6 -0
README.md CHANGED
@@ -11,6 +11,12 @@ metrics:
 - dcg@10
 - ndcg@10
 - mrr@10
+co2_eq_emissions:
+  emissions: 2090
+  source: "Calculated using the [ML CO2 impact calculator](https://mlco2.github.io/impact/#compute), training for 4 x 45 hours with a carbon efficiency of 0.029 kg/kWh. You can inspect the carbon efficiency of the French national grid provider here: https://www.rte-france.com/eco2mix/les-emissions-de-co2-par-kwh-produit-en-france"
+  training_type: "Pre-training"
+  geographical_location: "Grenoble, France"
+  hardware_used: "4 NVIDIA H100-80GB GPUs"
 ---
 
 # Listwise MonoBERT trained on Baidu-ULTR with Inverse Propensity Scoring (IPS)
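The `source` field above describes how the emissions figure was derived (4 GPUs for 45 hours each at 0.029 kg CO2/kWh). A minimal sketch of that arithmetic, assuming the Hub's `emissions` field is in grams of CO2-eq and an average draw of ~0.4 kW per GPU (the actual measured power is not stated in the commit; it is inferred here so the numbers line up with the reported 2090):

```python
# Sketch of the ML CO2 impact calculation described in the metadata's `source` field.
# From the source: 4 GPUs x 45 hours, carbon efficiency 0.029 kg CO2-eq/kWh.
# ASSUMPTION: ~0.4 kW average draw per H100 (not stated in the commit).
gpus = 4
hours_per_gpu = 45
avg_power_kw = 0.4          # assumed average per-GPU power draw
carbon_efficiency = 0.029   # kg CO2-eq per kWh (French grid, per the source)

energy_kwh = gpus * hours_per_gpu * avg_power_kw      # 4 * 45 * 0.4 = 72 kWh
emissions_g = energy_kwh * carbon_efficiency * 1000   # convert kg to grams

print(round(emissions_g))  # ~2088 g, close to the reported emissions: 2090
```

Under these assumptions the estimate lands within a rounding error of the value committed to the model card.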