philipphager committed
Commit 2cb7faf
Parent(s): a844455

Update README.md

Files changed (1):
  README.md +6 -0
README.md CHANGED
@@ -11,6 +11,12 @@ metrics:
 - dcg@10
 - ndcg@10
 - mrr@10
+co2_eq_emissions:
+  emissions: 2090
+  source: "Calculated using the [ML CO2 impact calculator](https://mlco2.github.io/impact/#compute), training for 4 x 45 hours with a carbon efficiency of 0.029 kg/kWh. You can inspect the carbon efficiency of the French national grid provider here: https://www.rte-france.com/eco2mix/les-emissions-de-co2-par-kwh-produit-en-france"
+  training_type: "Pre-training"
+  geographical_location: "Grenoble, France"
+  hardware_used: "4 NVIDIA H100-80GB GPUs"
 ---
 
 # Two Tower MonoBERT trained on Baidu-ULTR
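The `source` field above spells out the inputs to the estimate (4 GPUs x 45 hours, 0.029 kg CO2/kWh for the French grid). As a rough worked example of that arithmetic, not part of the commit, the sketch below reproduces the reported figure under one added assumption: an average draw of about 400 W per H100, which the commit does not state, and the Hub convention that `co2_eq_emissions.emissions` is given in grams of CO2-eq.

```python
# Sketch of the ML-CO2-impact-style estimate: energy (kWh) x grid carbon efficiency.
# ASSUMPTION: 400 W average per-GPU draw (not stated in the commit); with it,
# 4 x 45 h x 0.4 kW x 0.029 kg/kWh ~= 2.09 kg, i.e. roughly the reported 2090 g.

def co2_grams(num_gpus: int, hours_per_gpu: float,
              avg_gpu_power_kw: float, kg_co2_per_kwh: float) -> float:
    """Return estimated emissions in grams of CO2-eq."""
    energy_kwh = num_gpus * hours_per_gpu * avg_gpu_power_kw
    return energy_kwh * kg_co2_per_kwh * 1000.0


if __name__ == "__main__":
    # Values taken from the commit, except the assumed 0.4 kW per GPU.
    print(round(co2_grams(4, 45, 0.4, 0.029)))  # -> 2088, close to 2090
```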