VictorSanh committed
Commit 761c5ab • Parent(s): 9a6ed1e

Update readme and doc from the 80b repo

README.md CHANGED
@@ -4,6 +4,7 @@ tags:
 - multimodal
 - text
 - image
+- image-to-text
 license: other
 datasets:
 - HuggingFaceM4/OBELICS
@@ -305,9 +306,9 @@ Similarly to the base IDEFICS models, we performed checkpoint selection to stop

 ## Hardware

-The IDEFICS models were trained on an AWS SageMaker cluster with 8x80GB A100 GPUs nodes and EFA network.
+The IDEFICS models were trained on an AWS SageMaker cluster with 8x80GB A100 GPUs nodes and EFA network.

-- IDEFICS-80B took ~28 days of training on 64 nodes (512 GPUs).
+- IDEFICS-80B took ~28 days of training on 64 nodes (512 GPUs).
 - IDEFICS-80b-instruct finetuned the base model for ~3 days on 48 nodes (384 GPUs).


@@ -316,6 +317,47 @@ The IDEFICS models were trained on an AWS SageMaker cluster with 8x80GB A100 GPU
 The training software is built on top of HuggingFace Transformers + Accelerate, and [DeepSpeed ZeRO-3](https://github.com/microsoft/DeepSpeed) for training, and [WebDataset](https://github.com/webdataset/webdataset) for data loading.


+## Environmental Impact
+
+We distinguish the 3 phases of the creation of IDEFICS and report our carbon emissions separately for each one of them:
+
+*Preliminary experimentation*
+- **Hardware Type:** Intel Cascade Lake CPUs, NVIDIA V100 and A100 GPUs
+- **Hours used:** 460,000 CPU hours, 385,000 V100 GPU hours, and 300,000 A100 GPU hours
+- **Cloud Provider:** N/A (Jean Zay cluster)
+- **Compute Region:** France (57g CO2eq/kWh)
+- **Carbon Emitted:** 16,714 kgs of CO2eq
+
+*IDEFICS-9b pretraining*
+- **Hardware Type:** 128 NVIDIA A100 GPUs
+- **Hours used:** 350 hours
+- **Cloud Provider:** AWS
+- **Compute Region:** US-West 2 (288g CO2eq/kWh)
+- **Carbon Emitted:** 5,160 kg of CO2eq
+
+*IDEFICS-9b-instruct finetuning*
+- **Hardware Type:** 128 NVIDIA A100 GPUs
+- **Hours used:** 70 hours
+- **Cloud Provider:** AWS
+- **Compute Region:** US-West 2 (288g CO2eq/kWh)
+- **Carbon Emitted:** 1,032 kg of CO2eq
+
+*IDEFICS-80b pretraining*
+- **Hardware Type:** 512 NVIDIA A100 GPUs
+- **Hours used:** 672 hours (28 days)
+- **Cloud Provider:** AWS
+- **Compute Region:** US-West 2 (288g CO2eq/kWh)
+- **Carbon Emitted:** 39,498 kg of CO2eq
+
+*IDEFICS-80b-instruct finetuning*
+- **Hardware Type:** 384 NVIDIA A100 GPUs
+- **Hours used:** 72 hours (3 days)
+- **Cloud Provider:** AWS
+- **Compute Region:** US-West 2 (288g CO2eq/kWh)
+- **Carbon Emitted:** 3,174 kg of CO2eq
+
+This means that the total carbon footprint of the entire IDEFICS project can be estimated at **65.57 tons of CO2eq**, which is roughly equal to 168,092 miles driven by an average gasoline-powered car or 8.3 homes' energy use for one year, according to the [US Environmental Protection Agency](https://www.epa.gov/energy/greenhouse-gas-equivalencies-calculator).
+
 # Bias, Risks, and Limitations

 Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
@@ -438,4 +480,4 @@ Stas Bekman*, Léo Tronchon*, Hugo Laurençon*, Lucile Saulnier*, Amanpreet Sing

 # Model Card Contact

-Please open a discussion on the Community tab!
+Please open a discussion on the Community tab!
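The Hardware bullets and the new Environmental Impact entries describe the same training runs, so the node counts, GPU counts, and durations can be cross-checked against each other. The sketch below does exactly that; the per-run A100 GPU-hour totals it derives are simple products of the figures quoted above, not numbers stated in the model card.

```python
# Cross-check the Hardware section against the Environmental Impact section.
GPUS_PER_NODE = 8  # each SageMaker node has 8x80GB A100s, per the Hardware section

runs = {
    # name: (nodes, hours)
    "IDEFICS-80b pretraining": (64, 28 * 24),         # ~28 days on 64 nodes
    "IDEFICS-80b-instruct finetuning": (48, 3 * 24),  # ~3 days on 48 nodes
}

for name, (nodes, hours) in runs.items():
    gpus = nodes * GPUS_PER_NODE
    print(f"{name}: {gpus} GPUs x {hours} h = {gpus * hours:,} A100 GPU-hours")

# IDEFICS-80b pretraining: 512 GPUs x 672 h = 344,064 A100 GPU-hours
# IDEFICS-80b-instruct finetuning: 384 GPUs x 72 h = 27,648 A100 GPU-hours
# The GPU counts and durations match the 512 GPUs / 672 hours and
# 384 GPUs / 72 hours entries in the Environmental Impact section.
```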
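The training-software line in the diff names the stack: HuggingFace Transformers + Accelerate, DeepSpeed ZeRO-3 for sharded training, and WebDataset for streaming data. The sketch below only illustrates how those pieces are typically wired together and is not the IDEFICS training code; the shard pattern, the `jpg`/`txt` keys, batch size, learning rate, and the choice of the 9b checkpoint are placeholder assumptions, and the interleaved image/text prompt format follows the usage examples elsewhere in this model card. It is meant to be started with `accelerate launch` on one or more A100s with DeepSpeed installed.

```python
import torch
import webdataset as wds
from torch.utils.data import DataLoader
from accelerate import Accelerator
from accelerate.utils import DeepSpeedPlugin
from transformers import AutoProcessor, IdeficsForVisionText2Text

# ZeRO-3 shards parameters, gradients, and optimizer state across all GPUs.
accelerator = Accelerator(
    deepspeed_plugin=DeepSpeedPlugin(zero_stage=3, gradient_accumulation_steps=1)
)

checkpoint = "HuggingFaceM4/idefics-9b"  # placeholder: the base 9b checkpoint
processor = AutoProcessor.from_pretrained(checkpoint)
model = IdeficsForVisionText2Text.from_pretrained(checkpoint, torch_dtype=torch.bfloat16)

# Stream (image, caption) pairs from tar shards; the shard pattern and the
# "jpg"/"txt" keys are placeholders for whatever the shards actually contain.
dataset = wds.WebDataset("shards/train-{000000..000099}.tar").decode("pil").to_tuple("jpg", "txt")
loader = DataLoader(dataset, batch_size=4, collate_fn=lambda samples: samples)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

model.train()
for batch in loader:
    # Interleave each image with its caption to form a multimodal prompt.
    prompts = [[image, caption] for image, caption in batch]
    inputs = processor(prompts, padding=True, return_tensors="pt").to(accelerator.device)
    outputs = model(**inputs, labels=inputs["input_ids"])  # standard next-token loss
    accelerator.backward(outputs.loss)
    optimizer.step()
    optimizer.zero_grad()
```

Sharding with ZeRO-3 is what makes the larger variant trainable across many nodes: no single GPU ever holds the full set of parameters, gradients, and optimizer states at once.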
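The new Environmental Impact section gives per-phase emissions and a project-wide total of **65.57 tons of CO2eq**. A quick arithmetic check, using only the per-phase figures quoted above, confirms the total:

```python
# Per-phase carbon emissions from the Environmental Impact section (kg CO2eq).
emissions_kg = {
    "preliminary experimentation": 16_714,
    "IDEFICS-9b pretraining": 5_160,
    "IDEFICS-9b-instruct finetuning": 1_032,
    "IDEFICS-80b pretraining": 39_498,
    "IDEFICS-80b-instruct finetuning": 3_174,
}

total_kg = sum(emissions_kg.values())
print(f"Total: {total_kg:,} kg = {total_kg / 1000:.2f} metric tons of CO2eq")
# Total: 65,578 kg = 65.58 metric tons of CO2eq, i.e. the ~65.57 tons quoted in the card.
```

The 168,092-mile and 8.3-home equivalences come from the EPA greenhouse gas equivalencies calculator linked in the card and are not recomputed here.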