Files changed (1)
  1. README.md +8 -2
README.md CHANGED
@@ -52,13 +52,12 @@ language:
 pipeline_tag: text-generation
 ---
 
-# <span style="color:red"><b>WARNING:</b> Intermediary checkpoint at global step 91100. This checkpoint is not a fully trained model. Evaluations of intermediary checkpoints and the final model will be added when conducted (see below).</span>
 
 # <p>BLOOM LM<br/> _BigScience Large Open-science Open-access Multilingual Language Model_ <br/>Model Card</p>
 <img src="https://assets.website-files.com/6139f3cdcbbff3a68486761d/613cd8997b270da063e230c5_Tekengebied%201-p-500.png" alt="BigScience Logo" width="200"/>
 
 
-Version 1.3 / 3.July.2022 - Checkpoint: **Global step 91100** 
+Version 1.3 / 3.July.2022 - Checkpoint: **Global step 95000** - Number of seen tokens: **398B**
 
 # Table of Contents
 1. [Model Details](#model-details)
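
As a plausibility check on the token count (an assumption, not from the card: BLOOM's main training phase used a global batch of 2048 sequences of 2048 tokens each, and the batch-size ramp-up is ignored here), the arithmetic at step 95000 lands close to the stated figure:

$$
2048 \times 2048 \times 95\,000 = 398{,}458{,}880{,}000 \approx 398\text{B tokens}
$$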
@@ -140,6 +139,7 @@ Please see [the BLOOM training README](https://github.com/bigscience-workshop/bi
 * Sequence length of 2048 tokens used (see [BLOOM tokenizer](https://huggingface.co/bigscience/tokenizer), [tokenizer description](#tokenization))
 
 **Objective Function:** Cross Entropy with mean reduction (see [API documentation](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss)).
+
 
 ### Compute infrastructure
 Jean Zay Public Supercomputer, provided by the French government (see [announcement](https://www.enseignementsup-recherche.gouv.fr/fr/signature-du-marche-d-acquisition-de-l-un-des-supercalculateurs-les-plus-puissants-d-europe-46733)).
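
For readers checking the objective named above, a minimal PyTorch sketch of cross entropy with mean reduction over next-token logits; the shapes and vocabulary size are illustrative toy values, not BLOOM's actual ones:

```python
import torch
import torch.nn as nn

# Cross entropy with mean reduction, as linked in the card.
# "mean" is already the default reduction for nn.CrossEntropyLoss.
loss_fn = nn.CrossEntropyLoss(reduction="mean")

batch, seq_len, vocab = 4, 2048, 1000                 # toy shapes
logits = torch.randn(batch, seq_len, vocab)           # model outputs
targets = torch.randint(0, vocab, (batch, seq_len))   # next-token ids

# CrossEntropyLoss expects class logits along the second axis, so flatten
# batch and sequence positions into one axis of examples.
loss = loss_fn(logits.view(-1, vocab), targets.view(-1))
print(loss.item())  # scalar: mean negative log-likelihood per token
```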
@@ -372,6 +372,11 @@ Intentionally using the model for harm, violating [human rights](#human-rights),
 
 - Generating content without attribution to the model, as specified in the [RAIL License, Use Restrictions](https://huggingface.co/spaces/bigscience/license)
 
+## Intermediate checkpoints
+
+For academic (or any other) use, we have published intermediate checkpoints corresponding to the model state every 5,000 training steps. Please follow [this link](https://huggingface.co/bigscience/bloom-176-intermediate) to access these checkpoints.
+
+
 ## Intended Users
 
 ### Direct Users
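
A sketch of loading one of these intermediate checkpoints with `transformers`: the repository id comes from the link in the added section, but the revision naming (e.g. `global_step95000`) is an assumption to verify against the branches listed on the model page.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "bigscience/bloom-176-intermediate"  # from the link above

# ASSUMPTION: each 5,000-step checkpoint is exposed as a git revision of the
# repo; the branch name below is hypothetical, so check the repo's branches.
revision = "global_step95000"

tokenizer = AutoTokenizer.from_pretrained(repo, revision=revision)
model = AutoModelForCausalLM.from_pretrained(repo, revision=revision)
```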
@@ -408,6 +413,7 @@ Intentionally using the model for harm, violating [human rights](#human-rights),
 
 ---
 
+
 # Risks and Limitations
 *This section identifies foreseeable harms and misunderstandings.*
 