Update README.md
README.md CHANGED
@@ -1,6 +1,6 @@
 ---
 license: odc-by
-viewer:
+viewer: false
 task_categories:
 - text-generation
 language:
@@ -39,7 +39,7 @@ At the moment, there are six versions of Dolma available:
 
 | **Version** | **Default?** | **Release Date** | **Size** (gzip) | **Description** |
 |--|:--:|--|--|--|
-| `v1_7` | ✅ | 2024-04-15 |
+| `v1_7` | ✅ | 2024-04-15 | 4.5 TB | Used to train [OLMo-7B-v1.7](https://huggingface.co/allenai/OLMo-7b-v1.7). |
 | `v1_6` | | 2024-01-31 | 5.4 TB | An update to v1.5 with some bug-fixes. |
 | `v1_6-sample` | | 2024-01-31 | 16.4 GB | A smaller sample of Dolma, with roughly 10 billion tokens. Useful for data exploration. |
 | `v1_5` | | 2023-10-31 | 6.4 TB | The version of Dolma used to train [OLMo-1B](https://huggingface.co/allenai/OLMo-1B). Roughly 3 trillion tokens. |
@@ -51,20 +51,20 @@ At the moment, there are six versions of Dolma available:
 
 | **Source** | **Provenance** | **New?** | **Documents** (millions) | **OLMo tokens** (billions) | **Sample Proportion** | **Cutoff Date** | **Processing**
 |--|--|--|--|--|--|--|--|
-| Dolma's CC | [Common Crawl](https://commoncrawl.org/) via Dolma v1.6 | Updated |
-| Refined Web | [Refined Web](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) | Yes |
-| StarCoder | [StarCoder](https://huggingface.co/blog/starcoder) | Yes |
-| C4 | [C4](https://huggingface.co/datasets/c4) via Dolma v1.6 | Updated |
-| Reddit | [PushShift API](https://github.com/pushshift/api) | Updated |
+| Dolma's CC | [Common Crawl](https://commoncrawl.org/) via Dolma v1.6 | Updated | 875.2 | 1,195.5 | 50% | Mar 2023 | Extracted using the Dolma pipeline; new quality filtering and deduplication steps. |
+| Refined Web | [Refined Web](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) | Yes | 664.0 | 456.4 | 100% | Feb 2023 | Filtered using the Dolma pipeline; new quality filtering and deduplication steps. |
+| StarCoder | [StarCoder](https://huggingface.co/blog/starcoder) | Yes | 206.6 | 263.8 | 100% | May 2023 | No further processing. |
+| C4 | [C4](https://huggingface.co/datasets/c4) via Dolma v1.6 | Updated | 249.9 | 138.4 | 50% | Apr 2019 | Filtered using the Dolma pipeline; new quality filtering and deduplication steps. |
+| Reddit | [PushShift API](https://github.com/pushshift/api) | Updated | 377.4 | 79.9 | 100% | Mar 2023 | Extracted using the Dolma pipeline; new quality filtering and deduplication steps. |
 | Semantic Scholar ([S2ORC](https://aclanthology.org/2020.acl-main.447/) & [S2AG](https://www.semanticscholar.org/product/api)) | [peS2o](https://huggingface.co/datasets/allenai/peS2o) via Dolma v1.6 | No | 38.8 | 57.2 | 100% | Mar 2023 | Same as Dolma v1.6 |
-| arXiv | [RedPajama v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | Yes |
-| StackExchange | [RedPajama v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | Yes |
-| Flan | [Flan](https://arxiv.org/abs/2301.13688) via [Tulu](https://huggingface.co/datasets/allenai/tulu-v2-sft-mixture) | Yes |
-| CC News | [Common Crawl](https://commoncrawl.org/blog/news-dataset-available) | Yes |
-| OpenWebMath | [OpenWebMath](https://huggingface.co/datasets/open-web-math/open-web-math) via [Proof Pile II](https://huggingface.co/datasets/EleutherAI/proof-pile-2) | Yes |
-| Algebraic Stack | [Proof Pile II](https://huggingface.co/datasets/EleutherAI/proof-pile-2) | Yes |
+| arXiv | [RedPajama v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | Yes | 1.5 | 28.0 | 100% | Mar 2023 | No further processing. |
+| StackExchange | [RedPajama v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | Yes | 29.3 | 19.6 | 100% | Mar 2023 | No further processing. |
+| Flan | [Flan](https://arxiv.org/abs/2301.13688) via [Tulu](https://huggingface.co/datasets/allenai/tulu-v2-sft-mixture) | Yes | 52.1 | 16.5 | 100% | Mar 2023 | |
+| CC News | [Common Crawl](https://commoncrawl.org/blog/news-dataset-available) | Yes | 22.0 | 14.3 | 100% | Mar 2023 | Extracted using the Dolma pipeline; new quality filtering and deduplication steps. |
+| OpenWebMath | [OpenWebMath](https://huggingface.co/datasets/open-web-math/open-web-math) via [Proof Pile II](https://huggingface.co/datasets/EleutherAI/proof-pile-2) | Yes | 2.9 | 12.6 | 100% | Oct 2023 | Training subset; no further processing. |
+| Algebraic Stack | [Proof Pile II](https://huggingface.co/datasets/EleutherAI/proof-pile-2) | Yes | 2.8 | 12.6 | 100% | Oct 2023 | Training subset; no further processing. |
 | Project Gutenberg | [Project Gutenberg](https://www.gutenberg.org) via Dolma v1.6 | No | 0.0556 | 5.3 | 100% | Mar 2023 | Same as Dolma v1.6 |
-| MegaWika | [MetaWika](https://huggingface.co/datasets/hltcoe/megawika) | Yes |
+| MegaWika | [MegaWika](https://huggingface.co/datasets/hltcoe/megawika) | Yes | 3.2 | 4.6 | 100% | Jul 2023 | English web pages cited from Wikipedia; curated using the full Dolma pipeline. |
 | Wikipedia & Wikibooks | [Wikimedia](https://dumps.wikimedia.org) via Dolma v1.6 | No | 6.2 | 3.7 | 200% | Mar 2023 | Same as Dolma v1.6 |
 | **Total** | | | | **2,308.5** | **1,715.1** | | |
 
@@ -140,4 +140,3 @@ If you use our dataset or tooling, please cite us at:
 journal={arXiv preprint},
 }
 ```
-
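Since this commit also sets `viewer: false` in the card metadata, the hosted dataset viewer is disabled and the data has to be pulled directly. Below is a minimal sketch using the 🤗 `datasets` library; it assumes the versions in the table above are exposed as named configurations (the config name `v1_6-sample` is inferred from the table, so check the dataset card for the exact names):

```python
# Minimal sketch: stream a handful of Dolma documents instead of downloading
# the full multi-terabyte release. Assumes the version names from the table
# ("v1_7", "v1_6-sample", ...) double as dataset configurations -- verify on
# the dataset card before relying on this.
from datasets import load_dataset

dolma = load_dataset(
    "allenai/dolma",
    name="v1_6-sample",  # the ~16.4 GB sample; friendlier for exploration
    split="train",
    streaming=True,      # iterate lazily rather than materializing locally
)

for i, doc in enumerate(dolma):
    # Dolma records are JSON objects; "source" and "text" are the fields the
    # tables above describe, accessed defensively here.
    print(doc.get("source"), (doc.get("text") or "")[:80].replace("\n", " "))
    if i == 4:
        break
```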
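The per-source figures filled in by this commit can be sanity-checked against the unchanged **Total** row. A short self-contained check, with the token counts transcribed from the new table rows (the labels are ours, for readability):

```python
# Cross-check the source-mix table: the "OLMo tokens (billions)" column
# should sum to the 2,308.5 stated in the Total row.
tokens_billions = {
    "Dolma's CC": 1195.5,
    "Refined Web": 456.4,
    "StarCoder": 263.8,
    "C4": 138.4,
    "Reddit": 79.9,
    "Semantic Scholar": 57.2,
    "arXiv": 28.0,
    "StackExchange": 19.6,
    "Flan": 16.5,
    "CC News": 14.3,
    "OpenWebMath": 12.6,
    "Algebraic Stack": 12.6,
    "Project Gutenberg": 5.3,
    "MegaWika": 4.6,
    "Wikipedia & Wikibooks": 3.7,
}

total = sum(tokens_billions.values())
print(f"{total:,.1f}B")  # -> 2,308.4B, matching the table's 2,308.5 up to per-row rounding
```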