Convert dataset sizes from base 2 to base 10 in the dataset card
#6 opened by albertvillanova (HF staff)

README.md CHANGED
@@ -121,9 +121,9 @@ dataset_info:
 - **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
 - **Paper:** [Pointer Sentinel Mixture Models](https://arxiv.org/abs/1609.07843)
 - **Point of Contact:** [Stephen Merity](mailto:[email protected])
-- **Size of downloaded dataset files:**
-- **Size of the generated dataset:**
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 391.41 MB
+- **Size of the generated dataset:** 1.12 GB
+- **Total amount of disk used:** 1.52 GB

 ### Dataset Summary

@@ -149,9 +149,9 @@ that can take advantage of long term dependencies.

 #### wikitext-103-raw-v1

-- **Size of downloaded dataset files:**
-- **Size of the generated dataset:**
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 191.98 MB
+- **Size of the generated dataset:** 549.42 MB
+- **Total amount of disk used:** 741.41 MB

 An example of 'validation' looks as follows.
 ```
@@ -164,9 +164,9 @@ This example was too long and was cropped:

 #### wikitext-103-v1

-- **Size of downloaded dataset files:**
-- **Size of the generated dataset:**
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 190.23 MB
+- **Size of the generated dataset:** 548.05 MB
+- **Total amount of disk used:** 738.27 MB

 An example of 'train' looks as follows.
 ```
@@ -179,9 +179,9 @@ This example was too long and was cropped:

 #### wikitext-2-raw-v1

-- **Size of downloaded dataset files:** 4.
-- **Size of the generated dataset:**
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 4.72 MB
+- **Size of the generated dataset:** 13.54 MB
+- **Total amount of disk used:** 18.26 MB

 An example of 'train' looks as follows.
 ```
@@ -194,9 +194,9 @@ This example was too long and was cropped:

 #### wikitext-2-v1

-- **Size of downloaded dataset files:** 4.
-- **Size of the generated dataset:**
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 4.48 MB
+- **Size of the generated dataset:** 13.34 MB
+- **Total amount of disk used:** 17.82 MB

 An example of 'train' looks as follows.
 ```
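For context, the change is a unit-convention switch rather than a recomputation: the same byte counts are now reported in decimal (base-10) MB/GB instead of binary (base-2) units. Below is a minimal sketch of the arithmetic, not part of the PR itself; the byte count is hypothetical, chosen only so that it reproduces the new 391.41 MB figure, since the card does not list raw byte counts.

```python
# Illustrative comparison of the two size conventions in this PR.
# num_bytes is a hypothetical value, not taken from the dataset card.

def size_mb_base10(num_bytes: int) -> float:
    """Decimal megabytes: 1 MB = 1000**2 bytes (convention after this PR)."""
    return num_bytes / 1000**2

def size_mib_base2(num_bytes: int) -> float:
    """Binary mebibytes: 1 MiB = 1024**2 bytes (convention before this PR)."""
    return num_bytes / 1024**2

num_bytes = 391_410_000  # hypothetical download size in bytes
print(f"{size_mb_base10(num_bytes):.2f} MB")   # 391.41 MB  (base 10)
print(f"{size_mib_base2(num_bytes):.2f} MiB")  # 373.28 MiB (base 2)
```

The same number of bytes yields a larger figure under the base-10 convention, which is why the reported sizes shift slightly while the datasets themselves are unchanged.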