How much disk space would the whole HF dataset take?

#27 opened by protossw512

I am planning to download the whole dataset with something like:

from datasets import load_dataset
ds = load_dataset("togethercomputer/RedPajama-Data-V2", name="default", num_proc=16)

It has been running for quite a while and has already consumed 32TB of disk space. What is the total disk space needed to download the whole dataset? Given that the 1T-token dataset takes around 5TB (https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T/discussions/15), is the estimate roughly 150TB for 30T tokens?
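For reference, that estimate is just a linear extrapolation from the RedPajama-1T figure in the linked discussion; a minimal sketch of the arithmetic:

# Linear extrapolation: ~5 TB of disk per 1T tokens (from the RedPajama-1T discussion linked above)
tb_per_trillion_tokens = 5
tokens_in_trillions = 30  # head/middle buckets of RedPajama-V2
estimated_tb = tb_per_trillion_tokens * tokens_in_trillions
print(f"Estimated download size: ~{estimated_tb} TB")  # ~150 TB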

Together org

Hi @protossw512, yes, 150TB is roughly the disk space you will need for all the head/middle buckets and quality signals. For comparison, the full dataset, which includes the head, middle, and tail partitions, quality signals, and minhashes, takes 270TB of disk space.
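For anyone planning the same download, a quick pre-flight check against these numbers can save an aborted multi-week transfer. The sketch below uses only the Python standard library; the 150TB/270TB figures come from the reply above, and the target path is a placeholder you would replace with your own download directory.

import shutil

# ~150 TB for head/middle buckets + quality signals; ~270 TB for the full dataset
# (head/middle/tail partitions, quality signals, minhashes), per the reply above.
REQUIRED_TB = 150

# Check free space on the volume where the dataset cache will live (placeholder path).
total, used, free = shutil.disk_usage("/path/to/hf_cache")
free_tb = free / 1024**4

if free_tb < REQUIRED_TB:
    print(f"Only {free_tb:.1f} TB free; roughly {REQUIRED_TB} TB is needed for the head/middle buckets.")
else:
    print(f"{free_tb:.1f} TB free, which should be enough for the head/middle buckets.")

The dataset card also describes smaller configurations (for example a sample config) for exploring the data before committing to the full download; check the card for the exact load_dataset arguments, since they are not spelled out in this thread.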
