Dataset size of 3.8TB instead of 8.8TB?

#9
by richardaecn - opened

I downloaded all parquet files from fineweb-edu/data, but the whole dataset is only 3.8TB instead of 8.8TB as listed in the main page.

I also counted the number of folders and files and found 1630 parquet files in 95 folders. If we do a rough calculation, each parquet file is about 2.3GB, so we have 2.3 x 1630 / 1024 = 3.7TB, which matches my downloaded size.
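The back-of-the-envelope arithmetic above checks out (the 2.3 GB figure is an approximate per-file average, not a measured value):

```python
# Rough size estimate: 1630 parquet files at ~2.3 GB each, converted to TB.
avg_file_gb = 2.3   # approximate average parquet file size
num_files = 1630    # parquet files counted across 95 folders
total_tb = avg_file_gb * num_files / 1024
print(f"{total_tb:.1f} TB")  # ~3.7 TB, matching the downloaded size
```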

Am I missing any files?
[Screenshot attached: Screenshot_2024-06-25_16-11-02.png]

The two numbers measure different things: one counts the whole git repo, the other counts just the data directory within the repo.

Here is what I did:

$ git lfs install
$ git clone https://huggingface.co/datasets/HuggingFaceFW/fineweb-edu
$ cd fineweb-edu
$ du -sh .
9.3T    .
$ du -sh .git *
4.7T    .git
3.5T    data
24K     README.md
1.2T    sample

Since the .git directory holds the full history (including the LFS object store), it is best to clone the repo without history:

git clone --depth 1 https://huggingface.co/datasets/HuggingFaceFW/fineweb-edu

Thanks! I only downloaded the data folder, so that's why.

loubnabnl changed discussion status to closed
