Deduplicated English Corpus

#3 by conceptofmind - opened

Hi all,

Thank you for your amazing work.

I had a few questions:

  1. I was wondering what the best way to access the deduplicated English corpus would be. I only see an option to select "en" (see the loading sketch after the snapshot list below).
  2. Is the available data just a single Common Crawl snapshot from 2022, or does it include all of the cumulative data from the entire year?

For example, does it contain all of the data from each of these snapshots from that year:

s3://commoncrawl/crawl-data/CC-MAIN-2022-05 – January 2022
s3://commoncrawl/crawl-data/CC-MAIN-2022-21 – May 2022
s3://commoncrawl/crawl-data/CC-MAIN-2022-27 – June/July 2022
s3://commoncrawl/crawl-data/CC-MAIN-2022-33 – August 2022
s3://commoncrawl/crawl-data/CC-MAIN-2022-40 – September/October 2022
s3://commoncrawl/crawl-data/CC-MAIN-2022-49 – November/December 2022
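
For context, this is the only loading path I currently see for English; a minimal sketch, assuming access to the gated repo (streaming is used here only to avoid the multi-terabyte download):

```python
from datasets import load_dataset

# Plain "en" config; I don't see a separate deduplicated English config.
# The repo is gated, so you may need to authenticate after accepting its terms.
en = load_dataset("oscar-corpus/OSCAR-2301",
                  language="en",
                  split="train",
                  streaming=True)

print(next(iter(en)))  # peek at one document and its metadata
```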

Thank you,

Enrico

OSCAR org

Hello Enrico,

I already replied to your question on the Discord but I'm pasting the answer here for completeness:

We only have versions of OSCAR for 4 Common Crawl snapshots so far: November 2018 (OSCAR 2019), February/March 2021 (OSCAR 21.09), November/December 2021 (OSCAR 22.01), and November/December 2022 (OSCAR 23.01).
Only OSCAR 2019 and OSCAR 21.09 are deduplicated by line.
OSCAR 23.01 introduces LSH fingerprints for documents, but you have to do the deduplication yourself (a rough sketch follows).
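
As a rough illustration of "doing the deduplication yourself", here is a minimal sketch, assuming each document's metadata exposes a TLSH/LSH fingerprint (the field name and nesting below are assumptions; check the actual 23.01 schema). It only drops exact fingerprint collisions and keeps every fingerprint in memory, so treat it as an illustration rather than a pipeline; proper near-duplicate removal would compare TLSH distances instead:

```python
from datasets import load_dataset

# Stream the corpus so nothing has to be fully downloaded up front.
stream = load_dataset("oscar-corpus/OSCAR-2301",
                      language="en",
                      split="train",
                      streaming=True)

seen = set()

def keep_first_per_fingerprint(doc):
    # Assumed field path: doc["meta"]["tlsh"]; adjust to the real schema.
    fp = (doc.get("meta") or {}).get("tlsh")
    if fp is None:       # no fingerprint available: keep the document
        return True
    if fp in seen:       # exact fingerprint collision: treat as a duplicate
        return False
    seen.add(fp)
    return True

deduplicated = (doc for doc in stream if keep_first_per_fingerprint(doc))
```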

Also note that Common Crawl uses a polite crawler, so the crawls are effectively sampled, which means there is very little overlap between two contiguous crawls. You normally only start seeing significant overlap once you use more than 10 crawls.

Hope this helps!

All the best,
Pedro

Hi all,

Thank you for your amazing work.

I would like to know: when I use

```python
from datasets import load_dataset

dataset = load_dataset("oscar-corpus/OSCAR-2301",
                       cache_dir='/dataset/nlp/oscar/cache',
                       language="zh")
```

I only get a 385GB download, but the dataset card says the Chinese (zh) subset has 1.4TB of data.
So has the data been cleaned or deduplicated to some extent?
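
For reference, one way to check the reported sizes without re-downloading everything is to look at the builder metadata; a sketch, assuming the info fields are populated for this config (they may be None if the repo ships no precomputed dataset infos), where download_size would cover the compressed files and dataset_size the generated dataset:

```python
from datasets import load_dataset_builder

# Inspect size metadata without downloading the data.
# The repo is gated, so authentication may be required after accepting its terms.
builder = load_dataset_builder("oscar-corpus/OSCAR-2301", language="zh")

print("download_size:", builder.info.download_size)  # size of the files to download
print("dataset_size:", builder.info.dataset_size)    # size of the generated dataset
```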

uj changed discussion status to closed
