matthieumeeus97 committed 243613e (parent 54ea011): Update README.md
@@ -18,10 +18,6 @@
 As such, the dataset can be used to develop and evaluate document-level MIAs against LLMs trained on The Pile.
 
 We randomly sample 1,000 documents from the train set (members) and 1,000 documents from the test set (non-members), ensuring that the selected documents contain at least 5,000 words (any sequence of characters separated by whitespace).
 
-Target models include the suite of Pythia and GPTNeo models, which can be found [here](https://huggingface.co/EleutherAI).
-
-Note: our understanding is that the deduplication performed on the Pile to create the "Pythia-dedup" models was applied only to the training set, suggesting that this dataset of members/non-members is also valid for those models.
-
-
-[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
+We also provide a version of the dataset in which each document is split into 25 sequences of 200 words [here](https://huggingface.co/datasets/imperial-cpg/pile_arxiv_doc_mia_sequences).
+
+Target models include the suite of Pythia and GPTNeo models, which can be found [here](https://huggingface.co/EleutherAI). Our understanding is that the deduplication performed on the Pile to create the "Pythia-dedup" models was applied only to the training set, suggesting that this dataset of members/non-members is also valid for those models.
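The two construction rules described in the README — keeping only documents with at least 5,000 whitespace-separated words, and splitting each kept document into 25 sequences of 200 words — could be sketched as below. This is an illustrative sketch under those stated rules, not the authors' released preprocessing code; the function names and the example document are hypothetical.

```python
def word_count(text: str) -> int:
    # A "word" here is any sequence of characters separated by whitespace,
    # matching the definition used in the dataset card.
    return len(text.split())


def split_into_sequences(text: str, n_sequences: int = 25, seq_len: int = 200):
    # Split a document into n_sequences chunks of seq_len words each;
    # any words beyond n_sequences * seq_len are discarded.
    words = text.split()
    if len(words) < n_sequences * seq_len:
        raise ValueError("document shorter than the filtering threshold")
    return [
        " ".join(words[i * seq_len:(i + 1) * seq_len])
        for i in range(n_sequences)
    ]


# Hypothetical 5,000-word document that passes the filter.
doc = " ".join(f"w{i}" for i in range(5000))
print(word_count(doc))        # 5000
chunks = split_into_sequences(doc)
print(len(chunks))            # 25
print(len(chunks[0].split())) # 200
```

Note that 25 × 200 = 5,000, so the sequence-level dataset covers exactly the guaranteed minimum length of each document.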