|
--- |
|
task_categories: |
|
- text-generation |
|
language: |
|
- zh |
|
tags: |
|
- llm
|
- causal-lm
|
- language-modeling |
|
pretty_name: SkyPile-150B |
|
size_categories: |
|
- 100B<n<1T |
|
--- |
|
|
|
# SkyPile-150B |
|
|
|
## Dataset Summary |
|
SkyPile-150B is a comprehensive, large-scale Chinese dataset specifically designed for the pre-training of large language models. It is derived from a broad array of publicly accessible Chinese Internet web pages. Rigorous filtering, extensive deduplication, and thorough removal of sensitive data have been employed to ensure its quality. Furthermore, we have utilized advanced tools such as fastText and BERT to filter out low-quality data.
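The actual filtering pipeline is not detailed here; as a toy illustration of the kind of heuristic that such quality filters apply before heavier model-based scoring, the sketch below flags pages that are very short or contain too little Chinese text (the thresholds are invented for illustration, not the project's real values):

```python
import re

def chinese_ratio(text: str) -> float:
    """Fraction of characters in the CJK Unified Ideographs block."""
    if not text:
        return 0.0
    han = len(re.findall(r"[\u4e00-\u9fff]", text))
    return han / len(text)

def looks_low_quality(text: str, min_ratio: float = 0.3, min_len: int = 50) -> bool:
    """Flag pages that are too short or contain too little Chinese text."""
    return len(text) < min_len or chinese_ratio(text) < min_ratio
```

In a real pipeline, pages passing such cheap heuristics would then be scored by classifiers like fastText or BERT, as mentioned above.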
|
|
|
The publicly accessible portion of the SkyPile-150B dataset encompasses approximately 233 million unique web pages, each containing an average of over 1,000 Chinese characters. In total, the dataset includes approximately 150 billion tokens and 620 gigabytes of plain text data. |
|
|
|
|
|
## Language |
|
The SkyPile-150B dataset is exclusively composed of Chinese data. |
|
|
|
|
|
## Data Field Explanation |
|
- text: the processed and cleaned text extracted from each page. |
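The exact on-disk layout of the public files is not specified here; assuming a JSON Lines distribution (one JSON object per line), a record with the single `text` field can be read like this (the sample line below is invented for illustration):

```python
import json

# A hypothetical record in the shape described above: one "text" field per page.
line = '{"text": "这是一个示例网页的正文。"}'
record = json.loads(line)
print(record["text"])
```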
|
|
|
## Dataset Safety |
|
We utilized more than two million rules and a BERT-base model to identify sensitive data present in the dataset, and subsequently removed any harmful entries detected.
|
|
|
## Sensitive Information and Bias |
|
Despite our best efforts, SkyPile-150B, given its construction from publicly available web pages, might contain sensitive information such as email addresses, phone numbers, or IP addresses. We have endeavored to minimize this through deduplication and low-quality filtering, but users of SkyPile-150B should remain vigilant. |
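For users who want an extra safety net, a minimal sketch of scanning text for two of the PII categories mentioned above is shown below. The patterns are illustrative only and are not the rules used to build the dataset; production PII detection requires far more robust patterns and validation:

```python
import re

# Illustrative patterns only; real PII detection needs far more care.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def find_pii(text: str) -> dict:
    """Return the matches found in the text, keyed by category name."""
    return {name: pat.findall(text) for name, pat in PATTERNS.items()}
```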
|
|
|
The Internet is rife with potentially toxic or biased data. We have attempted to mitigate this with specific URL filtering methods, but we encourage users to remain conscious of this potential issue. |
|
|
|
## Social Impact of the Dataset |
|
The open-source release of the SkyPile-150B dataset represents our commitment to enhancing access to high-quality web data, which has traditionally been a closely guarded resource among model developers. We believe that this release will foster greater accessibility and the proliferation of high-performance large language models, thereby contributing significantly to the advancement of the field. |
|
|
|
## License Agreement |
|
Community usage of the SkyPile dataset requires the Skywork Community License. The SkyPile dataset supports commercial use. If you plan to use the Skywork model or its derivatives for commercial purposes, you must abide by the terms and conditions of the Skywork Community License as well as Apache 2.0.
|
|
|
## Contact Us and Citation |
|
If you find our work helpful, please feel free to cite our paper:
|
```
@misc{wei2023skywork,
      title={Skywork: A More Open Bilingual Foundation Model},
      author={Tianwen Wei and Liang Zhao and Lichang Zhang and Bo Zhu and Lijie Wang and Haihua Yang and Biye Li and Cheng Cheng and Weiwei Lü and Rui Hu and Chenxia Li and Liu Yang and Xilin Luo and Xuejie Wu and Lunan Liu and Wenjun Cheng and Peng Cheng and Jianhao Zhang and Xiaoyu Zhang and Lei Lin and Xiaokun Wang and Yutuan Ma and Chuanhai Dong and Yanqi Sun and Yifu Chen and Yongyi Peng and Xiaojuan Liang and Shuicheng Yan and Han Fang and Yahui Zhou},
      year={2023},
      eprint={2310.19341},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
|
|