---
license: apache-2.0
language:
- el
pipeline_tag: summarization
task_categories:
- summarization
- text-generation
- text2text-generation
tags:
- GreekNLP
- Text Summarization
- Text Generation
- Title Generation
- Greek
- Wikipedia
pretty_name: Greek Wikipedia
size_categories:
- 10K<n<100K
---

# GreekWikipedia
A Greek abstractive summarization dataset collected from the Greek edition of Wikipedia. It contains 93,433 articles, along with their titles and summaries.

This dataset has been used to train our best-performing model, GreekT5-umt5-base-greekwikipedia, as part of our research article:

Giarelis, N., Mastrokostas, C., & Karacapilidis, N. (2024). Greek Wikipedia: A Study on Abstractive Summarization.

For information about dataset creation, limitations, etc., see the original article.
## Supported Tasks and Leaderboards
This dataset supports:

* **Text summarization**: Given the text of an article, a text generation model learns to generate an abstractive summary.
* **Title generation**: Given the text of an article, a text generation model learns to generate the article title.
## Languages
All articles are written in Greek.
## Dataset Structure

### Data Instances
The dataset is structured as `.csv` files, and three dataset splits are provided (train, validation, and test).
### Data Fields
The following data fields are provided for each split (a short usage sketch follows the list):

* `title`: (str) A short title.
* `article`: (str) The full text of the article.
* `summary`: (str) The abstractive summary of the article.
* `url`: (str) The URL which links to the original unprocessed article.
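The sketch below shows one way these fields might be turned into (input, target) pairs for the two supported tasks. The `input_text` and `target_text` column names are illustrative choices, not part of the dataset.

```python
from datasets import load_dataset

train_split = load_dataset('IMISLab/GreekWikipedia', split='train')

# Summarization: the article text is the model input, the summary is the target.
summarization_data = train_split.map(
    lambda row: {'input_text': row['article'], 'target_text': row['summary']}
)

# Title generation: same input, but the title is the target.
title_generation_data = train_split.map(
    lambda row: {'input_text': row['article'], 'target_text': row['title']}
)

print(summarization_data[0]['target_text'])
print(title_generation_data[0]['target_text'])
```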
### Data Splits
| Split | No. of Documents |
| --- | --- |
| Train | 83,433 |
| Validation | 5,000 |
| Test | 5,000 |
## Example code
```python
from datasets import load_dataset

# Load the training, validation and test dataset splits.
train_split = load_dataset('IMISLab/GreekWikipedia', split='train')
validation_split = load_dataset('IMISLab/GreekWikipedia', split='validation')
test_split = load_dataset('IMISLab/GreekWikipedia', split='test')

# Inspect the first test example.
print(test_split[0])
```
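The model mentioned above can then be used to summarize a test article. The following is a minimal sketch, assuming the model is hosted at `IMISLab/GreekT5-umt5-base-greekwikipedia`; the exact repository id and generation settings are assumptions, not taken from this card.

```python
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repository id for the GreekT5-umt5-base-greekwikipedia model mentioned above.
model_name = 'IMISLab/GreekT5-umt5-base-greekwikipedia'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

test_split = load_dataset('IMISLab/GreekWikipedia', split='test')

# Truncate long articles to a fixed input length and generate an abstractive summary.
inputs = tokenizer(test_split[0]['article'], truncation=True, max_length=1024, return_tensors='pt')
outputs = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```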
## Contact

If you have any questions or feedback about the dataset, please e-mail one of the following authors:
[email protected]
[email protected]
[email protected]
## Citation
TBA