Update README.md
README.md CHANGED
@@ -56,7 +56,7 @@ thumbnail: https://gsarti.com/publication/it5/featured.png
 ---
 # IT5 Small for Wikipedia Summarization ✂️📑 🇮🇹
 
-This repository contains the checkpoint for the [IT5 Small](https://huggingface.co/gsarti/it5-small) model fine-tuned on Wikipedia summarization on the [WITS](https://www.semanticscholar.org/paper/WITS%3A-Wikipedia-for-Italian-Text-Summarization-Casola-Lavelli/ad6c83122e721c7c0db4a40727dac3b4762cd2b1) dataset as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
+This repository contains the checkpoint for the [IT5 Small](https://huggingface.co/gsarti/it5-small) model fine-tuned on Wikipedia summarization on the [WITS](https://www.semanticscholar.org/paper/WITS%3A-Wikipedia-for-Italian-Text-Summarization-Casola-Lavelli/ad6c83122e721c7c0db4a40727dac3b4762cd2b1) dataset as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
 
 A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
 
@@ -85,10 +85,11 @@ If you use this model in your research, please cite our work as:
 
 ```bibtex
 @article{sarti-nissim-2022-it5,
-    title={IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
+    title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
     author={Sarti, Gabriele and Nissim, Malvina},
-    journal={ArXiv preprint
-    url={
-    year={2022}
+    journal={ArXiv preprint 2203.03759},
+    url={https://arxiv.org/abs/2203.03759},
+    year={2022},
+    month={mar}
 }
 ```
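
For context, a checkpoint like the one described in this README can be loaded through the standard Hugging Face `transformers` summarization pipeline. The sketch below is illustrative only: the model ID `gsarti/it5-small-wits-summarization` is a hypothetical placeholder (the actual Hub ID of this fine-tuned checkpoint is not stated in the diff above), and generation parameters are arbitrary examples.

```python
# Minimal usage sketch for an IT5-style Italian summarization checkpoint.
# NOTE: the model ID below is a hypothetical placeholder; substitute the
# actual Hub ID under which this fine-tuned checkpoint is published.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="gsarti/it5-small-wits-summarization",  # hypothetical model ID
)

article = (
    "La Wikipedia in italiano è l'edizione in lingua italiana "
    "dell'enciclopedia libera Wikipedia, avviata nel maggio 2001."
)

# Generate a short summary; max_length/min_length are example values.
result = summarizer(article, max_length=64, min_length=8, do_sample=False)
print(result[0]["summary_text"])
```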