Update README.md
README.md (changed):
````diff
@@ -57,7 +57,7 @@ thumbnail: https://gsarti.com/publication/it5/featured.png
 ---
 # mT5 Small for News Headline Generation 📣 🇮🇹
 
-This repository contains the checkpoint for the [mT5 Small](https://huggingface.co/google/mt5-small) model fine-tuned on news headline generation on the Italian HeadGen-IT dataset as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
+This repository contains the checkpoint for the [mT5 Small](https://huggingface.co/google/mt5-small) model fine-tuned on news headline generation on the Italian HeadGen-IT dataset as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
 
 A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
 
@@ -86,10 +86,11 @@ If you use this model in your research, please cite our work as:
 
 ```bibtex
 @article{sarti-nissim-2022-it5,
-  title={IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
+  title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
   author={Sarti, Gabriele and Nissim, Malvina},
-  journal={ArXiv preprint
-  url={
-  year={2022}
+  journal={ArXiv preprint 2203.03759},
+  url={https://arxiv.org/abs/2203.03759},
+  year={2022},
+  month={mar}
 }
 ```
````
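For context, the checkpoint this README describes can be loaded for inference with the 🤗 Transformers `pipeline` API. A minimal sketch, assuming the checkpoint is published on the Hub under the id `gsarti/mt5-small-headline-generation` (an assumption — substitute this repository's actual id) and that `transformers` with a PyTorch backend is installed:

```python
from transformers import pipeline

# mT5 is an encoder-decoder model, so the generic text2text-generation
# pipeline applies; the model id below is an assumption, not confirmed
# by the README.
hg = pipeline(
    "text2text-generation",
    model="gsarti/mt5-small-headline-generation",
)

article = (
    "Le dichiarazioni del ministro hanno aperto un acceso dibattito "
    "in parlamento sulla riforma della scuola."
)

# Beam search with a short max_length, since headlines are brief.
headline = hg(article, max_length=32, num_beams=4)[0]["generated_text"]
print(headline)
```

The first call downloads the checkpoint from the Hub; subsequent calls reuse the local cache.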