arXiv:2305.11129

mLongT5: A Multilingual and Efficient Text-To-Text Transformer for Longer Sequences

Published on May 18, 2023 · Submitted by akhaliq on May 19, 2023

Abstract

We present our work on developing a multilingual, efficient text-to-text transformer that is suitable for handling long inputs. This model, called mLongT5, builds upon the architecture of LongT5, while leveraging the multilingual datasets used for pretraining mT5 and the pretraining tasks of UL2. We evaluate this model on a variety of multilingual summarization and question-answering tasks, and the results show stronger performance for mLongT5 when compared to existing multilingual models such as mBART or M-BERT.
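
Since mLongT5 reuses the LongT5 architecture, checkpoints converted to Hugging Face Transformers can be loaded with the existing LongT5 classes. The sketch below is a minimal long-input summarization example; the checkpoint name is an assumption for illustration and is not prescribed by the paper.

```python
# Minimal sketch: loading an mLongT5-style checkpoint through the LongT5
# classes in Hugging Face Transformers. The checkpoint name is assumed for
# illustration; substitute any converted mLongT5 checkpoint.
from transformers import AutoTokenizer, LongT5ForConditionalGeneration

model_name = "agemagician/mlong-t5-tglobal-base"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = LongT5ForConditionalGeneration.from_pretrained(model_name)

# LongT5's sparse (transient-global) attention lets the encoder accept much
# longer inputs than a standard mT5 encoder; 4096 tokens is used here only
# as an illustrative limit.
document = "..."  # a long multilingual document to summarize
inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=4096)

summary_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```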

Models citing this paper 3

Datasets citing this paper 0

Spaces citing this paper 0

Collections including this paper 2