# ArabicT5: Efficient Adaptation of T5 on Arabic Language
# Model Description
This model adapts T5 to Arabic by pre-training it on Arabic Wikipedia, Marefa, and a collection of Arabic news articles, with a total corpus size of 17 GB. We restrict our corpora to news and encyclopedias to enhance the model's performance on informative tasks such as factoid question answering and on generative tasks that use Classical Arabic (الفصحى). The model uses an efficient T5 implementation that reduces fine-tuning time and memory usage [Link](https://arxiv.org/abs/2109.10686).
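Since this is a standard T5-style checkpoint, it should load with the Hugging Face `transformers` library. A minimal usage sketch follows, assuming the Hub ID is `sultan/ArabicT5-17GB-large` (inferred from this repository's path; adjust if it differs):

```python
# Minimal sketch: load the checkpoint and run text-to-text generation.
# Assumes the Hub ID "sultan/ArabicT5-17GB-large" and a T5-compatible head.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "sultan/ArabicT5-17GB-large"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# T5 frames every task as text in / text out; replace the placeholder
# with a task-specific prompt (e.g. a question-answering input).
inputs = tokenizer("مثال على نص عربي", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that, like other T5 checkpoints released without supervised fine-tuning, the raw pre-trained model is intended as a starting point for fine-tuning on downstream tasks rather than for direct generation.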
The paper will be published soon.