elmadany committed on
Commit 9c52252
1 Parent(s): 9e42182

Update README.md

Files changed (1)
  1. README.md +14 -9
README.md CHANGED
@@ -11,7 +11,7 @@ tags:
 
 <img src="https://huggingface.co/UBC-NLP/AraT5-base/resolve/main/AraT5_CR_new.png" alt="AraT5" width="45%" height="35%" align="right"/>
 
-This is the repository accompanying our paper [AraT5: Text-to-Text Transformers for Arabic Language Understanding and Generation](https://arxiv.org/abs/2109.12068). In this is the repository we Introduce **AraT5<sub>MSA</sub>**, **AraT5<sub>Tweet</sub>**, and **AraT5**: three powerful Arabic-specific text-to-text Transformer based models;
+This is the repository accompanying our paper [AraT5: Text-to-Text Transformers for Arabic Language Understanding and Generation](https://aclanthology.org/2022.acl-long.47/). In this is the repository we Introduce **AraT5<sub>MSA</sub>**, **AraT5<sub>Tweet</sub>**, and **AraT5**: three powerful Arabic-specific text-to-text Transformer based models;
 
 
 ---
@@ -57,14 +57,19 @@ AraT5 Pytorch and TensorFlow checkpoints are available on the Huggingface websit
 
 If you use our models (Arat5-base, Arat5-msa-base, Arat5-tweet-base, Arat5-msa-small, or Arat5-tweet-small ) for your scientific publication, or if you find the resources in this repository useful, please cite our paper as follows (to be updated):
 ```bibtex
-@inproceedings{nagoudi2022_arat5,
-title={AraT5: Text-to-Text Transformers for Arabic Language Generation},
-author={Nagoudi, El Moatez Billah and Elmadany, AbdelRahim and Abdul-Mageed, Muhammad},
-journal={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistic},
-month = {May},
-address = {Online},
-year={2022},
-publisher = {Association for Computational Linguistics}
+@inproceedings{nagoudi-etal-2022-arat5,
+    title = "{A}ra{T}5: Text-to-Text Transformers for {A}rabic Language Generation",
+    author = "Nagoudi, El Moatez Billah  and
+      Elmadany, AbdelRahim  and
+      Abdul-Mageed, Muhammad",
+    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
+    month = may,
+    year = "2022",
+    address = "Dublin, Ireland",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2022.acl-long.47",
+    pages = "628--647",
+    abstract = "Transfer learning with a unified Transformer framework (T5) that converts all language problems into a text-to-text format was recently proposed as a simple and effective transfer learning approach. Although a multilingual version of the T5 model (mT5) was also introduced, it is not clear how well it can fare on non-English tasks involving diverse data. To investigate this question, we apply mT5 on a language with a wide variety of dialects{--}Arabic. For evaluation, we introduce a novel benchmark for ARabic language GENeration (ARGEN), covering seven important tasks. For model comparison, we pre-train three powerful Arabic T5-style models and evaluate them on ARGEN. Although pre-trained with {\textasciitilde}49 less data, our new models perform significantly better than mT5 on all ARGEN tasks (in 52 out of 59 test sets) and set several new SOTAs. Our models also establish new SOTA on the recently-proposed, large Arabic language understanding evaluation benchmark ARLUE (Abdul-Mageed et al., 2021). Our new models are publicly available. We also link to ARGEN datasets through our repository: https://github.com/UBC-NLP/araT5.",
 }
 ```
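For reference, a minimal sketch of loading one of the checkpoints referenced in the hunk context above ("AraT5 Pytorch and TensorFlow checkpoints are available on the Huggingface website"). This assumes the standard `transformers` seq2seq API; the model ID matches the UBC-NLP repository named in the README, and the input string is purely illustrative. It is not code from this commit:

```python
# Minimal sketch, assuming standard Hugging Face transformers usage
# (not part of the committed README).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "UBC-NLP/AraT5-base"  # or AraT5-msa-base, AraT5-tweet-base, ...
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Encode an (illustrative) Arabic input and run text-to-text generation.
inputs = tokenizer("مرحبا بالعالم", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```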