julien-c committed
Commit: 4c23cbd
Parent: 4d00433

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/akhooli/gpt2-small-arabic/README.md

Files changed (1): README.md (+44, -0)
---
language: "ar"
datasets:
- Arabic Wikipedia
metrics:
- none
---

# GPT2-Small-Arabic

## Model description

A GPT-2 model for Arabic, trained on the Arabic Wikipedia dataset and based on gpt2-small (using Fastai2).

## Intended uses & limitations

#### How to use

An example is provided in this [Colab notebook](https://colab.research.google.com/drive/1mRl7c-5v-Klx27EEAEOAbrfkustL4g7a?usp=sharing).
Both text and poetry (fine-tuned model) generation are included.
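
Since the card itself only links to a notebook, here is a minimal usage sketch with the standard `transformers` pipeline API; the repo id `akhooli/gpt2-small-arabic` is inferred from the migration path noted in this commit, and the prompt and generation settings are illustrative placeholders.

```python
# Minimal usage sketch (not from the original card): load the migrated model
# with the standard transformers text-generation pipeline.
from transformers import pipeline

# Repo id inferred from the migration path noted in this commit.
generator = pipeline("text-generation", model="akhooli/gpt2-small-arabic")

# Illustrative Arabic prompt ("once upon a time"); settings are arbitrary.
outputs = generator("كان يا ما كان", max_length=40, num_return_sequences=1)
print(outputs[0]["generated_text"])
```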

#### Limitations and bias

GPT2-small-arabic (trained on Arabic Wikipedia) has several limitations in terms of coverage (Arabic Wikipedia quality, no diacritics) and training performance.
Use it for demonstrations or proofs of concept, but not in production.

## Training data

This pretrained model used the Arabic Wikipedia dump (around 900 MB).

## Training procedure

Training was done using the [Fastai2](https://github.com/fastai/fastai2/) library on Kaggle, using a free GPU.
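
The card gives only this one-line description of training, so the following is a hedged sketch of the general fastai-v2 + Hugging Face pattern (after fastai's transformers tutorial), not the author's actual script; the base checkpoint, placeholder corpus, batch size, and learning rate are all assumptions.

```python
# Hedged sketch of fine-tuning a GPT-2 language model with fastai v2
# (pattern from fastai's transformers tutorial); all data and
# hyperparameters below are illustrative placeholders.
from fastai.text.all import *
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")  # assumed base checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Wrap the Hugging Face tokenizer as a fastai Transform.
class TransformersTokenizer(Transform):
    def __init__(self, tokenizer): self.tokenizer = tokenizer
    def encodes(self, x): return tensor(self.tokenizer.encode(x))
    def decodes(self, x): return TitledStr(self.tokenizer.decode(x.cpu().numpy()))

# Placeholder corpus standing in for the Arabic Wikipedia dump.
texts = L(["نص عربي تجريبي للتوضيح فقط."] * 64)
cut = int(len(texts) * 0.8)
splits = [list(range(cut)), list(range(cut, len(texts)))]
tls = TfmdLists(texts, TransformersTokenizer(tokenizer),
                splits=splits, dl_type=LMDataLoader)
dls = tls.dataloaders(bs=4, seq_len=128)

# The model returns (logits, ...); keep only the logits for fastai's loss.
class DropOutput(Callback):
    def after_pred(self): self.learn.pred = self.pred[0]

learn = Learner(dls, model, loss_func=CrossEntropyLossFlat(),
                cbs=[DropOutput], metrics=[accuracy, Perplexity()])
learn.fit_one_cycle(1, 1e-4)  # one epoch at an arbitrary learning rate
```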

## Eval results

Final perplexity reached was 72.19 (loss: 4.28, accuracy: 0.307).
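
As a quick sanity check (not part of the original card), the reported perplexity is just the exponential of the reported cross-entropy loss:

```python
import math

# Perplexity = exp(cross-entropy loss); with the rounded loss above,
# exp(4.28) ≈ 72.24, consistent with the reported perplexity of 72.19.
print(math.exp(4.28))
```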

### BibTeX entry and citation info

```bibtex
@misc{khooli2020gpt2smallarabic,
  author = {Abed Khooli},
  title  = {GPT2-Small-Arabic},
  year   = {2020}
}
```