julien-c committed
Commit 05ddb2b
1 Parent(s): f88b9e6

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/youscan/ukr-roberta-base/README.md

Files changed (1)
  1. README.md +26 -0
README.md ADDED
@@ -0,0 +1,26 @@
+ ---
+ language:
+ - uk
+ ---
+
+ # ukr-roberta-base
+
+ ## Pre-training corpora
+ Below is the list of corpora used, along with the output of the `wc` command (counting lines, words and characters). These corpora were concatenated and tokenized with the HuggingFace RoBERTa tokenizer; a tokenization sketch follows the table.
+
+ | Corpus | Lines | Words | Characters |
+ | ------------- | --------------: | -----: | -----: |
+ | [Ukrainian Wikipedia - May 2020](https://dumps.wikimedia.org/ukwiki/latest/ukwiki-latest-pages-articles.xml.bz2) | 18 001 466 | 201 207 739 | 2 647 891 947 |
+ | [Ukrainian OSCAR deduplicated dataset](https://oscar-public.huma-num.fr/shuffled/uk_dedup.txt.gz) | 56 560 011 | 2 250 210 650 | 29 705 050 592 |
+ | Sampled mentions from social networks | 11 245 710 | 128 461 796 | 1 632 567 763 |
+ | Total | 85 807 187 | 2 579 880 185 | 33 985 510 302 |
+
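+ As a rough illustration, tokenizer training in the style of the linked how-to-train tutorial uses the `tokenizers` library. This is a minimal sketch, not the exact preprocessing used for this model: the corpus file paths and the vocabulary size below are hypothetical placeholders.
+
+ ```python
+ # Minimal sketch of byte-level BPE tokenizer training, in the style of
+ # the HuggingFace how-to-train tutorial; file paths are hypothetical.
+ import os
+
+ from tokenizers import ByteLevelBPETokenizer
+
+ # Concatenated pre-training corpora (placeholder paths).
+ corpus_files = [
+     "data/ukwiki.txt",
+     "data/uk_dedup.txt",
+     "data/social_mentions.txt",
+ ]
+
+ tokenizer = ByteLevelBPETokenizer()
+ tokenizer.train(
+     files=corpus_files,
+     vocab_size=52_000,  # assumed value; the released vocabulary may differ
+     min_frequency=2,
+     special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
+ )
+
+ # Save vocab.json and merges.txt to an output directory.
+ os.makedirs("ukr-roberta-base", exist_ok=True)
+ tokenizer.save_model("ukr-roberta-base")
+ ```
+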
+ ## Pre-training details
+
+ * Ukrainian RoBERTa was trained with the code provided in the [HuggingFace tutorial](https://huggingface.co/blog/how-to-train)
+ * The currently released model follows the roberta-base-cased model architecture (12-layer, 768-hidden, 12-heads, 125M parameters)
+ * The model was trained on 4×V100 GPUs for 85 hours
+ * The training configuration can be found in the [original repository](https://github.com/youscan/language-models)
+
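+ For reference, the released checkpoint can be loaded with the standard `transformers` fill-mask pipeline; the example sentence below is only illustrative.
+
+ ```python
+ # Load the released checkpoint via the standard transformers API.
+ from transformers import pipeline
+
+ fill_mask = pipeline("fill-mask", model="youscan/ukr-roberta-base")
+
+ # Illustrative sentence: "Kyiv is the capital of <mask>."
+ for prediction in fill_mask("Київ - столиця <mask>."):
+     print(prediction["token_str"], prediction["score"])
+ ```
+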
+ ## Author
+ Vitalii Radchenko - contact me on Twitter [@vitaliradchenko](https://twitter.com/vitaliradchenko)