goldfish-models committed
Commit 5429f48
Parent: 40427ce

Upload README.md with huggingface_hub

Files changed (1): README.md (+5, -3)

README.md CHANGED
@@ -2,8 +2,8 @@
  ---
  license: apache-2.0
  language:
- - ory
  - ori
+ - ory
  datasets:
  - cis-lmu/Glot500
  - allenai/nllb
@@ -13,7 +13,7 @@ library_name: transformers
  pipeline_tag: text-generation
  tags:
  - goldfish
-
+ - arxiv:2408.10441
  ---

  # ori_orya_100mb
@@ -24,7 +24,7 @@ The Goldfish models are trained primarily for comparability across languages and

  Note: ori_orya is a [macrolanguage](https://iso639-3.sil.org/code_tables/639/data) code. None of its contained individual languages are included in Goldfish (for script orya).

- All training and hyperparameter details are in our paper, [Goldfish: Monolingual Language Models for 350 Languages (Chang et al., 2024)](https://github.com/tylerachang/goldfish/blob/main/goldfish_paper_20240815.pdf).
+ All training and hyperparameter details are in our paper, [Goldfish: Monolingual Language Models for 350 Languages (Chang et al., 2024)](https://www.arxiv.org/abs/2408.10441).

  Training code and sample usage: https://github.com/tylerachang/goldfish

@@ -34,6 +34,7 @@ Sample usage also in this Google Colab: [link](https://colab.research.google.com

  To access all Goldfish model details programmatically, see https://github.com/tylerachang/goldfish/blob/main/model_details.json.
  All models are trained with a [CLS] (same as [BOS]) token prepended, and a [SEP] (same as [EOS]) token separating sequences.
+ For best results, make sure that [CLS] is prepended to your input sequence (see sample usage linked above)!
  Details for this model specifically:

  * Architecture: gpt2
@@ -64,5 +65,6 @@ If you use this model, please cite:
  author={Chang, Tyler A. and Arnett, Catherine and Tu, Zhuowen and Bergen, Benjamin K.},
  journal={Preprint},
  year={2024},
+ url={https://www.arxiv.org/abs/2408.10441},
  }
  ```
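The [CLS] note added in this commit is the main behavioral detail for users. As a rough illustration, here is a minimal generation sketch in Python, assuming the standard `transformers` auto classes and that this repo is served as `goldfish-models/ori_orya_100mb` (inferred from the page header and model name); the official sample usage lives in the Goldfish repo and Colab linked in the README:

```python
# Minimal sketch, not the official sample usage. Assumptions: the repo id
# "goldfish-models/ori_orya_100mb", and that the tokenizer does not prepend
# [CLS] on its own, so we add it manually as the README advises.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "goldfish-models/ori_orya_100mb"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

text = "..."  # your Odia (orya-script) prompt here
# Prepend [CLS] (same as [BOS]) per the README note above.
inputs = tokenizer("[CLS]" + text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```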
 
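For the programmatic lookup mentioned in the README, a small fetch sketch; the raw-file URL below is the standard GitHub raw form of the linked model_details.json path, and the top-level key "ori_orya_100mb" is an assumption about the file's layout:

```python
# Sketch only: fetch the repo-wide details file and pull this model's entry.
import json
import urllib.request

# Raw form of https://github.com/tylerachang/goldfish/blob/main/model_details.json
URL = "https://raw.githubusercontent.com/tylerachang/goldfish/main/model_details.json"

with urllib.request.urlopen(URL) as resp:
    details = json.load(resp)

# Assumed layout: one top-level entry per model name.
print(details.get("ori_orya_100mb"))
```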