---
license: apache-2.0
language:
- en
- fr
pipeline_tag: text-generation
---
8
+
9
+ ![image/png](https://huggingface.co/datasets/malteos/images/resolve/main/occiglot.medium.png)
10
+
11
+ # Occiglot-7B-FR-EN
12
+
13
+ > A [polyglot](https://en.wikipedia.org/wiki/Multilingualism#In_individuals) language model for the [Occident](https://en.wikipedia.org/wiki/Occident).
14
+ >
15
+

**Occiglot-7B-FR-EN** is a generative language model with 7B parameters for French and English, trained by the [Occiglot Research Collective](https://occiglot.eu).
It is based on [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) and trained on 113B tokens of additional multilingual and code data with a block size of 8,192 tokens per sample.
Note that the model is a general-purpose base model and was not instruction-fine-tuned nor optimized for chat or other applications. We make an instruction-tuned variant available as [occiglot-7b-fr-en-instruct](https://huggingface.co/occiglot/occiglot-7b-fr-en-instruct).

This is the first release of an ongoing open research project for multilingual language models.
If you want to train a model for your own language or are working on evaluations, please contact us or join our [Discord server](https://discord.gg/wUpvYs4XvM). **We are open to collaborations!**

### Model details

- **Continued-pretraining from:** [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- **Model type:** Causal decoder-only transformer language model
- **Languages:** English, French, and code.
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.html)
- **Compute resources:** [HessianAI's 42](https://hessian.ai/)
- **Contributors:** Manuel Brack, Patrick Schramowski, Pedro Ortiz, Malte Ostendorff, Fabio Barth, Georg Rehm, Kristian Kersting
- **Research labs:** [Occiglot](https://occiglot.eu) with support from [SAINT](https://www.dfki.de/en/web/research/research-departments/foundations-of-systems-ai) and [SLT](https://www.dfki.de/en/web/research/research-departments/speech-and-language-technology)
- **Contact:** [Discord](https://discord.gg/wUpvYs4XvM), [[email protected]](mailto:[email protected])

### How to use

You can use this model directly with a `transformers` pipeline for text generation. Since generation relies on some randomness, we set a seed for reproducibility:

```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='occiglot/occiglot-7b-fr-en')
>>> set_seed(42)
>>> generator("Bonjour, je suis un modèle de langage,", max_length=40, num_return_sequences=1)
[{'generated_text': 'Bonjour, je suis un modèle de langage, ...'}]
```
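
Alternatively, you can load the model and tokenizer directly. The following is a minimal sketch, not an official snippet from this card; the `bfloat16` dtype and `device_map='auto'` placement are assumptions that may need adjusting for your hardware:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and the 7B model (bf16 assumed to fit on a single modern GPU)
tokenizer = AutoTokenizer.from_pretrained('occiglot/occiglot-7b-fr-en')
model = AutoModelForCausalLM.from_pretrained(
    'occiglot/occiglot-7b-fr-en',
    torch_dtype=torch.bfloat16,
    device_map='auto',
)

prompt = "La capitale de la France est"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)

# Greedy decoding for a short, deterministic completion
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since this is a base model without instruction tuning, plain-text continuations like the one above work better than chat-style prompts.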

## Dataset

The training data is the respective subset of the data used for [occiglot-7b-eu5](https://huggingface.co/occiglot/occiglot-7b-eu5), i.e., French plus English and code.

The data distribution by language (estimated) is as follows:
- English: ~34%
- Code: ~13%
- French: ~52%
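
Given the 113B-token training budget, these shares correspond to roughly 59B French, 38B English, and 15B code tokens.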

The training data was prepared using [lm-datasets](https://github.com/malteos/lm-datasets).
The exact data configuration is available [here](https://huggingface.co/occiglot/occiglot-7b-eu5/blob/main/lm-datasets-config.yml).

## Training settings

- Continual pre-training on 128 x A100-80GB GPUs on [HessianAI's 42](https://hessian.ai/)
- Framework: [Determined](https://www.determined.ai/)
- Precision: bf16
- Optimizer: AdamW (lr: 0.00001, warmup_steps: 420)
- Global batch size: 512 (with a block size of 8,192) split over 128 GPUs
- Learning-rate schedule: cosine annealing with warmup (see the sketch below)
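
As an illustration of the optimizer and schedule above, here is a minimal sketch using `transformers`; the total step count is our own estimate from the token budget (113B tokens / (512 x 8,192 tokens per step) ≈ 27k steps), not a figure stated in this card:

```python
import torch
from transformers import get_cosine_schedule_with_warmup

# Placeholder parameters stand in for the 7B model weights
params = [torch.nn.Parameter(torch.zeros(1))]

# AdamW with the learning rate listed above
optimizer = torch.optim.AdamW(params, lr=1e-5)

# Cosine annealing with 420 warmup steps; ~27k total steps is an estimate:
# 113e9 tokens / (512 samples * 8192 tokens) ≈ 26,900 optimizer steps
scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=420,
    num_training_steps=27_000,
)
```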

## Tokenizer

The tokenizer is unchanged from [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1).
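
You can verify this yourself; a quick sketch, assuming you have access to both checkpoints:

```python
from transformers import AutoTokenizer

# Both tokenizers should produce identical vocabularies
tok_occiglot = AutoTokenizer.from_pretrained('occiglot/occiglot-7b-fr-en')
tok_mistral = AutoTokenizer.from_pretrained('mistralai/Mistral-7B-v0.1')
assert tok_occiglot.get_vocab() == tok_mistral.get_vocab()
```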

## Evaluation

Preliminary evaluation results can be found below.
Please note that the non-English results are based on partially machine-translated datasets and English prompts ([Belebele](https://huggingface.co/datasets/facebook/belebele) and [Okapi framework](https://github.com/nlp-uoregon/Okapi)) and should therefore be interpreted with caution; for example, they may be biased towards English model performance.
Currently, we are working on more suitable benchmarks for Spanish, French, German, and Italian.

<details>
<summary>Evaluation results</summary>

</details>

## Acknowledgements

The model training was supported by a compute grant at the [42 supercomputer](https://hessian.ai/), a central component in the development of [hessian AI](https://hessian.ai/), the [AI Innovation Lab](https://hessian.ai/infrastructure/ai-innovationlab/) (funded by the [Hessian Ministry of Higher Education, Research and the Arts (HMWK)](https://wissenschaft.hessen.de) and the [Hessian Ministry of the Interior, for Security and Homeland Security (HMinD)](https://innen.hessen.de)) and the [AI Service Centers](https://hessian.ai/infrastructure/ai-service-centre/) (funded by the [German Federal Ministry for Economic Affairs and Climate Action (BMWK)](https://www.bmwk.de/Navigation/EN/Home/home.html)).
The curation of the training data is partially funded by the [German Federal Ministry for Economic Affairs and Climate Action (BMWK)](https://www.bmwk.de/Navigation/EN/Home/home.html) through the project [OpenGPT-X](https://opengpt-x.de/en/) (project no. 68GX21007D).

## License

[Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.html)

## See also

- https://huggingface.co/collections/occiglot/occiglot-eu5-7b-v01-65dbed502a6348b052695e01