---
language: fr
license: mit
tags:
- pytorch
- causal-lm
datasets:
- c4
---
# Cedille AI
Cedille is a project to bring large language models to non-English languages.
## fr-boris
Boris is a 6B parameter autoregressive language model based on the GPT-J architecture and trained using the [mesh-transformer-jax](https://github.com/kingoflolz/mesh-transformer-jax) codebase.
Boris was trained on around 78B tokens of French text from the [C4](https://huggingface.co/datasets/c4) dataset. Training was initialized from GPT-J, which was pretrained on [The Pile](https://pile.eleuther.ai/); as a consequence, the model retains good performance in English. Boris uses the unmodified GPT-2 tokenizer.
Boris is named after the great French writer [Boris Vian](https://en.wikipedia.org/wiki/Boris_Vian).
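As a minimal sketch of local usage with the Hugging Face `transformers` library: the snippet below assumes the model is published on the Hub under the identifier `Cedille/fr-boris` and that standard causal-LM loading applies; adjust the identifier if the repository name differs.

```python
# Minimal sketch: loading Boris and sampling a short French continuation.
# "Cedille/fr-boris" is an assumed Hub identifier for this model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Cedille/fr-boris"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_name)  # unmodified GPT-2 tokenizer
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "La gastronomie française est"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation from the model.
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 6B-parameter model typically needs a GPU with substantial memory (or half-precision loading) to run at a reasonable speed.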
# How do I test Cedille?
For the time being, the easiest way to test the model is to use our [publicly accessible playground](https://en.cedille.ai/).
Cedille is a relatively large model and running it in production can get expensive. Consider contacting us for API access at [email protected].
## 📊 Cedille paper
Our paper is out now! https://arxiv.org/abs/2202.03371
Please cite our work if you make use of Cedille:
```bibtex
@misc{muller2022cedille,
  title={Cedille: A large autoregressive French language model},
  author={Martin M{\"{u}}ller and Florian Laurent},
  year={2022},
  eprint={2202.03371},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
## Contact us
For any custom development please contact us at [email protected].
## Links
* [Official website](https://en.cedille.ai/)
* [Blog](https://en.cedille.ai/blog)
* [GitHub](https://github.com/coteries/cedille-ai)
* [Twitter](https://twitter.com/CedilleAI)