---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
**LOLA**: Large and Open Source Multilingual Language Model
## Model Description
This is a fine-tuned version of [dice-research/lola_v1](https://huggingface.co/dice-research/lola_v1), trained on the [multilingual Alpaca](https://arxiv.org/abs/2309.08958) dataset for 2 epochs. The training dataset can be found here: https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main/training-data. The following languages are covered:
Bulgarian (bg), Bengali (bn), Czech (cs), Spanish (es), Finnish (fi), French (fr), Hindi (hi), Norwegian (no), Russian (ru), and Chinese (zh).
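
## How to Use

A minimal loading and generation sketch with the `transformers` library. The repository ID below is a placeholder (this card does not state the fine-tuned checkpoint's Hub path), the `trust_remote_code=True` flag assumes this model loads the same custom code as the base `dice-research/lola_v1`, and the Alpaca-style prompt layout is an assumption based on the instruction-tuning dataset; adjust all three to match the actual release.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this fine-tuned model's actual Hub repository ID.
model_id = "dice-research/lola_v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    trust_remote_code=True,  # assumed: base model ships custom modeling code
)

# Assumed Alpaca-style instruction format; verify against the training data.
prompt = (
    "### Instruction:\nTranslate the following sentence to French.\n\n"
    "### Input:\nThe weather is nice today.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```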