|
--- |
|
library_name: transformers |
|
tags: [] |
|
--- |
|
|
|
# Model Card for LOLA (Multilingual Instruction-Tuned)
|
|
|
|
**LOLA**: Large and Open Source Multilingual Language Model |
|
|
|
## Model Description |
|
|
|
This is a fine-tuned version of [dice-research/lola_v1](https://huggingface.co/dice-research/lola_v1), trained for 2 epochs on the [multilingual Alpaca](https://arxiv.org/abs/2309.08958) dataset. The training data is available in the [monolingual-multilingual-instruction-tuning repository](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main/training-data). The following languages are covered:
|
Bulgarian (bg), Bengali (bn), Czech (cs), Spanish (es), Finnish (fi), French (fr), Hindi (hi), Norwegian (no), Russian (ru), and Chinese (zh). |
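
Since the model is instruction-tuned on Alpaca-style data, prompts at inference time should follow the same format the training examples used. The sketch below assumes the standard Alpaca prompt template; the `build_prompt` helper is illustrative, and the exact wording should be verified against the dataset repository linked above.

```python
# Sketch of an Alpaca-style prompt builder. The template text is an
# assumption (the standard Alpaca format); check the training-data repo
# for the exact format used during fine-tuning.
def build_prompt(instruction: str, input_text: str = "") -> str:
    """Format one instruction (optionally with context) in Alpaca style."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


prompt = build_prompt("Translate the following sentence to French.",
                      "Hello, world!")
```

The resulting string can then be passed to the model (e.g. via a `transformers` text-generation pipeline) as the generation prompt.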