---
license: cc-by-nc-4.0
language:
- ro
base_model:
- meta-llama/Meta-Llama-3-8B
---
# Model Card for RoLlama3-8b-Instruct
*Built with Meta Llama 3*
<!-- Provide a quick summary of what the model is/does. -->
RoLlama3 is a family of pretrained and fine-tuned generative text models for Romanian. This is the repository for the **instruct 8B model**. Links to other models can be found at the bottom of this page.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
OpenLLM-Ro represents the first open-source effort to build an LLM specialized for Romanian. OpenLLM-Ro developed and publicly releases a collection of Romanian LLMs, both as foundational models and as instruct and chat variants.
- **Developed by:** OpenLLM-Ro
<!-- - **Funded by [optional]:** [More Information Needed] -->
<!-- - **Shared by [optional]:** [More Information Needed] -->
<!-- - **Model type:** [More Information Needed] -->
- **Language(s):** Romanian
- **License:** cc-by-nc-4.0
- **Finetuned from model:** [Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/OpenLLM-Ro/llama-recipes
- **Paper:** https://arxiv.org/abs/2406.18266
## Intended Use
### Intended Use Cases
RoLlama3 is intended for research use in Romanian. Base models can be adapted for a variety of natural language tasks, while instruction- and chat-tuned models are intended for assistant-like chat.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
Use in any manner that violates the license or any applicable laws or regulations, and use in languages other than Romanian.
## How to Get Started with the Model
Use the code below to get started with the model.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("OpenLLM-Ro/RoLlama3-8b-Instruct")
model = AutoModelForCausalLM.from_pretrained("OpenLLM-Ro/RoLlama3-8b-Instruct")

instruction = "Ce jocuri de societate pot juca cu prietenii mei?"
chat = [
    {"role": "system", "content": "Ești un asistent folositor, respectuos și onest. Încearcă să ajuți cât mai mult prin informațiile oferite, excluzând răspunsuri toxice, rasiste, sexiste, periculoase și ilegale."},
    {"role": "user", "content": instruction},
]

# Build the prompt with the model's chat template, then tokenize and generate
prompt = tokenizer.apply_chat_template(chat, tokenize=False, system_message="")
inputs = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt")
outputs = model.generate(input_ids=inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
```
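
For GPU inference, the model can also be loaded in half precision. The snippet below is a minimal sketch, not an official recipe: it assumes a CUDA device, `torch`, and `accelerate` (for `device_map="auto"`) are available, and the sampling parameters (`temperature`, `top_p`) are illustrative values, not defaults prescribed by this card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Sketch: load in bfloat16 and place on the available GPU(s)
tokenizer = AutoTokenizer.from_pretrained("OpenLLM-Ro/RoLlama3-8b-Instruct")
model = AutoModelForCausalLM.from_pretrained(
    "OpenLLM-Ro/RoLlama3-8b-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

chat = [
    {"role": "user", "content": "Ce jocuri de societate pot juca cu prietenii mei?"},
]

# add_generation_prompt=True appends the assistant header so the model answers directly
input_ids = tokenizer.apply_chat_template(
    chat, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    input_ids=input_ids,
    max_new_tokens=256,
    do_sample=True,      # illustrative sampling settings, not official defaults
    temperature=0.7,
    top_p=0.9,
)
# Print only the newly generated tokens
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```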
## Benchmarks
| Model | Average | ARC | MMLU |Winogrande|HellaSwag | GSM8k |TruthfulQA|
|--------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| Llama-3-8B-Instruct| 50.15 | 43.73 | 49.02 | 59.35 | 53.16 | **44.12** | **51.52** |
| *RoLlama3-8b-Instruct* | ***50.61*** | ***44.66*** | ***52.19*** | ***67.58*** | ***57.65*** | *30.20* | *51.39* |
## MT-Bench
| Model | Average | 1st turn | 2nd turn | Answers in Ro |
|--------------------|:--------:|:--------:|:--------:|:--------:|
| Llama-3-8B-Instruct | **5.92** | **6.36** | **5.49** | 158 / 160 |
| *RoLlama3-8b-Instruct*| *5.28* |*6.10*| *4.45* | ***160 / 160*** |
## RoCulturaBench
| Model | Score | Answers in Ro|
|--------------------|:--------:|:--------:|
| Llama-3-8B-Instruct | **4.61** | **100 / 100** |
| *RoLlama3-8b-Instruct*| *3.83*| ***100 / 100*** |
## RoLlama3 Model Family
| Model | Link |
|--------------------|:--------:|
|*RoLlama3-8b-Instruct*| [link](https://huggingface.co/OpenLLM-Ro/RoLlama3-8b-Instruct) |
## Citation
```
@misc{masala2024vorbecstiromanecsterecipetrain,
title={"Vorbe\c{s}ti Rom\^ane\c{s}te?" A Recipe to Train Powerful Romanian LLMs with English Instructions},
author={Mihai Masala and Denis C. Ilie-Ablachim and Alexandru Dima and Dragos Corlatescu and Miruna Zavelca and Ovio Olaru and Simina Terian-Dan and Andrei Terian-Dan and Marius Leordeanu and Horia Velicu and Marius Popescu and Mihai Dascalu and Traian Rebedea},
year={2024},
eprint={2406.18266},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2406.18266},
}
```
<!-- **APA:**
[More Information Needed] -->