---
license: apache-2.0
datasets:
- Gustrd/dolly-15k-libretranslate-pt
library_name: peft
language:
- pt
---
This adapter model, trained with PEFT, was built on top of openlm-research/open_llama_3b_v2 (https://huggingface.co/openlm-research/open_llama_3b_v2).

It is not perfect in Portuguese, but it is a good starting point for further fine-tuning on a specific task in this language.

See the Jupyter notebooks in the files section for more details. These notebooks were taken from the web and are very similar to the ones used for the "cabrita" model, which was built on top of LLaMA 1.

The adapter was trained for only 120 steps, with some results quite similar to VMware/open-llama-13b-open-instruct.

You may need to adjust the inference parameters to make it work better.