Language support

#1
by ashbo - opened

Hi everyone,
The model card for this model states that it only supports English, despite Llama 3.1 originally supporting eight different languages.
I was wondering whether anyone has experience with this model's performance in any of the other languages Llama 3.1 originally supports. Were these capabilities compromised during quantization? Can we expect the same performance as the original model?

Neural Magic org

Hi @ashbo , thank you for pointing this out. This is a typo in the model cards, and we'll put up a fix for the Llama 3.1 models. We have not explicitly evaluated across other languages yet, but the models should all still preserve their original multilingual capabilities.
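In the meantime, if you want a quick sanity check of multilingual behavior yourself, a minimal sketch like the one below can help. This assumes a Transformers-compatible checkpoint; the model id is a placeholder, so substitute the actual quantized Llama 3.1 repo you are using, and note this is only a spot check, not a proper benchmark.

```python
# Minimal sketch: spot-check multilingual generation of a quantized Llama 3.1 model.
# The model id below is an assumption / placeholder; replace it with the checkpoint you use.
from transformers import pipeline

MODEL_ID = "neuralmagic/Meta-Llama-3.1-8B-Instruct-quantized"  # placeholder, adjust as needed

pipe = pipeline("text-generation", model=MODEL_ID, device_map="auto")

# A few prompts in languages Llama 3.1 officially supports.
prompts = {
    "German": "Erkläre in zwei Sätzen, was Quantisierung bei Sprachmodellen bedeutet.",
    "French": "Explique en deux phrases ce qu'est la quantification des modèles de langue.",
    "Spanish": "Explica en dos frases qué es la cuantización de modelos de lenguaje.",
}

for lang, prompt in prompts.items():
    out = pipe(prompt, max_new_tokens=80, do_sample=False)
    print(f"--- {lang} ---")
    print(out[0]["generated_text"])
```

For a more rigorous comparison, running the same multilingual prompts (or a benchmark such as a translated evaluation set) through both the original and the quantized checkpoints and comparing outputs side by side would give a better picture of any degradation.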
