Fix running condition of translate_llama2

#16
by cieske - opened

Running a recent Llama model fails with an error like "Model meta-llama/Meta-Llama-3-8B-hf was not found on the Hub, please try another model name."

The function 'translate_llama2' is needed for the Llama 2 family because both a transformers version (model name ending with "-hf") and a torch version exist. It is no longer useful for Llama 3 and more recent families, where the two versions were merged into one. I changed the execution condition of 'translate_llama2' so that it applies to the Llama 2 family only, as in the sketch below.
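
For context, here is a minimal sketch of the kind of condition change described above. The helper names, the "-hf" suffix handling, and the `"llama-2"` substring check are assumptions for illustration; the actual function signature and check in the repository may differ.

```python
def translate_llama2(model_name: str) -> str:
    # Hypothetical helper: map a torch-style Llama 2 repo id to its
    # transformers counterpart, e.g. "meta-llama/Llama-2-7b" -> "meta-llama/Llama-2-7b-hf".
    if not model_name.endswith("-hf"):
        return f"{model_name}-hf"
    return model_name


def resolve_model_name(model_name: str) -> str:
    # Only Llama 2 checkpoints ship under both a torch-style and a
    # transformers ("-hf") name. Llama 3 and later use a single name,
    # so translating them would produce a non-existent repo id such as
    # "meta-llama/Meta-Llama-3-8B-hf".
    if "llama-2" in model_name.lower():
        return translate_llama2(model_name)
    return model_name
```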

good catch!

Vokturz changed pull request status to merged
