Issue with downloading using Hugging Face

#14
by engineering-lamini - opened

Using AutoModelForCausalLM to load the model gives me this rope_scaling dictionary error. Is there any way to fix this?

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")

File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/llama/configuration_llama.py", line 182, in _rope_scaling_validation
    raise ValueError(
ValueError: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}

I tried passing trust_remote_code=True, but that didn't work.

I need a visual guide to download and use these models.

Guys, remember: whenever a new model is released, don't rush to download it. First, update your transformers library using '%pip install --upgrade transformers -q', then proceed!
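
For reference, a minimal notebook sketch of that upgrade-then-verify flow (the version floor comes from the 4.43.1 confirmation later in this thread):

%pip install --upgrade transformers -q
import transformers
print(transformers.__version__)  # the 'llama3' rope_scaling schema needs >= 4.43.0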

Meta Llama org

Indeed, this is fixed in the latest transformers version as mentioned in the model card. Please upgrade and enjoy!

How does one upgrade transformers when serving this via the TGI Docker image?
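
Since transformers is baked into the TGI image, the usual route is to pull a newer image rather than pip-install inside the container. A rough sketch, assuming the standard GHCR image and a token for the gated repo in HUGGING_FACE_HUB_TOKEN (tag and flags may differ for your setup):

docker pull ghcr.io/huggingface/text-generation-inference:latest
docker run --gpus all --shm-size 1g -p 8080:80 \
    -e HUGGING_FACE_HUB_TOKEN=$HUGGING_FACE_HUB_TOKEN \
    ghcr.io/huggingface/text-generation-inference:latest \
    --model-id meta-llama/Meta-Llama-3.1-8B-Instruct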

@osanseviero thanks. Verified that it works with transformers 4.43.1.
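
For completeness, a minimal load-and-generate sketch once the upgrade is in place (assumes access to the gated repo, and accelerate installed for device_map="auto"):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# With transformers >= 4.43.0 the 'llama3' rope_scaling config validates cleanly.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))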

osanseviero changed discussion status to closed
