"KeyError after obtaining model permission and successfully logging into Hugging Face in Python

#12
by SuMeyYao - opened

Here is the code I ran:

from huggingface_hub import login
from transformers import AutoTokenizer, AutoModel

login(token="myhuggingfacetoken")  # placeholder for my actual access token
model_name = "meta-llama/Meta-Llama-Guard-2-8B"
tokenizer = AutoTokenizer.from_pretrained(model_name, use_auth_token=True, trust_remote_code=True)
model = AutoModel.from_pretrained(model_name, use_auth_token=True, trust_remote_code=True)
print("Model and tokenizer loaded successfully!")

The error is as follows:

KeyError Traceback (most recent call last)
Input In [13], in <cell line: 9>()
7 model_name = "meta-llama/Meta-Llama-Guard-2-8B"
8 tokenizer = AutoTokenizer.from_pretrained(model_name, use_auth_token=True, trust_remote_code=True)
----> 9 model = AutoModel.from_pretrained(model_name, use_auth_token=True, trust_remote_code=True)
10 print("Model and tokenizer loaded successfully!")

File /cluster/apps/nss/gcc-8.2.0/python/3.10.4/x86_64/lib64/python3.10/site-packages/transformers/models/auto/auto_factory.py:423, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
421 kwargs["_from_auto"] = True
422 if not isinstance(config, PretrainedConfig):
--> 423 config, kwargs = AutoConfig.from_pretrained(
424 pretrained_model_name_or_path, return_unused_kwargs=True, trust_remote_code=trust_remote_code, **kwargs
425 )
426 if hasattr(config, "auto_map") and cls.__name__ in config.auto_map:
427 if not trust_remote_code:

File /cluster/apps/nss/gcc-8.2.0/python/3.10.4/x86_64/lib64/python3.10/site-packages/transformers/models/auto/configuration_auto.py:700, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
698 return config_class.from_pretrained(pretrained_model_name_or_path, **kwargs)
699 elif "model_type" in config_dict:
--> 700 config_class = CONFIG_MAPPING[config_dict["model_type"]]
701 return config_class.from_dict(config_dict, **kwargs)
702 else:
703 # Fallback: use pattern matching on the string.

File /cluster/apps/nss/gcc-8.2.0/python/3.10.4/x86_64/lib64/python3.10/site-packages/transformers/models/auto/configuration_auto.py:409, in _LazyConfigMapping.__getitem__(self, key)
407 return self._extra_content[key]
408 if key not in self._mapping:
--> 409 raise KeyError(key)
410 value = self._mapping[key]
411 module_name = model_type_to_module_name(key)

KeyError: 'llama'
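
For reference, the failing lookup is CONFIG_MAPPING[config_dict["model_type"]]. A minimal sketch that reproduces just that check, with the import path taken from the traceback above:

# Reproduce the failing lookup outside from_pretrained.
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

try:
    CONFIG_MAPPING["llama"]  # the exact lookup that raises above
    print("'llama' model type is registered")
except KeyError:
    print("installed transformers does not know the 'llama' model type")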

Could you please suggest how to fix this?

By the way, the same error occurred when I ran the following code:

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "meta-llama/Meta-Llama-Guard-2-8B"
device = "cuda"
dtype = torch.bfloat16

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=dtype, device_map=device)

Meta Llama org

Hi there! Please make sure to use a recent transformers version; the KeyError means the release you have installed predates Llama support.
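
A minimal sketch of the fix, assuming a pip-managed environment (Llama support landed in transformers v4.28.0): upgrade with pip install --upgrade transformers, then verify the installed release:

# Run after upgrading to confirm the environment picked up a recent release.
import transformers

print(transformers.__version__)  # expect >= 4.28.0
from transformers import LlamaConfig  # raises ImportError on releases without Llama support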

osanseviero changed discussion status to closed
