Runtime error
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-2-7b-chat-hf/resolve/ffe969f685730bb46ff086ab40598cced8e9c799/config.json. Repo model meta-llama/Llama-2-7b-chat-hf is gated. You must be authenticated to access it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/pipelines/base.py", line 279, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/src/transformers/src/transformers/modeling_utils.py", line 2983, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
  File "/src/transformers/src/transformers/configuration_utils.py", line 602, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/src/transformers/src/transformers/configuration_utils.py", line 631, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/src/transformers/src/transformers/configuration_utils.py", line 686, in _get_config_dict
    resolved_config_file = cached_file(
  File "/src/transformers/src/transformers/utils/hub.py", line 416, in cached_file
    raise EnvironmentError(
OSError: You are trying to access a gated repo. Make sure to have access to it at https://huggingface.co/meta-llama/Llama-2-7b-chat-hf.
401 Client Error. (Request ID: Root=1-661af3bd-00f3ef3d203f62d4601cc2c6;e4b8ffdf-a0b5-45e7-9fce-9da521d287f0)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-2-7b-chat-hf/resolve/ffe969f685730bb46ff086ab40598cced8e9c799/config.json. Repo model meta-llama/Llama-2-7b-chat-hf is gated. You must be authenticated to access it.
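The 401 means the request reached the Hub without valid credentials: meta-llama/Llama-2-7b-chat-hf is a gated repo, so loading it requires both (1) access granted on the model page and (2) an access token passed to the call. A minimal sketch of the fix, assuming you have been granted access and hold a token in the HF_TOKEN environment variable (the variable huggingface_hub reads by default); `gated_model_kwargs` is a hypothetical helper, not part of transformers:

```python
import os

def gated_model_kwargs(token_env="HF_TOKEN"):
    """Build the extra kwargs needed to load a gated Hub repo.

    Reads a Hugging Face access token from the environment and
    returns it as the `token` keyword that from_pretrained accepts.
    Raises if the variable is unset, mirroring the 401 failure mode.
    """
    token = os.environ.get(token_env)
    if token is None:
        raise EnvironmentError(
            f"Set {token_env} to a Hugging Face access token for an "
            "account that has been granted access to the gated repo."
        )
    return {"token": token}

# Usage (requires transformers installed and access granted):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "meta-llama/Llama-2-7b-chat-hf",
#     **gated_model_kwargs(),
# )
```

Alternatively, `huggingface_hub.login(token=...)` (or `huggingface-cli login` in the container) stores the token once so every subsequent `from_pretrained` call is authenticated without passing `token` explicitly.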