runtime error
2-70b-chat-hf/resolve/2bd046f8e48458563e017474661d629736bbb78c/config.json. Access to model meta-llama/Llama-2-70b-chat-hf is restricted. You must be authenticated to access it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/pipelines/base.py", line 279, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/modeling_utils.py", line 2983, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/configuration_utils.py", line 602, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/configuration_utils.py", line 631, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/configuration_utils.py", line 686, in _get_config_dict
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "/src/transformers/src/transformers/utils/hub.py", line 416, in cached_file
    raise EnvironmentError(
OSError: You are trying to access a gated repo. Make sure to have access to it at https://huggingface.co/meta-llama/Llama-2-70b-chat-hf.
401 Client Error. (Request ID: Root=1-663f227f-7b6d20c33b4e31be0b2962a7;197fa13b-8d99-46fb-a091-3488d61b1b56)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-2-70b-chat-hf/resolve/2bd046f8e48458563e017474661d629736bbb78c/config.json. Access to model meta-llama/Llama-2-70b-chat-hf is restricted. You must be authenticated to access it.
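The 401 means the process is trying to download a gated model's `config.json` without credentials: the account running the code must first be granted access on the model page, and a matching access token must reach `from_pretrained`. A minimal sketch of one way to thread a token through (the `HF_TOKEN` env-var name and the `auth_kwargs` helper are assumptions for illustration, not part of this log; recent `transformers` releases accept `token=`, while older ones used `use_auth_token=`):

```python
import os

def auth_kwargs(env_var: str = "HF_TOKEN") -> dict:
    """Build the keyword arguments that forward a Hugging Face access
    token to from_pretrained / pipeline. Raises early, with a clearer
    message than the 401 above, if no token is configured."""
    token = os.environ.get(env_var)
    if token is None:
        raise RuntimeError(
            f"Set {env_var} to a token from an account that has been "
            "granted access to meta-llama/Llama-2-70b-chat-hf"
        )
    return {"token": token}

# Usage (not executed here; it would download the gated model):
# from transformers import pipeline
# pipe = pipeline(
#     "text-generation",
#     model="meta-llama/Llama-2-70b-chat-hf",
#     **auth_kwargs(),
# )
```

On a Space, the token would typically be supplied as a repository secret rather than hard-coded, so it never appears in the container logs.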