Denied permission to download
HF indicates: "Gated model: You have been granted access to this model."
However, I am denied permission when trying to download with:
huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct --include "original/*" --local-dir meta-llama/Meta-Llama-3-8B-Instruct
or when using the transformers script.
The denial says: Cannot access gated repo for url https://huggingface.co/api/models/meta-llama/Meta-Llama-3-8B-Instruct/revision/main.
Repo model meta-llama/Meta-Llama-3-8B-Instruct is gated. You must be authenticated to access it.
I have already tried clearing the HF hub cache, but I get the same result.
Thanks for any help.
You need to use your HF token when you try to access the model:
add --token to your download command line.
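For example, appending the token to the same command from the original post (hf_... is a placeholder for your actual read token):
huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct --include "original/*" --local-dir meta-llama/Meta-Llama-3-8B-Instruct --token hf_...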
- Apply for model access via the form they have shared.
- Make sure that the HF_TOKEN variable is correct for the account that got access.
Once you define the HF_TOKEN value, you can log in with the following line:
!huggingface-cli login --token $HF_TOKEN
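Equivalently, you can log in from Python; a minimal sketch, assuming HF_TOKEN is set in your environment:
import os
from huggingface_hub import login

# Passes the token explicitly so it is stored for subsequent library calls
login(token=os.environ["HF_TOKEN"])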
An example I used, passing an access_token:
import torch
import transformers

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
access_token = "hf_..."  # your read token

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    token=access_token,
    device_map="auto",
)
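Once it loads, a quick smoke test (hypothetical prompt, just to confirm the gated weights are actually accessible):
print(pipeline("Hello, how are you?", max_new_tokens=20)[0]["generated_text"])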
When I use the same example with an access_token:
import torch
import transformers

model_id = "meta-llama/Meta-Llama-3-8B"
access_token = "hf_..."

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    token=access_token,
    device_map="auto",
)
I still get the access error:
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B.
Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.
I got access to the model 2 days ago, and I am using my account token. Do you know what else could be wrong?
@SofiaMO , please check that the access token you are using has read permissions. Keep in mind that there are also access tokens with write permissions and fine-grained (custom) permissions.
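A quick way to confirm which account a given token belongs to is huggingface_hub's whoami; a minimal sketch (hf_... is a placeholder):
from huggingface_hub import whoami

# Returns account details for the token; raises an error if the token is invalid
info = whoami(token="hf_...")
print(info["name"])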
Thanks, it helped!
Go back to the model's main page and accept the terms. Wait until access is approved, then use the token as shown above.
try this:
import torch
from transformers import pipeline

HUGGINGFACEHUB_API_TOKEN = "hf_..."  # paste your read token here

pipe = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
    token=HUGGINGFACEHUB_API_TOKEN,
)
You can get the token from the HF platform with a logged-in account: Settings > Access Tokens > Create new token > Read > write a token name > copy the generated token.
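Alternatively, you can export the token as an environment variable; huggingface_hub picks up HF_TOKEN automatically, so you don't have to hard-code it in scripts (a sketch):
export HF_TOKEN=hf_...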
I have this same problem. Everything worked fine until I moved my Python install inside a venv and ran my application from there. I changed the token and logged in with the CLI, but I still get the gated-access error. This is wild. Here is the traceback:
File "/usr/projects/langface/FASTAPI/venv/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
result = func()
File "/usr/projects/langface/FASTAPI/venv/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 590, in code_to_exec
exec(code, module.__dict__)
File "/usr/projects/langface/FASTAPI/chatbot.py", line 2, in
from back_chains import make_output, modify_output
File "/usr/projects/langface/FASTAPI/back_chains.py", line 44, in
model_config = transformers.AutoConfig.from_pretrained(model_id, token=hf_auth)
File "/usr/projects/langface/FASTAPI/venv/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 976, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/usr/projects/langface/FASTAPI/venv/lib/python3.10/site-packages/transformers/configuration_utils.py", line 632, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/usr/projects/langface/FASTAPI/venv/lib/python3.10/site-packages/transformers/configuration_utils.py", line 689, in _get_config_dict
resolved_config_file = cached_file(
File "/usr/projects/langface/FASTAPI/venv/lib/python3.10/site-packages/transformers/utils/hub.py", line 420, in cached_file
raise EnvironmentError(
I deleted and downloaded the model to the cache again, and noticed this:
it moved the model to my home directory rather than the venv or working directory:
100%|████████████████████████████████| 16.1G/16.1G [04:53<00:00, 110MB/s]
Download complete. Moving file to /home/mneely/.cache/huggingface/hub/models--meta-llama--Meta-Llama-3.1-8B-Instruct/blobs/
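That location is expected: the Hugging Face hub cache defaults to ~/.cache/huggingface regardless of any venv. If you want it elsewhere, point HF_HOME at another directory before running your app (a sketch; the path is hypothetical):
export HF_HOME=/usr/projects/langface/hf_cache  # cache and downloaded models move here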