Runtime error

Exit code: 1. Reason:
The token has not been saved to the git credentials helper. Pass `add_to_git_credential=True` in this function directly or `--add-to-git-credential` if using via `huggingface-cli` if you want to set the git credential as well.
Token is valid (permission: fineGrained).
Your token has been saved to /home/user/.cache/huggingface/token
Login successful
Traceback (most recent call last):
  File "/home/user/app/app.py", line 11, in <module>
    tokenizer, model = load_peft_model_and_tokenizer(PEFT_MODEL, BASE_MODEL)
  File "/home/user/app/main.py", line 28, in load_peft_model_and_tokenizer
    base_model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3398, in from_pretrained
    hf_quantizer.validate_environment(
  File "/usr/local/lib/python3.10/site-packages/transformers/quantizers/quantizer_bnb_4bit.py", line 62, in validate_environment
    raise RuntimeError("No GPU found. A GPU is needed for quantization.")
RuntimeError: No GPU found. A GPU is needed for quantization.
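The traceback shows that `load_peft_model_and_tokenizer` calls `AutoModelForCausalLM.from_pretrained` with a bitsandbytes 4-bit quantization config, and the quantizer's environment check fails because the Space has no GPU. The actual `main.py` isn't shown, so the following is only a minimal sketch, assuming the base model is loaded in 4-bit and a PEFT adapter is attached on top; it quantizes only when CUDA is available and falls back to a plain, unquantized load on CPU-only hardware.

```python
# Sketch only: model IDs and helper name mirror the traceback, not the real app code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

def load_peft_model_and_tokenizer(peft_model_id: str, base_model_id: str):
    tokenizer = AutoTokenizer.from_pretrained(base_model_id)

    if torch.cuda.is_available():
        # GPU available: load the base model quantized to 4-bit via bitsandbytes.
        bnb_config = BitsAndBytesConfig(
            load_in_4bit=True,
            bnb_4bit_compute_dtype=torch.float16,
        )
        base_model = AutoModelForCausalLM.from_pretrained(
            base_model_id,
            quantization_config=bnb_config,
            device_map="auto",
        )
    else:
        # CPU-only Space: skip quantization entirely to avoid
        # "No GPU found. A GPU is needed for quantization."
        base_model = AutoModelForCausalLM.from_pretrained(
            base_model_id,
            torch_dtype=torch.float32,
        )

    # Attach the PEFT adapter to the base model.
    model = PeftModel.from_pretrained(base_model, peft_model_id)
    return tokenizer, model
```

Either this kind of fallback or assigning GPU hardware to the Space resolves the error, since bitsandbytes 4-bit loading requires CUDA.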
