runtime error
Exit code: 1. Reason: le. Please open an issue on GitHub for any issues related to this experimental feature.
  state_dict = torch.load(pretrained_model_path_or_dict, map_location=pipe.device)
sdxl_instantir.py :376 2024-11-11 14:06:41,432 use lora alpha 1
sdxl_instantir.py :376 2024-11-11 14:06:53,048 use lora alpha 8.0
tuners_utils.py :171 2024-11-11 14:06:53,049 Already found a `peft_config` attribute in the model. This will lead to having multiple adapters in the model. Make sure to know what you are doing!
Traceback (most recent call last):
  File "/content/InstantIR-hf/worker_runpod.py", line 64, in <module>
    pipe.to(device=device, dtype=torch_dtype)
  File "/home/camenduru/.local/lib/python3.10/site-packages/diffusers/pipelines/pipeline_utils.py", line 454, in to
    module.to(device, dtype)
  File "/home/camenduru/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3157, in to
    return super().to(*args, **kwargs)
  File "/home/camenduru/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1340, in to
    return self._apply(convert)
  File "/home/camenduru/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 900, in _apply
    module._apply(fn)
  File "/home/camenduru/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 900, in _apply
    module._apply(fn)
  File "/home/camenduru/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 900, in _apply
    module._apply(fn)
  File "/home/camenduru/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 927, in _apply
    param_applied = fn(param)
  File "/home/camenduru/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1326, in convert
    return t.to(
  File "/home/camenduru/.local/lib/python3.10/site-packages/torch/cuda/__init__.py", line 319, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx
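The crash happens at `pipe.to(device=device, dtype=torch_dtype)` in worker_runpod.py: moving the pipeline to CUDA forces torch to initialize the GPU, and the container exposes no NVIDIA driver. A minimal sketch of a guard, assuming `pipe` and `torch_dtype` are already set up as in the traceback (names taken from the log, not from the actual script), that falls back to CPU when no GPU is visible:

```python
import torch

# Sketch only: pick the device based on what the runtime actually provides,
# so pipe.to() never triggers torch._C._cuda_init() on a GPU-less container.
if torch.cuda.is_available():
    device = "cuda"
else:
    device = "cpu"
    torch_dtype = torch.float32  # half precision is poorly supported on CPU

pipe.to(device=device, dtype=torch_dtype)
```

If the Space or RunPod worker is meant to run on GPU, the real fix is to select GPU hardware for the container (or the correct CUDA-enabled template) rather than to fall back to CPU, which would be extremely slow for an SDXL-based pipeline like InstantIR.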