runtime error

Exit code: 1. Reason: ...command and see if you can locate CUDA libraries. You might need to add them
to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 14, in <module>
    from llava.constants import LOGDIR
  File "/home/user/app/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/home/user/app/llava/model/__init__.py", line 1, in <module>
    from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
  File "/home/user/app/llava/model/language_model/llava_llama.py", line 22, in <module>
    from transformers import AutoConfig, AutoModelForCausalLM, \
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1090, in __getattr__
    value = getattr(module, name)
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1089, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/user/.pyenv/versions/3.10.14/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1101, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
CUDA Setup failed despite GPU being available. Please run the following command to get more information:
python -m bitsandbytes
Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues
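The root cause is that bitsandbytes cannot find the CUDA runtime libraries at import time; the transformers import of LlavaLlamaForCausalLM only surfaces that failure. A minimal diagnostic sequence, following the error message's own suggestions (the /usr/local/cuda/lib64 path below is an assumption; substitute whatever directory the find command actually reports for libcudart in the container):

    # Run the bitsandbytes self-diagnostic quoted in the error message
    python -m bitsandbytes

    # Locate the CUDA runtime library on the machine
    find / -name 'libcudart.so*' 2>/dev/null

    # Make the directory containing libcudart visible to the dynamic loader
    # before starting the app (assumed path; use the directory found above)
    export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH

If this is running as a Hugging Face Space (the /home/user/app layout suggests so), LD_LIBRARY_PATH can also be set as an environment variable in the Space's settings rather than in a startup script.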
