runtime error
Downloading (…)l-00002-of-00002.bin: 100%|██████████| 4.66G/4.66G [01:53<00:00, 41.1MB/s]
Downloading shards: 100%|██████████| 2/2 [05:58<00:00, 179.49s/it]
Loading checkpoint shards:   0%|          | 0/2 [01:03<?, ?it/s]
Traceback (most recent call last):
  File "app.py", line 18, in <module>
    m = AutoModelForCausalLM.from_pretrained("stabilityai/stablelm-tuned-alpha-3b", device_map="auto", quantization_config=quantization_config,
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 565, in from_pretrained
    return model_class.from_pretrained(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3307, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3695, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/modeling_utils.py", line 749, in _load_state_dict_into_meta_model
    set_module_quantized_tensor_to_device(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/integrations/bitsandbytes.py", line 58, in set_module_quantized_tensor_to_device
    if old_value.device == torch.device("meta") and device not in ["meta", torch.device("meta")] and value is None:
NameError: name 'torch' is not defined
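The NameError points at transformers' bitsandbytes integration module, where torch is only imported when bitsandbytes is detected as available. The most likely cause here is that bitsandbytes is not importable inside the container (not listed as a dependency, or the Space is running on CPU-only hardware) while app.py still passes a bitsandbytes quantization_config. Below is a minimal sketch of the loading code under the assumption that bitsandbytes and accelerate are installed and a CUDA GPU is assigned to the Space; the 8-bit setting and variable names are illustrative, not taken from the original app.py.

    # Minimal sketch, assuming bitsandbytes and accelerate are installed
    # (e.g. listed in the Space's requirements.txt) and a CUDA GPU is available.
    # The 8-bit setting and the tok/m names are illustrative assumptions.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    quantization_config = BitsAndBytesConfig(load_in_8bit=True)  # assumed; 4-bit is also an option

    tok = AutoTokenizer.from_pretrained("stabilityai/stablelm-tuned-alpha-3b")
    m = AutoModelForCausalLM.from_pretrained(
        "stabilityai/stablelm-tuned-alpha-3b",
        device_map="auto",                        # requires accelerate
        quantization_config=quantization_config,  # requires bitsandbytes + CUDA
        torch_dtype=torch.float16,
    )

If the Space is on CPU-only hardware, the likelier fix is to drop quantization_config and load the model in float16/float32 (or upgrade the hardware); which option applies depends on the Space's configuration.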