
Model not loading

#1
by vdavidr - opened

Hello.

When I try to load the model, I get "Error: Failed to parse file model-00014-of-00014.safetensors: failed to fetch safetensors header length."
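For context, a .safetensors file begins with an 8-byte little-endian integer giving the length of the JSON header that follows it, so this error usually means those first bytes are missing or corrupt (for example, a truncated upload). A minimal sketch of that parsing step (the demo file below is fabricated for illustration):

```python
import json
import struct

def read_safetensors_header(path):
    """Read the JSON header of a .safetensors file.

    The format starts with an 8-byte little-endian unsigned integer
    giving the byte length of the JSON header that follows it.
    """
    with open(path, "rb") as f:
        length_bytes = f.read(8)
        if len(length_bytes) != 8:
            raise ValueError("failed to fetch safetensors header length")
        (header_len,) = struct.unpack("<Q", length_bytes)
        header = f.read(header_len)
        if len(header) != header_len:
            raise ValueError("truncated safetensors header")
        return json.loads(header)

# Build a tiny file with the same layout to demonstrate the check.
demo_header = json.dumps({"__metadata__": {"format": "pt"}}).encode()
with open("demo.safetensors", "wb") as f:
    f.write(struct.pack("<Q", len(demo_header)))
    f.write(demo_header)

print(read_safetensors_header("demo.safetensors"))
```

If this check fails on a downloaded shard, the file itself is damaged and re-downloading it is the fix.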

LumiOpen org

Looks like there is an issue with the upload. I'll fix this today. In the meantime, you can download the branch for the 1000B checkpoint; it looks like it's the most recent one that made it.

LumiOpen org

This should be fixed now. Please comment if you encounter any further issues.

jonabur changed discussion status to closed

Thanks for resolving the original error, jonabur.

I tried loading the model again, but there are some unresolved git merge conflicts in the files causing this. I manually removed the conflict markers from the JSON file, but the .safetensors files have the same problem too.

Running eval for model : LumiOpen/Viking-33B
Traceback (most recent call last):
  File "/users/xxx/.local/lib/python3.9/site-packages/transformers/configuration_utils.py", line 722, in _get_config_dict
    config_dict = cls._dict_from_json_file(resolved_config_file)
  File "/users/xxx/.local/lib/python3.9/site-packages/transformers/configuration_utils.py", line 822, in _dict_from_json_file
    return json.loads(text)
  File "/usr/lib64/python3.9/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/usr/lib64/python3.9/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python3.9/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 1 (char 2)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/scratch/xxx/xxx/generalEval.py", line 223, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto", quantization_config=bnb_config)
  File "/users/xxx/.local/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 523, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/users/xxx/.local/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 934, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/users/xxx/.local/lib/python3.9/site-packages/transformers/configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/users/xxx/.local/lib/python3.9/site-packages/transformers/configuration_utils.py", line 726, in _get_config_dict
    raise EnvironmentError(
OSError: It looks like the config file at '/scratch/xxx/xxx/.cache/huggingface/hub/models--LumiOpen--Viking-33B/snapshots/2ae95c83f7f5e27e1ac3a4b498664432227f6da3/config.json' is not a valid JSON file.
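The JSONDecodeError at line 2, column 1 is consistent with leftover git conflict markers at the top of config.json. A quick way to confirm is to scan the file for the markers and try parsing it directly; the file contents below are fabricated for illustration:

```python
import json

# Git conflict markers left behind by an unresolved merge.
CONFLICT_MARKERS = ("<<<<<<<", "=======", ">>>>>>>")

def has_conflict_markers(text):
    """Return True if any git conflict marker starts a line."""
    return any(line.startswith(m)
               for line in text.splitlines()
               for m in CONFLICT_MARKERS)

# A config.json with an unresolved merge reproduces the exact error
# from the traceback above (the key and values here are made up).
broken_config = (
    '{\n'
    '<<<<<<< HEAD\n'
    '  "hidden_size": 7168\n'
    '=======\n'
    '  "hidden_size": 6656\n'
    '>>>>>>> other-branch\n'
    '}\n'
)

print(has_conflict_markers(broken_config))  # True

try:
    json.loads(broken_config)
except json.JSONDecodeError as e:
    print(e)  # Expecting property name enclosed in double quotes: line 2 column 1 (char 2)
```

Note that a line of "=======" can also appear legitimately (for example, as an underline in text files), so a stricter check would require all three markers; and for binary .safetensors shards, re-downloading the affected files is simpler than trying to patch them.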

jonabur changed discussion status to open
LumiOpen org

That's very troubling. I'll look at this some more today.

LumiOpen org

Please try again!
