A problem with embed_out.bias (model-00008-of-00008)

#2
by saeedfarzi - opened

Hi guys,
I am trying to convert the model's weights to the litgpt format, but I keep hitting the same error during conversion. I think something is wrong with model-00008-of-00008 and it needs to be fixed.

I have copied the error message below:

```
Processing model-00008-of-00008.bin

Traceback (most recent call last):
  File "/home/.../miniconda3/envs/llama3_1/bin/litgpt", line 8, in <module>
    sys.exit(main())
  File "/home/..../miniconda3/envs/llama3_1/lib/python3.12/site-packages/litgpt/__main__.py", line 143, in main
    fn(**kwargs)
  File "/home/..../miniconda3/envs/llama3_1/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/..../miniconda3/envs/llama3_1/lib/python3.12/site-packages/litgpt/scripts/convert_hf_checkpoint.py", line 348, in convert_hf_checkpoint
    copy_fn(sd, hf_weights, saver=saver, dtype=dtype)
  File "/home/..../miniconda3/envs/llama3_1/lib/python3.12/site-packages/litgpt/scripts/convert_hf_checkpoint.py", line 53, in copy_weights_gpt_neox
    to_name = weight_map[name]
KeyError: 'embed_out.bias'
```
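
For anyone who wants to reproduce this, here is a quick way to inspect which tensors that shard actually contains. This is just a minimal sketch, assuming the shard is a plain PyTorch state dict saved as a .bin file:

```python
import torch

# List every tensor stored in the suspect shard so we can confirm
# whether embed_out.bias is really in there.
shard = torch.load("model-00008-of-00008.bin", map_location="cpu", weights_only=True)
for name, tensor in shard.items():
    print(name, tuple(tensor.shape))
```

If embed_out.bias shows up in that listing, then the shard really does carry an extra bias tensor that the weight map in copy_weights_gpt_neox has no entry for, which would match the KeyError above.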
