Error when attempting to load model (embed-m-long only)
#5 · opened by minimaxir
Just from following the demo:

```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained('Snowflake/snowflake-arctic-embed-m-long', trust_remote_code=True, add_pooling_layer=False)
```
```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 558, in from_pretrained
    return model_class.from_pretrained(
  File "/home/jupyter/.cache/huggingface/modules/transformers_modules/Snowflake/snowflake-arctic-embed-m-long/65cda794702245926a85d939971c4d2864419849/modeling_hf_nomic_bert.py", line 345, in from_pretrained
    state_dict = state_dict_from_pretrained(model_name, safe_serialization=kwargs.get("safe_serialization", False))
  File "/home/jupyter/.cache/huggingface/modules/transformers_modules/Snowflake/snowflake-arctic-embed-m-long/65cda794702245926a85d939971c4d2864419849/modeling_hf_nomic_bert.py", line 73, in state_dict_from_pretrained
    raise EnvironmentError(f"Model name {model_name} was not found.")
OSError: Model name Snowflake/snowflake-arctic-embed-m-long was not found.
```
This does not occur with the other Snowflake embed model releases, which makes sense since the culprit is in `modeling_hf_nomic_bert.py`.
Hi @minimaxir,

```python
from transformers import AutoModel

model = AutoModel.from_pretrained('Snowflake/snowflake-arctic-embed-m-long', trust_remote_code=True, add_pooling_layer=False, safe_serialization=True)
```

This is the correct way to load the long-context model. Sorry about the confusion; we will make sure the documentation stays up to date. Please let us know if there are any other issues!
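For completeness, once the model loads, Arctic embed models are typically used by taking the first ([CLS]) token's hidden state and L2-normalizing it. A minimal sketch of that post-processing step in NumPy, assuming you already have the model's `last_hidden_state`; the function name and the commented usage lines are illustrative, not part of the model's API:

```python
import numpy as np

def cls_pool_and_normalize(last_hidden_state: np.ndarray) -> np.ndarray:
    """Turn (batch, seq_len, hidden) transformer outputs into unit-length embeddings.

    Uses the first ([CLS]) token of each sequence as the sentence embedding,
    then divides each row by its L2 norm.
    """
    emb = last_hidden_state[:, 0]                       # (batch, hidden)
    norms = np.linalg.norm(emb, axis=1, keepdims=True)  # L2 norm per row
    return emb / norms

# With the loaded model this would look roughly like (hypothetical variable names):
#   inputs = tokenizer(texts, return_tensors='pt', padding=True)
#   outputs = model(**inputs)
#   embeddings = cls_pool_and_normalize(outputs.last_hidden_state.detach().numpy())
```

Normalizing the embeddings means cosine similarity reduces to a plain dot product, which is how these embeddings are usually compared.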
spacemanidol changed discussion status to closed
Yep, works now with the added parameter. Thanks!