Can't load the model (#2)
by Ming369 - opened
In the config.json file there is:

    "architectures": [
        "LlamaEncoderModel"
    ],
    "auto_map": {
        "AutoModel": "McGill-NLP/LLM2Vec-Meta-Llama-31-8B-Instruct-mntp--modeling_llama_encoder.LlamaEncoderModel"
    }

But in modeling_llama_encoder.py the model class is named "BidirectionalLlama", not "LlamaEncoderModel". So when I tried to load the model, the following error occurred:

    AttributeError: module 'transformers_modules.McGill-NLP.LLM2Vec-Meta-Llama-31-8B-Instruct-mntp.1d49bff4203a867109580085c67e3b3cc2984a89.modeling_llama_encoder' has no attribute 'LlamaEncoderModel'
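For reference, a load call along these lines triggers the error. This is only a minimal sketch; the exact repo id and arguments are assumptions taken from the auto_map entry above:

    from transformers import AutoModel, AutoTokenizer

    repo_id = "McGill-NLP/LLM2Vec-Meta-Llama-31-8B-Instruct-mntp"

    # trust_remote_code is needed because auto_map points AutoModel at the custom
    # modeling_llama_encoder.LlamaEncoderModel class shipped with the repo.
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModel.from_pretrained(repo_id, trust_remote_code=True)

    # While the class in modeling_llama_encoder.py is named "BidirectionalLlama"
    # instead of "LlamaEncoderModel", the from_pretrained call above raises the
    # AttributeError quoted earlier.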
Thanks for bringing this to my attention. I have fixed this. Can you check now?
Thank you for fixing this. I can load the model correctly now :)
Ming369 changed discussion status to closed