Error deploying on SageMaker inference endpoint
Hi,
I'm trying to deploy the model for inference on SageMaker using this config:
from sagemaker.huggingface import HuggingFaceModel

hub = {
    'HF_MODEL_ID': 'OrdalieTech/Solon-embeddings-large-0.1',
    'HF_TASK': 'sentiment-analysis'
}

huggingface_model = HuggingFaceModel(
    transformers_version='4.26.0',
    pytorch_version='1.13.1',
    py_version='py39',
    env=hub,
    role=role,
)
It appears the transformers version might not be the right one as I get the following error on model load:
"Could not load model /.sagemaker/mms/models/OrdalieTech__Solon-embeddings-large-0.1 with any of the following classes: (\u003cclass \u0027transformers.models.auto.modeling_auto.AutoModelForSequenceClassification\u0027\u003e, \u003cclass \u0027transformers.models.xlm_roberta.modeling_xlm_roberta.XLMRobertaModel\u0027\u003e).
Do you have any suggestions on which version I should use?
Many thanks!
Thanks for your message. I'm not sure about the version, but first of all HF_TASK should be 'feature-extraction' rather than 'sentiment-analysis', since this is an embedding model.
With that task, AutoModel should load the model fine.
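For reference, here is a minimal sketch of the corrected setup; the get_execution_role() call, the instance type, and the sample input are assumptions on my side, so adjust them to your environment:

import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()

# Point HF_TASK at feature extraction so the endpoint loads the
# embedding model with AutoModel instead of a classification head.
hub = {
    'HF_MODEL_ID': 'OrdalieTech/Solon-embeddings-large-0.1',
    'HF_TASK': 'feature-extraction',
}

huggingface_model = HuggingFaceModel(
    transformers_version='4.26.0',
    pytorch_version='1.13.1',
    py_version='py39',
    env=hub,
    role=role,
)

# Deploy and run a quick smoke test (instance type is an assumption).
predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type='ml.g4dn.xlarge',
)
print(predictor.predict({'inputs': 'Bonjour, comment allez-vous ?'}))

The predict call should return the token embeddings for the input string; if it does, the endpoint is loading the model correctly.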