alime-embedding-large-zh
The alime embedding model for Chinese (zh) text.
Usage (Sentence-Transformers)
Using this model is straightforward once you have sentence-transformers installed:
pip install -U sentence-transformers
Then you can use the model like this:
from sentence_transformers import SentenceTransformer

sentences = ["西湖在哪?", "西湖风景名胜区位于浙江省杭州市"]

# Load the model from the Hugging Face Hub and encode with L2-normalized outputs.
model = SentenceTransformer('Pristinenlp/alime-embedding-large-zh')
embeddings = model.encode(sentences, normalize_embeddings=True)
print(embeddings)
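Because the embeddings are L2-normalized (normalize_embeddings=True), cosine similarity reduces to a dot product. The sketch below ranks candidate passages against a query this way; the query/passage split and the second passage are illustrative, not part of the model card.

from sentence_transformers import SentenceTransformer

model = SentenceTransformer('Pristinenlp/alime-embedding-large-zh')

# Illustrative query and candidate passages.
query = "西湖在哪?"
passages = ["西湖风景名胜区位于浙江省杭州市", "故宫位于北京市中心"]

# With normalize_embeddings=True, cosine similarity equals the dot product.
query_emb = model.encode(query, normalize_embeddings=True)
passage_embs = model.encode(passages, normalize_embeddings=True)
scores = passage_embs @ query_emb

for passage, score in zip(passages, scores):
    print(f"{score:.4f}\t{passage}")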
Evaluation results
All scores are self-reported MTEB results.

MTEB AFQMC (validation set):
- cos_sim_pearson: 49.648
- cos_sim_spearman: 54.733
- euclidean_pearson: 53.063
- euclidean_spearman: 54.733
- manhattan_pearson: 53.048
- manhattan_spearman: 54.729

MTEB ATEC (test set):
- cos_sim_pearson: 48.659
- cos_sim_spearman: 55.125
- euclidean_pearson: 55.734
- euclidean_spearman: 55.125