shibing624 committed on
Commit 81be5b0
1 Parent(s): 53fc144

Update README.md

Files changed (1): README.md (+16 -11)
README.md CHANGED
@@ -7,11 +7,11 @@ tags:
  - sentence-similarity
  - transformers
  datasets:
- - shibing624/nli_zh
+ - https://huggingface.co/datasets/shibing624/nli-zh-all/tree/main/text2vec-base-chinese-sentence-dataset
  language:
  - zh
  metrics:
- - bleu
+ - spearmanr
  library_name: transformers
  ---
  # shibing624/text2vec-base-chinese-sentence
@@ -20,23 +20,28 @@ This is a CoSENT(Cosine Sentence) model: shibing624/text2vec-base-chinese-senten
  It maps sentences to a 768 dimensional dense vector space and can be used for tasks
  like sentence embeddings, text matching or semantic search.

- - using all 5 tasks' datasets, dataset: https://huggingface.co/datasets/shibing624/nli_zh
+ - training dataset: https://huggingface.co/datasets/shibing624/nli-zh-all/tree/main/text2vec-base-chinese-sentence-dataset
  - base model: nghuyong/ernie-3.0-base-zh
- - max_seq_length = 256
+ - max_seq_length: 256
  - best epoch: 3
+ - sentence embedding dim: 768

  ## Evaluation
  For an automated evaluation of this model, see the *Evaluation Benchmark*: [text2vec](https://github.com/shibing624/text2vec)

+ ### Release Models
  - Chinese text-matching evaluation results of the models released by this project:

- | Arch | BaseModel | Model | ATEC | BQ | LCQMC | PAWSX | STS-B | Avg | QPS |
- | :-- |:-----------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------|:-----:|:-----:|:-----:|:-----:|:-----:|:---------:|:-----:|
- | Word2Vec | word2vec | [w2v-light-tencent-chinese](https://ai.tencent.com/ailab/nlp/en/download.html) | 20.00 | 31.49 | 59.46 | 2.57 | 55.78 | 33.86 | 23769 |
- | SBERT | xlm-roberta-base | [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) | 18.42 | 38.52 | 63.96 | 10.14 | 78.90 | 41.99 | 3138 |
- | CoSENT | hfl/chinese-macbert-base | [shibing624/text2vec-base-chinese](https://huggingface.co/shibing624/text2vec-base-chinese) | 31.93 | 42.67 | 70.16 | 17.21 | 79.30 | 48.25 | 3008 |
- | CoSENT | hfl/chinese-lert-large | [GanymedeNil/text2vec-large-chinese](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 32.61 | 44.59 | 69.30 | 14.51 | 79.44 | 48.08 | 2092 |
- | CoSENT | nghuyong/ernie-3.0-base-zh | [shibing624/text2vec-base-chinese-sentence](https://huggingface.co/shibing624/text2vec-base-chinese-sentence) | 51.26 | 68.72 | 79.13 | 34.28 | 80.70 | **62.81** | 3066 |
+ | Arch | BaseModel | Model | ATEC | BQ | LCQMC | PAWSX | STS-B | SOHU-dd | SOHU-dc | Avg | QPS |
+ |:-----------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------|:-----:|:-----:|:-----:|:-----:|:-----:|:-------:|:-------:|:---------:|:-----:|
+ | Word2Vec | word2vec | [w2v-light-tencent-chinese](https://ai.tencent.com/ailab/nlp/en/download.html) | 20.00 | 31.49 | 59.46 | 2.57 | 55.78 | 55.04 | 20.70 | 35.03 | 23769 |
+ | SBERT | xlm-roberta-base | [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) | 18.42 | 38.52 | 63.96 | 10.14 | 78.90 | 63.01 | 52.28 | 46.46 | 3138 |
+ | Instructor | hfl/chinese-roberta-wwm-ext | [moka-ai/m3e-base](https://huggingface.co/moka-ai/m3e-base) | 41.27 | 63.81 | 74.87 | 12.20 | 76.96 | 75.83 | 60.55 | 57.93 | 2980 |
+ | CoSENT | hfl/chinese-macbert-base | [shibing624/text2vec-base-chinese](https://huggingface.co/shibing624/text2vec-base-chinese) | 31.93 | 42.67 | 70.16 | 17.21 | 79.30 | 70.27 | 50.42 | 51.61 | 3008 |
+ | CoSENT | hfl/chinese-lert-large | [GanymedeNil/text2vec-large-chinese](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 32.61 | 44.59 | 69.30 | 14.51 | 79.44 | 73.01 | 59.04 | 53.12 | 2092 |
+ | CoSENT | nghuyong/ernie-3.0-base-zh | [shibing624/text2vec-base-chinese-sentence](https://huggingface.co/shibing624/text2vec-base-chinese-sentence) | 51.26 | 68.72 | 79.13 | 34.28 | 80.70 | 70.34 | 54.91 | 60.09 | 3066 |
+ | CoSENT | nghuyong/ernie-3.0-base-zh | [shibing624/text2vec-base-chinese-paraphrase](https://huggingface.co/shibing624/text2vec-base-chinese-paraphrase) | 44.89 | 63.58 | 74.24 | 40.90 | 78.93 | 76.70 | 63.30 | **63.08** | 3066 |
+


  ## Usage (text2vec)
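The README being updated describes a model that maps sentences to 768-dimensional vectors compared by cosine similarity, and the metadata change swaps the metric from `bleu` to `spearmanr`, the standard score for the STS benchmarks in the table. A minimal sketch of that scoring pipeline, using random mock vectors in place of real model output (loading the actual model would require the `text2vec` or `transformers` package and a download):

```python
import numpy as np

DIM = 768  # sentence embedding dimension stated in the README

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def spearmanr(x, y) -> float:
    """Spearman rank correlation (no tie handling) between predicted
    similarities and human labels -- the metric in the evaluation table."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(np.dot(rx, ry) / (np.linalg.norm(rx) * np.linalg.norm(ry)))

rng = np.random.default_rng(0)
# Mock embeddings standing in for model-encoded sentences.
emb = rng.normal(size=(4, DIM))
emb[1] = emb[0] + 0.05 * rng.normal(size=DIM)  # near-duplicate of sentence 0

pairs = [(0, 1), (0, 2), (0, 3)]
scores = [cosine_sim(emb[i], emb[j]) for i, j in pairs]
gold = [5.0, 1.0, 0.0]  # hypothetical human similarity labels

print(scores[0] > max(scores[1], scores[2]))  # near-duplicate pair scores highest
print(spearmanr(scores, gold))
```

With a real model, `emb` would come from encoding the sentences; the Spearman correlation between model scores and gold labels over a full dataset is what the ATEC/BQ/LCQMC/PAWSX/STS-B columns report.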