bwang0911 committed
Commit 770dc05 · 1 parent: 5443de4

Update README.md

Files changed (1): README.md +1699 -0
README.md CHANGED
@@ -2,12 +2,1711 @@
2
  pipeline_tag: sentence-similarity
3
  tags:
4
  - finetuner
5
+ - mteb
6
  - feature-extraction
7
  - sentence-similarity
8
  datasets:
9
  - jinaai/negation-dataset
10
  language: en
11
  license: apache-2.0
12
+ model-index:
13
+ - name: jina-embedding-l-en-v1
14
+ results:
15
+ - task:
16
+ type: Classification
17
+ dataset:
18
+ type: mteb/amazon_counterfactual
19
+ name: MTEB AmazonCounterfactualClassification (en)
20
+ config: en
21
+ split: test
22
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
23
+ metrics:
24
+ - type: accuracy
25
+ value: 61.64179104477612
26
+ - type: ap
27
+ value: 24.63675721041911
28
+ - type: f1
29
+ value: 55.10036810049116
30
+ - task:
31
+ type: Classification
32
+ dataset:
33
+ type: mteb/amazon_polarity
34
+ name: MTEB AmazonPolarityClassification
35
+ config: default
36
+ split: test
37
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
38
+ metrics:
39
+ - type: accuracy
40
+ value: 60.708125
41
+ - type: ap
42
+ value: 57.491681452557344
43
+ - type: f1
44
+ value: 58.046023443205655
45
+ - task:
46
+ type: Classification
47
+ dataset:
48
+ type: mteb/amazon_reviews_multi
49
+ name: MTEB AmazonReviewsClassification (en)
50
+ config: en
51
+ split: test
52
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
53
+ metrics:
54
+ - type: accuracy
55
+ value: 28.12
56
+ - type: f1
57
+ value: 26.904734434317966
58
+ - task:
59
+ type: Retrieval
60
+ dataset:
61
+ type: arguana
62
+ name: MTEB ArguAna
63
+ config: default
64
+ split: test
65
+ revision: None
66
+ metrics:
67
+ - type: map_at_1
68
+ value: 26.031
69
+ - type: map_at_10
70
+ value: 40.742
71
+ - type: map_at_100
72
+ value: 41.832
73
+ - type: map_at_1000
74
+ value: 41.844
75
+ - type: map_at_3
76
+ value: 35.526
77
+ - type: map_at_5
78
+ value: 38.567
79
+ - type: mrr_at_1
80
+ value: 26.316
81
+ - type: mrr_at_10
82
+ value: 40.855999999999995
83
+ - type: mrr_at_100
84
+ value: 41.946
85
+ - type: mrr_at_1000
86
+ value: 41.957
87
+ - type: mrr_at_3
88
+ value: 35.621
89
+ - type: mrr_at_5
90
+ value: 38.644
91
+ - type: ndcg_at_1
92
+ value: 26.031
93
+ - type: ndcg_at_10
94
+ value: 49.483
95
+ - type: ndcg_at_100
96
+ value: 54.074999999999996
97
+ - type: ndcg_at_1000
98
+ value: 54.344
99
+ - type: ndcg_at_3
100
+ value: 38.792
101
+ - type: ndcg_at_5
102
+ value: 44.24
103
+ - type: precision_at_1
104
+ value: 26.031
105
+ - type: precision_at_10
106
+ value: 7.76
107
+ - type: precision_at_100
108
+ value: 0.975
109
+ - type: precision_at_1000
110
+ value: 0.1
111
+ - type: precision_at_3
112
+ value: 16.098000000000003
113
+ - type: precision_at_5
114
+ value: 12.29
115
+ - type: recall_at_1
116
+ value: 26.031
117
+ - type: recall_at_10
118
+ value: 77.596
119
+ - type: recall_at_100
120
+ value: 97.51100000000001
121
+ - type: recall_at_1000
122
+ value: 99.57300000000001
123
+ - type: recall_at_3
124
+ value: 48.293
125
+ - type: recall_at_5
126
+ value: 61.451
127
+ - task:
128
+ type: Clustering
129
+ dataset:
130
+ type: mteb/arxiv-clustering-p2p
131
+ name: MTEB ArxivClusteringP2P
132
+ config: default
133
+ split: test
134
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
135
+ metrics:
136
+ - type: v_measure
137
+ value: 41.76036539849672
138
+ - task:
139
+ type: Clustering
140
+ dataset:
141
+ type: mteb/arxiv-clustering-s2s
142
+ name: MTEB ArxivClusteringS2S
143
+ config: default
144
+ split: test
145
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
146
+ metrics:
147
+ - type: v_measure
148
+ value: 34.27585676831497
149
+ - task:
150
+ type: Reranking
151
+ dataset:
152
+ type: mteb/askubuntudupquestions-reranking
153
+ name: MTEB AskUbuntuDupQuestions
154
+ config: default
155
+ split: test
156
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
157
+ metrics:
158
+ - type: map
159
+ value: 63.47328704612227
160
+ - type: mrr
161
+ value: 76.63182078002022
162
+ - task:
163
+ type: STS
164
+ dataset:
165
+ type: mteb/biosses-sts
166
+ name: MTEB BIOSSES
167
+ config: default
168
+ split: test
169
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
170
+ metrics:
171
+ - type: cos_sim_pearson
172
+ value: 87.42072640664271
173
+ - type: cos_sim_spearman
174
+ value: 84.31336692039407
175
+ - type: euclidean_pearson
176
+ value: 54.93250871487246
177
+ - type: euclidean_spearman
178
+ value: 55.91091252228738
179
+ - type: manhattan_pearson
180
+ value: 54.78812442894107
181
+ - type: manhattan_spearman
182
+ value: 55.35005636930548
183
+ - task:
184
+ type: Classification
185
+ dataset:
186
+ type: mteb/banking77
187
+ name: MTEB Banking77Classification
188
+ config: default
189
+ split: test
190
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
191
+ metrics:
192
+ - type: accuracy
193
+ value: 86.28896103896103
194
+ - type: f1
195
+ value: 86.23389676482913
196
+ - task:
197
+ type: Clustering
198
+ dataset:
199
+ type: mteb/biorxiv-clustering-p2p
200
+ name: MTEB BiorxivClusteringP2P
201
+ config: default
202
+ split: test
203
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
204
+ metrics:
205
+ - type: v_measure
206
+ value: 33.73729294301578
207
+ - task:
208
+ type: Clustering
209
+ dataset:
210
+ type: mteb/biorxiv-clustering-s2s
211
+ name: MTEB BiorxivClusteringS2S
212
+ config: default
213
+ split: test
214
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
215
+ metrics:
216
+ - type: v_measure
217
+ value: 30.641078215958288
218
+ - task:
219
+ type: Retrieval
220
+ dataset:
221
+ type: climate-fever
222
+ name: MTEB ClimateFEVER
223
+ config: default
224
+ split: test
225
+ revision: None
226
+ metrics:
227
+ - type: map_at_1
228
+ value: 8.258000000000001
229
+ - type: map_at_10
230
+ value: 14.57
231
+ - type: map_at_100
232
+ value: 15.98
233
+ - type: map_at_1000
234
+ value: 16.149
235
+ - type: map_at_3
236
+ value: 11.993
237
+ - type: map_at_5
238
+ value: 13.383000000000001
239
+ - type: mrr_at_1
240
+ value: 18.176000000000002
241
+ - type: mrr_at_10
242
+ value: 28.560000000000002
243
+ - type: mrr_at_100
244
+ value: 29.656
245
+ - type: mrr_at_1000
246
+ value: 29.709999999999997
247
+ - type: mrr_at_3
248
+ value: 25.255
249
+ - type: mrr_at_5
250
+ value: 27.128000000000004
251
+ - type: ndcg_at_1
252
+ value: 18.176000000000002
253
+ - type: ndcg_at_10
254
+ value: 21.36
255
+ - type: ndcg_at_100
256
+ value: 27.619
257
+ - type: ndcg_at_1000
258
+ value: 31.086000000000002
259
+ - type: ndcg_at_3
260
+ value: 16.701
261
+ - type: ndcg_at_5
262
+ value: 18.559
263
+ - type: precision_at_1
264
+ value: 18.176000000000002
265
+ - type: precision_at_10
266
+ value: 6.683999999999999
267
+ - type: precision_at_100
268
+ value: 1.3339999999999999
269
+ - type: precision_at_1000
270
+ value: 0.197
271
+ - type: precision_at_3
272
+ value: 12.269
273
+ - type: precision_at_5
274
+ value: 9.798
275
+ - type: recall_at_1
276
+ value: 8.258000000000001
277
+ - type: recall_at_10
278
+ value: 27.060000000000002
279
+ - type: recall_at_100
280
+ value: 48.833
281
+ - type: recall_at_1000
282
+ value: 68.636
283
+ - type: recall_at_3
284
+ value: 15.895999999999999
285
+ - type: recall_at_5
286
+ value: 20.625
287
+ - task:
288
+ type: Retrieval
289
+ dataset:
290
+ type: dbpedia-entity
291
+ name: MTEB DBPedia
292
+ config: default
293
+ split: test
294
+ revision: None
295
+ metrics:
296
+ - type: map_at_1
297
+ value: 8.241
298
+ - type: map_at_10
299
+ value: 17.141000000000002
300
+ - type: map_at_100
301
+ value: 22.805
302
+ - type: map_at_1000
303
+ value: 24.189
304
+ - type: map_at_3
305
+ value: 12.940999999999999
306
+ - type: map_at_5
307
+ value: 14.607000000000001
308
+ - type: mrr_at_1
309
+ value: 62.25000000000001
310
+ - type: mrr_at_10
311
+ value: 70.537
312
+ - type: mrr_at_100
313
+ value: 70.851
314
+ - type: mrr_at_1000
315
+ value: 70.875
316
+ - type: mrr_at_3
317
+ value: 68.75
318
+ - type: mrr_at_5
319
+ value: 69.77499999999999
320
+ - type: ndcg_at_1
321
+ value: 50.125
322
+ - type: ndcg_at_10
323
+ value: 36.032
324
+ - type: ndcg_at_100
325
+ value: 39.428999999999995
326
+ - type: ndcg_at_1000
327
+ value: 47.138999999999996
328
+ - type: ndcg_at_3
329
+ value: 40.99
330
+ - type: ndcg_at_5
331
+ value: 37.772
332
+ - type: precision_at_1
333
+ value: 62.25000000000001
334
+ - type: precision_at_10
335
+ value: 28.050000000000004
336
+ - type: precision_at_100
337
+ value: 8.527999999999999
338
+ - type: precision_at_1000
339
+ value: 1.82
340
+ - type: precision_at_3
341
+ value: 45.0
342
+ - type: precision_at_5
343
+ value: 36.0
344
+ - type: recall_at_1
345
+ value: 8.241
346
+ - type: recall_at_10
347
+ value: 22.583000000000002
348
+ - type: recall_at_100
349
+ value: 44.267
350
+ - type: recall_at_1000
351
+ value: 69.497
352
+ - type: recall_at_3
353
+ value: 14.326
354
+ - type: recall_at_5
355
+ value: 17.29
356
+ - task:
357
+ type: Classification
358
+ dataset:
359
+ type: mteb/emotion
360
+ name: MTEB EmotionClassification
361
+ config: default
362
+ split: test
363
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
364
+ metrics:
365
+ - type: accuracy
366
+ value: 42.295
367
+ - type: f1
368
+ value: 38.32403088027173
369
+ - task:
370
+ type: Retrieval
371
+ dataset:
372
+ type: fever
373
+ name: MTEB FEVER
374
+ config: default
375
+ split: test
376
+ revision: None
377
+ metrics:
378
+ - type: map_at_1
379
+ value: 58.553
380
+ - type: map_at_10
381
+ value: 69.632
382
+ - type: map_at_100
383
+ value: 69.95400000000001
384
+ - type: map_at_1000
385
+ value: 69.968
386
+ - type: map_at_3
387
+ value: 67.656
388
+ - type: map_at_5
389
+ value: 68.86
390
+ - type: mrr_at_1
391
+ value: 63.156
392
+ - type: mrr_at_10
393
+ value: 74.37700000000001
394
+ - type: mrr_at_100
395
+ value: 74.629
396
+ - type: mrr_at_1000
397
+ value: 74.63300000000001
398
+ - type: mrr_at_3
399
+ value: 72.577
400
+ - type: mrr_at_5
401
+ value: 73.71
402
+ - type: ndcg_at_1
403
+ value: 63.156
404
+ - type: ndcg_at_10
405
+ value: 75.345
406
+ - type: ndcg_at_100
407
+ value: 76.728
408
+ - type: ndcg_at_1000
409
+ value: 77.006
410
+ - type: ndcg_at_3
411
+ value: 71.67099999999999
412
+ - type: ndcg_at_5
413
+ value: 73.656
414
+ - type: precision_at_1
415
+ value: 63.156
416
+ - type: precision_at_10
417
+ value: 9.673
418
+ - type: precision_at_100
419
+ value: 1.045
420
+ - type: precision_at_1000
421
+ value: 0.108
422
+ - type: precision_at_3
423
+ value: 28.393
424
+ - type: precision_at_5
425
+ value: 18.160999999999998
426
+ - type: recall_at_1
427
+ value: 58.553
428
+ - type: recall_at_10
429
+ value: 88.362
430
+ - type: recall_at_100
431
+ value: 94.401
432
+ - type: recall_at_1000
433
+ value: 96.256
434
+ - type: recall_at_3
435
+ value: 78.371
436
+ - type: recall_at_5
437
+ value: 83.32300000000001
438
+ - task:
439
+ type: Retrieval
440
+ dataset:
441
+ type: fiqa
442
+ name: MTEB FiQA2018
443
+ config: default
444
+ split: test
445
+ revision: None
446
+ metrics:
447
+ - type: map_at_1
448
+ value: 19.302
449
+ - type: map_at_10
450
+ value: 31.887
451
+ - type: map_at_100
452
+ value: 33.727000000000004
453
+ - type: map_at_1000
454
+ value: 33.914
455
+ - type: map_at_3
456
+ value: 27.254
457
+ - type: map_at_5
458
+ value: 29.904999999999998
459
+ - type: mrr_at_1
460
+ value: 39.043
461
+ - type: mrr_at_10
462
+ value: 47.858000000000004
463
+ - type: mrr_at_100
464
+ value: 48.636
465
+ - type: mrr_at_1000
466
+ value: 48.677
467
+ - type: mrr_at_3
468
+ value: 45.062000000000005
469
+ - type: mrr_at_5
470
+ value: 46.775
471
+ - type: ndcg_at_1
472
+ value: 39.043
473
+ - type: ndcg_at_10
474
+ value: 39.899
475
+ - type: ndcg_at_100
476
+ value: 46.719
477
+ - type: ndcg_at_1000
478
+ value: 49.739
479
+ - type: ndcg_at_3
480
+ value: 35.666
481
+ - type: ndcg_at_5
482
+ value: 37.232
483
+ - type: precision_at_1
484
+ value: 39.043
485
+ - type: precision_at_10
486
+ value: 11.265
487
+ - type: precision_at_100
488
+ value: 1.864
489
+ - type: precision_at_1000
490
+ value: 0.23800000000000002
491
+ - type: precision_at_3
492
+ value: 24.227999999999998
493
+ - type: precision_at_5
494
+ value: 18.148
495
+ - type: recall_at_1
496
+ value: 19.302
497
+ - type: recall_at_10
498
+ value: 47.278
499
+ - type: recall_at_100
500
+ value: 72.648
501
+ - type: recall_at_1000
502
+ value: 90.793
503
+ - type: recall_at_3
504
+ value: 31.235000000000003
505
+ - type: recall_at_5
506
+ value: 38.603
507
+ - task:
508
+ type: Retrieval
509
+ dataset:
510
+ type: hotpotqa
511
+ name: MTEB HotpotQA
512
+ config: default
513
+ split: test
514
+ revision: None
515
+ metrics:
516
+ - type: map_at_1
517
+ value: 31.398
518
+ - type: map_at_10
519
+ value: 44.635000000000005
520
+ - type: map_at_100
521
+ value: 45.513
522
+ - type: map_at_1000
523
+ value: 45.595
524
+ - type: map_at_3
525
+ value: 41.894
526
+ - type: map_at_5
527
+ value: 43.514
528
+ - type: mrr_at_1
529
+ value: 62.795
530
+ - type: mrr_at_10
531
+ value: 70.001
532
+ - type: mrr_at_100
533
+ value: 70.378
534
+ - type: mrr_at_1000
535
+ value: 70.399
536
+ - type: mrr_at_3
537
+ value: 68.542
538
+ - type: mrr_at_5
539
+ value: 69.394
540
+ - type: ndcg_at_1
541
+ value: 62.795
542
+ - type: ndcg_at_10
543
+ value: 53.635
544
+ - type: ndcg_at_100
545
+ value: 57.05
546
+ - type: ndcg_at_1000
547
+ value: 58.755
548
+ - type: ndcg_at_3
549
+ value: 49.267
550
+ - type: ndcg_at_5
551
+ value: 51.522
552
+ - type: precision_at_1
553
+ value: 62.795
554
+ - type: precision_at_10
555
+ value: 11.196
556
+ - type: precision_at_100
557
+ value: 1.389
558
+ - type: precision_at_1000
559
+ value: 0.16199999999999998
560
+ - type: precision_at_3
561
+ value: 30.804
562
+ - type: precision_at_5
563
+ value: 20.265
564
+ - type: recall_at_1
565
+ value: 31.398
566
+ - type: recall_at_10
567
+ value: 55.982
568
+ - type: recall_at_100
569
+ value: 69.453
570
+ - type: recall_at_1000
571
+ value: 80.756
572
+ - type: recall_at_3
573
+ value: 46.205
574
+ - type: recall_at_5
575
+ value: 50.662
576
+ - task:
577
+ type: Classification
578
+ dataset:
579
+ type: mteb/imdb
580
+ name: MTEB ImdbClassification
581
+ config: default
582
+ split: test
583
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
584
+ metrics:
585
+ - type: accuracy
586
+ value: 63.803200000000004
587
+ - type: ap
588
+ value: 59.04397034963468
589
+ - type: f1
590
+ value: 63.4675375611795
591
+ - task:
592
+ type: Retrieval
593
+ dataset:
594
+ type: msmarco
595
+ name: MTEB MSMARCO
596
+ config: default
597
+ split: dev
598
+ revision: None
599
+ metrics:
600
+ - type: map_at_1
601
+ value: 17.671
602
+ - type: map_at_10
603
+ value: 29.152
604
+ - type: map_at_100
605
+ value: 30.422
606
+ - type: map_at_1000
607
+ value: 30.481
608
+ - type: map_at_3
609
+ value: 25.417
610
+ - type: map_at_5
611
+ value: 27.448
612
+ - type: mrr_at_1
613
+ value: 18.195
614
+ - type: mrr_at_10
615
+ value: 29.67
616
+ - type: mrr_at_100
617
+ value: 30.891999999999996
618
+ - type: mrr_at_1000
619
+ value: 30.944
620
+ - type: mrr_at_3
621
+ value: 25.974000000000004
622
+ - type: mrr_at_5
623
+ value: 27.996
624
+ - type: ndcg_at_1
625
+ value: 18.195
626
+ - type: ndcg_at_10
627
+ value: 35.795
628
+ - type: ndcg_at_100
629
+ value: 42.117
630
+ - type: ndcg_at_1000
631
+ value: 43.585
632
+ - type: ndcg_at_3
633
+ value: 28.122000000000003
634
+ - type: ndcg_at_5
635
+ value: 31.757
636
+ - type: precision_at_1
637
+ value: 18.195
638
+ - type: precision_at_10
639
+ value: 5.89
640
+ - type: precision_at_100
641
+ value: 0.9079999999999999
642
+ - type: precision_at_1000
643
+ value: 0.10300000000000001
644
+ - type: precision_at_3
645
+ value: 12.24
646
+ - type: precision_at_5
647
+ value: 9.178
648
+ - type: recall_at_1
649
+ value: 17.671
650
+ - type: recall_at_10
651
+ value: 56.373
652
+ - type: recall_at_100
653
+ value: 86.029
654
+ - type: recall_at_1000
655
+ value: 97.246
656
+ - type: recall_at_3
657
+ value: 35.414
658
+ - type: recall_at_5
659
+ value: 44.149
660
+ - task:
661
+ type: Classification
662
+ dataset:
663
+ type: mteb/mtop_domain
664
+ name: MTEB MTOPDomainClassification (en)
665
+ config: en
666
+ split: test
667
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
668
+ metrics:
669
+ - type: accuracy
670
+ value: 90.80255357957135
671
+ - type: f1
672
+ value: 90.79256308087807
673
+ - task:
674
+ type: Classification
675
+ dataset:
676
+ type: mteb/mtop_intent
677
+ name: MTEB MTOPIntentClassification (en)
678
+ config: en
679
+ split: test
680
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
681
+ metrics:
682
+ - type: accuracy
683
+ value: 71.20611035111719
684
+ - type: f1
685
+ value: 54.075483897190836
686
+ - task:
687
+ type: Classification
688
+ dataset:
689
+ type: mteb/amazon_massive_intent
690
+ name: MTEB MassiveIntentClassification (en)
691
+ config: en
692
+ split: test
693
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
694
+ metrics:
695
+ - type: accuracy
696
+ value: 70.79354404841965
697
+ - type: f1
698
+ value: 68.53816551555609
699
+ - task:
700
+ type: Classification
701
+ dataset:
702
+ type: mteb/amazon_massive_scenario
703
+ name: MTEB MassiveScenarioClassification (en)
704
+ config: en
705
+ split: test
706
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
707
+ metrics:
708
+ - type: accuracy
709
+ value: 76.6072629455279
710
+ - type: f1
711
+ value: 77.04997715738867
712
+ - task:
713
+ type: Clustering
714
+ dataset:
715
+ type: mteb/medrxiv-clustering-p2p
716
+ name: MTEB MedrxivClusteringP2P
717
+ config: default
718
+ split: test
719
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
720
+ metrics:
721
+ - type: v_measure
722
+ value: 30.432745003633016
723
+ - task:
724
+ type: Clustering
725
+ dataset:
726
+ type: mteb/medrxiv-clustering-s2s
727
+ name: MTEB MedrxivClusteringS2S
728
+ config: default
729
+ split: test
730
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
731
+ metrics:
732
+ - type: v_measure
733
+ value: 28.95493811839366
734
+ - task:
735
+ type: Reranking
736
+ dataset:
737
+ type: mteb/mind_small
738
+ name: MTEB MindSmallReranking
739
+ config: default
740
+ split: test
741
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
742
+ metrics:
743
+ - type: map
744
+ value: 31.63516074152514
745
+ - type: mrr
746
+ value: 32.73091425241894
747
+ - task:
748
+ type: Retrieval
749
+ dataset:
750
+ type: nfcorpus
751
+ name: MTEB NFCorpus
752
+ config: default
753
+ split: test
754
+ revision: None
755
+ metrics:
756
+ - type: map_at_1
757
+ value: 5.379
758
+ - type: map_at_10
759
+ value: 12.051
760
+ - type: map_at_100
761
+ value: 15.176
762
+ - type: map_at_1000
763
+ value: 16.662
764
+ - type: map_at_3
765
+ value: 8.588
766
+ - type: map_at_5
767
+ value: 10.274
768
+ - type: mrr_at_1
769
+ value: 44.891999999999996
770
+ - type: mrr_at_10
771
+ value: 53.06999999999999
772
+ - type: mrr_at_100
773
+ value: 53.675
774
+ - type: mrr_at_1000
775
+ value: 53.717999999999996
776
+ - type: mrr_at_3
777
+ value: 50.671
778
+ - type: mrr_at_5
779
+ value: 52.25
780
+ - type: ndcg_at_1
781
+ value: 42.879
782
+ - type: ndcg_at_10
783
+ value: 33.291
784
+ - type: ndcg_at_100
785
+ value: 30.567
786
+ - type: ndcg_at_1000
787
+ value: 39.598
788
+ - type: ndcg_at_3
789
+ value: 37.713
790
+ - type: ndcg_at_5
791
+ value: 36.185
792
+ - type: precision_at_1
793
+ value: 44.891999999999996
794
+ - type: precision_at_10
795
+ value: 24.923000000000002
796
+ - type: precision_at_100
797
+ value: 8.015
798
+ - type: precision_at_1000
799
+ value: 2.083
800
+ - type: precision_at_3
801
+ value: 35.088
802
+ - type: precision_at_5
803
+ value: 31.765
804
+ - type: recall_at_1
805
+ value: 5.379
806
+ - type: recall_at_10
807
+ value: 16.346
808
+ - type: recall_at_100
809
+ value: 31.887999999999998
810
+ - type: recall_at_1000
811
+ value: 64.90599999999999
812
+ - type: recall_at_3
813
+ value: 9.543
814
+ - type: recall_at_5
815
+ value: 12.369
816
+ - task:
817
+ type: Retrieval
818
+ dataset:
819
+ type: nq
820
+ name: MTEB NQ
821
+ config: default
822
+ split: test
823
+ revision: None
824
+ metrics:
825
+ - type: map_at_1
826
+ value: 25.654
827
+ - type: map_at_10
828
+ value: 40.163
829
+ - type: map_at_100
830
+ value: 41.376000000000005
831
+ - type: map_at_1000
832
+ value: 41.411
833
+ - type: map_at_3
834
+ value: 35.677
835
+ - type: map_at_5
836
+ value: 38.238
837
+ - type: mrr_at_1
838
+ value: 29.055999999999997
839
+ - type: mrr_at_10
840
+ value: 42.571999999999996
841
+ - type: mrr_at_100
842
+ value: 43.501
843
+ - type: mrr_at_1000
844
+ value: 43.527
845
+ - type: mrr_at_3
846
+ value: 38.775
847
+ - type: mrr_at_5
848
+ value: 40.953
849
+ - type: ndcg_at_1
850
+ value: 29.026999999999997
851
+ - type: ndcg_at_10
852
+ value: 47.900999999999996
853
+ - type: ndcg_at_100
854
+ value: 52.941
855
+ - type: ndcg_at_1000
856
+ value: 53.786
857
+ - type: ndcg_at_3
858
+ value: 39.387
859
+ - type: ndcg_at_5
860
+ value: 43.65
861
+ - type: precision_at_1
862
+ value: 29.026999999999997
863
+ - type: precision_at_10
864
+ value: 8.247
865
+ - type: precision_at_100
866
+ value: 1.102
867
+ - type: precision_at_1000
868
+ value: 0.11800000000000001
869
+ - type: precision_at_3
870
+ value: 18.231
871
+ - type: precision_at_5
872
+ value: 13.378
873
+ - type: recall_at_1
874
+ value: 25.654
875
+ - type: recall_at_10
876
+ value: 69.175
877
+ - type: recall_at_100
878
+ value: 90.85600000000001
879
+ - type: recall_at_1000
880
+ value: 97.18
881
+ - type: recall_at_3
882
+ value: 47.043
883
+ - type: recall_at_5
884
+ value: 56.86600000000001
885
+ - task:
886
+ type: Retrieval
887
+ dataset:
888
+ type: quora
889
+ name: MTEB QuoraRetrieval
890
+ config: default
891
+ split: test
892
+ revision: None
893
+ metrics:
894
+ - type: map_at_1
895
+ value: 70.785
896
+ - type: map_at_10
897
+ value: 84.509
898
+ - type: map_at_100
899
+ value: 85.17
900
+ - type: map_at_1000
901
+ value: 85.187
902
+ - type: map_at_3
903
+ value: 81.628
904
+ - type: map_at_5
905
+ value: 83.422
906
+ - type: mrr_at_1
907
+ value: 81.43
908
+ - type: mrr_at_10
909
+ value: 87.506
910
+ - type: mrr_at_100
911
+ value: 87.616
912
+ - type: mrr_at_1000
913
+ value: 87.617
914
+ - type: mrr_at_3
915
+ value: 86.598
916
+ - type: mrr_at_5
917
+ value: 87.215
918
+ - type: ndcg_at_1
919
+ value: 81.44
920
+ - type: ndcg_at_10
921
+ value: 88.208
922
+ - type: ndcg_at_100
923
+ value: 89.49000000000001
924
+ - type: ndcg_at_1000
925
+ value: 89.59700000000001
926
+ - type: ndcg_at_3
927
+ value: 85.471
928
+ - type: ndcg_at_5
929
+ value: 86.955
930
+ - type: precision_at_1
931
+ value: 81.44
932
+ - type: precision_at_10
933
+ value: 13.347000000000001
934
+ - type: precision_at_100
935
+ value: 1.53
936
+ - type: precision_at_1000
937
+ value: 0.157
938
+ - type: precision_at_3
939
+ value: 37.330000000000005
940
+ - type: precision_at_5
941
+ value: 24.506
942
+ - type: recall_at_1
943
+ value: 70.785
944
+ - type: recall_at_10
945
+ value: 95.15
946
+ - type: recall_at_100
947
+ value: 99.502
948
+ - type: recall_at_1000
949
+ value: 99.993
950
+ - type: recall_at_3
951
+ value: 87.234
952
+ - type: recall_at_5
953
+ value: 91.467
954
+ - task:
955
+ type: Clustering
956
+ dataset:
957
+ type: mteb/reddit-clustering
958
+ name: MTEB RedditClustering
959
+ config: default
960
+ split: test
961
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
962
+ metrics:
963
+ - type: v_measure
964
+ value: 52.40682777853522
965
+ - task:
966
+ type: Clustering
967
+ dataset:
968
+ type: mteb/reddit-clustering-p2p
969
+ name: MTEB RedditClusteringP2P
970
+ config: default
971
+ split: test
972
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
973
+ metrics:
974
+ - type: v_measure
975
+ value: 56.61834429208595
976
+ - task:
977
+ type: Retrieval
978
+ dataset:
979
+ type: scidocs
980
+ name: MTEB SCIDOCS
981
+ config: default
982
+ split: test
983
+ revision: None
984
+ metrics:
985
+ - type: map_at_1
986
+ value: 4.918
987
+ - type: map_at_10
988
+ value: 11.562
989
+ - type: map_at_100
990
+ value: 13.636999999999999
991
+ - type: map_at_1000
992
+ value: 13.918
993
+ - type: map_at_3
994
+ value: 8.353
995
+ - type: map_at_5
996
+ value: 9.878
997
+ - type: mrr_at_1
998
+ value: 24.3
999
+ - type: mrr_at_10
1000
+ value: 33.914
1001
+ - type: mrr_at_100
1002
+ value: 35.079
1003
+ - type: mrr_at_1000
1004
+ value: 35.134
1005
+ - type: mrr_at_3
1006
+ value: 30.833
1007
+ - type: mrr_at_5
1008
+ value: 32.528
1009
+ - type: ndcg_at_1
1010
+ value: 24.3
1011
+ - type: ndcg_at_10
1012
+ value: 19.393
1013
+ - type: ndcg_at_100
1014
+ value: 27.471
1015
+ - type: ndcg_at_1000
1016
+ value: 32.543
1017
+ - type: ndcg_at_3
1018
+ value: 18.648
1019
+ - type: ndcg_at_5
1020
+ value: 16.064999999999998
1021
+ - type: precision_at_1
1022
+ value: 24.3
1023
+ - type: precision_at_10
1024
+ value: 9.92
1025
+ - type: precision_at_100
1026
+ value: 2.152
1027
+ - type: precision_at_1000
1028
+ value: 0.338
1029
+ - type: precision_at_3
1030
+ value: 17.1
1031
+ - type: precision_at_5
1032
+ value: 13.819999999999999
1033
+ - type: recall_at_1
1034
+ value: 4.918
1035
+ - type: recall_at_10
1036
+ value: 20.102
1037
+ - type: recall_at_100
1038
+ value: 43.69
1039
+ - type: recall_at_1000
1040
+ value: 68.568
1041
+ - type: recall_at_3
1042
+ value: 10.383000000000001
1043
+ - type: recall_at_5
1044
+ value: 13.977999999999998
1045
+ - task:
1046
+ type: STS
1047
+ dataset:
1048
+ type: mteb/sickr-sts
1049
+ name: MTEB SICK-R
1050
+ config: default
1051
+ split: test
1052
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1053
+ metrics:
1054
+ - type: cos_sim_pearson
1055
+ value: 86.02374279770862
1056
+ - type: cos_sim_spearman
1057
+ value: 80.3123278821752
1058
+ - type: euclidean_pearson
1059
+ value: 78.150387301923
1060
+ - type: euclidean_spearman
1061
+ value: 74.27020095240543
1062
+ - type: manhattan_pearson
1063
+ value: 78.00212720962597
1064
+ - type: manhattan_spearman
1065
+ value: 74.27996355049189
1066
+ - task:
1067
+ type: STS
1068
+ dataset:
1069
+ type: mteb/sts12-sts
1070
+ name: MTEB STS12
1071
+ config: default
1072
+ split: test
1073
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1074
+ metrics:
1075
+ - type: cos_sim_pearson
1076
+ value: 83.56832604166104
1077
+ - type: cos_sim_spearman
1078
+ value: 73.85172437109456
1079
+ - type: euclidean_pearson
1080
+ value: 70.77037821156355
1081
+ - type: euclidean_spearman
1082
+ value: 58.32603602271459
1083
+ - type: manhattan_pearson
1084
+ value: 70.6019035905572
1085
+ - type: manhattan_spearman
1086
+ value: 58.18758998109944
1087
+ - task:
1088
+ type: STS
1089
+ dataset:
1090
+ type: mteb/sts13-sts
1091
+ name: MTEB STS13
1092
+ config: default
1093
+ split: test
1094
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1095
+ metrics:
1096
+ - type: cos_sim_pearson
1097
+ value: 83.97624603590171
1098
+ - type: cos_sim_spearman
1099
+ value: 84.3654403570941
1100
+ - type: euclidean_pearson
1101
+ value: 77.37734191552401
1102
+ - type: euclidean_spearman
1103
+ value: 77.83492278107906
1104
+ - type: manhattan_pearson
1105
+ value: 77.38406845115612
1106
+ - type: manhattan_spearman
1107
+ value: 77.80429501178632
1108
+ - task:
1109
+ type: STS
1110
+ dataset:
1111
+ type: mteb/sts14-sts
1112
+ name: MTEB STS14
1113
+ config: default
1114
+ split: test
1115
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
1116
+ metrics:
1117
+ - type: cos_sim_pearson
1118
+ value: 82.5175806484823
1119
+ - type: cos_sim_spearman
1120
+ value: 77.84074419393815
1121
+ - type: euclidean_pearson
1122
+ value: 75.31514179994578
1123
+ - type: euclidean_spearman
1124
+ value: 71.06564963155697
1125
+ - type: manhattan_pearson
1126
+ value: 75.25016497298036
1127
+ - type: manhattan_spearman
1128
+ value: 71.0503867625097
1129
+ - task:
1130
+ type: STS
1131
+ dataset:
1132
+ type: mteb/sts15-sts
1133
+ name: MTEB STS15
1134
+ config: default
1135
+ split: test
1136
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
1137
+ metrics:
1138
+ - type: cos_sim_pearson
1139
+ value: 85.15312065200007
1140
+ - type: cos_sim_spearman
1141
+ value: 86.28786282283781
1142
+ - type: euclidean_pearson
1143
+ value: 69.93961446583728
1144
+ - type: euclidean_spearman
1145
+ value: 70.99565144007187
1146
+ - type: manhattan_pearson
1147
+ value: 70.06338127800244
1148
+ - type: manhattan_spearman
1149
+ value: 71.15328825585216
1150
+ - task:
1151
+ type: STS
1152
+ dataset:
1153
+ type: mteb/sts16-sts
1154
+ name: MTEB STS16
1155
+ config: default
1156
+ split: test
1157
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
1158
+ metrics:
1159
+ - type: cos_sim_pearson
1160
+ value: 80.48261723093232
1161
+ - type: cos_sim_spearman
1162
+ value: 82.13997187275378
1163
+ - type: euclidean_pearson
1164
+ value: 72.01034058956992
1165
+ - type: euclidean_spearman
1166
+ value: 72.90423890320797
1167
+ - type: manhattan_pearson
1168
+ value: 71.91819389305805
1169
+ - type: manhattan_spearman
1170
+ value: 72.804333901611
1171
+ - task:
1172
+ type: STS
1173
+ dataset:
1174
+ type: mteb/sts17-crosslingual-sts
1175
+ name: MTEB STS17 (en-en)
1176
+ config: en-en
1177
+ split: test
1178
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
1179
+ metrics:
1180
+ - type: cos_sim_pearson
1181
+ value: 89.89094326696411
1182
+ - type: cos_sim_spearman
1183
+ value: 89.5679328484923
1184
+ - type: euclidean_pearson
1185
+ value: 77.27326226557433
1186
+ - type: euclidean_spearman
1187
+ value: 75.44670270858582
1188
+ - type: manhattan_pearson
1189
+ value: 77.49623029933024
1190
+ - type: manhattan_spearman
1191
+ value: 75.6317127686177
1192
+ - task:
1193
+ type: STS
1194
+ dataset:
1195
+ type: mteb/sts22-crosslingual-sts
1196
+ name: MTEB STS22 (en)
1197
+ config: en
1198
+ split: test
1199
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
1200
+ metrics:
1201
+ - type: cos_sim_pearson
1202
+ value: 67.03259798800852
1203
+ - type: cos_sim_spearman
1204
+ value: 66.17683868865686
1205
+ - type: euclidean_pearson
1206
+ value: 49.154524473561416
1207
+ - type: euclidean_spearman
1208
+ value: 58.82796771905756
1209
+ - type: manhattan_pearson
1210
+ value: 48.97445679282608
1211
+ - type: manhattan_spearman
1212
+ value: 58.69653501728678
1213
+ - task:
1214
+ type: STS
1215
+ dataset:
1216
+ type: mteb/stsbenchmark-sts
1217
+ name: MTEB STSBenchmark
1218
+ config: default
1219
+ split: test
1220
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
1221
+ metrics:
1222
+ - type: cos_sim_pearson
1223
+ value: 84.01368632144246
1224
+ - type: cos_sim_spearman
1225
+ value: 83.64169080274549
1226
+ - type: euclidean_pearson
1227
+ value: 75.84021692605727
1228
+ - type: euclidean_spearman
1229
+ value: 74.69132304226987
1230
+ - type: manhattan_pearson
1231
+ value: 75.9627059404693
1232
+ - type: manhattan_spearman
1233
+ value: 74.83616979158057
1234
+ - task:
1235
+ type: Reranking
1236
+ dataset:
1237
+ type: mteb/scidocs-reranking
1238
+ name: MTEB SciDocsRR
1239
+ config: default
1240
+ split: test
1241
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
1242
+ metrics:
1243
+ - type: map
1244
+ value: 81.63017243645893
1245
+ - type: mrr
1246
+ value: 94.79274900843528
1247
+ - task:
1248
+ type: Retrieval
1249
+ dataset:
1250
+ type: scifact
1251
+ name: MTEB SciFact
1252
+ config: default
1253
+ split: test
1254
+ revision: None
1255
+ metrics:
1256
+ - type: map_at_1
1257
+ value: 47.094
1258
+ - type: map_at_10
1259
+ value: 56.047000000000004
1260
+ - type: map_at_100
1261
+ value: 56.701
1262
+ - type: map_at_1000
1263
+ value: 56.742000000000004
1264
+ - type: map_at_3
1265
+ value: 53.189
1266
+ - type: map_at_5
1267
+ value: 54.464
1268
+ - type: mrr_at_1
1269
+ value: 50.0
1270
+ - type: mrr_at_10
1271
+ value: 57.567
1272
+ - type: mrr_at_100
1273
+ value: 58.104
1274
+ - type: mrr_at_1000
1275
+ value: 58.142
1276
+ - type: mrr_at_3
1277
+ value: 55.222
1278
+ - type: mrr_at_5
1279
+ value: 56.355999999999995
1280
+ - type: ndcg_at_1
1281
+ value: 50.0
1282
+ - type: ndcg_at_10
1283
+ value: 60.84
1284
+ - type: ndcg_at_100
1285
+ value: 63.983999999999995
1286
+ - type: ndcg_at_1000
1287
+ value: 65.19500000000001
1288
+ - type: ndcg_at_3
1289
+ value: 55.491
1290
+ - type: ndcg_at_5
1291
+ value: 57.51500000000001
1292
+ - type: precision_at_1
1293
+ value: 50.0
1294
+ - type: precision_at_10
1295
+ value: 8.366999999999999
1296
+ - type: precision_at_100
1297
+ value: 1.013
1298
+ - type: precision_at_1000
1299
+ value: 0.11199999999999999
1300
+ - type: precision_at_3
1301
+ value: 21.556
1302
+ - type: precision_at_5
1303
+ value: 14.2
1304
+ - type: recall_at_1
1305
+ value: 47.094
1306
+ - type: recall_at_10
1307
+ value: 74.239
1308
+ - type: recall_at_100
1309
+ value: 89.0
1310
+ - type: recall_at_1000
1311
+ value: 98.667
1312
+ - type: recall_at_3
1313
+ value: 59.606
1314
+ - type: recall_at_5
1315
+ value: 64.756
1316
+ - task:
1317
+ type: PairClassification
1318
+ dataset:
1319
+ type: mteb/sprintduplicatequestions-pairclassification
1320
+ name: MTEB SprintDuplicateQuestions
1321
+ config: default
1322
+ split: test
1323
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
1324
+ metrics:
1325
+ - type: cos_sim_accuracy
1326
+ value: 99.7128712871287
1327
+ - type: cos_sim_ap
1328
+ value: 91.8391173412632
1329
+ - type: cos_sim_f1
1330
+ value: 85.23421588594704
1331
+ - type: cos_sim_precision
1332
+ value: 86.82572614107885
1333
+ - type: cos_sim_recall
1334
+ value: 83.7
1335
+ - type: dot_accuracy
1336
+ value: 99.23960396039604
1337
+ - type: dot_ap
1338
+ value: 58.07268940033783
1339
+ - type: dot_f1
1340
+ value: 58.00486618004865
1341
+ - type: dot_precision
1342
+ value: 56.49289099526066
1343
+ - type: dot_recall
1344
+ value: 59.599999999999994
1345
+ - type: euclidean_accuracy
1346
+ value: 99.62574257425743
1347
+ - type: euclidean_ap
1348
+ value: 86.31145319031712
1349
+ - type: euclidean_f1
1350
+ value: 80.12486992715921
1351
+ - type: euclidean_precision
1352
+ value: 83.51409978308027
1353
+ - type: euclidean_recall
1354
+ value: 77.0
1355
+ - type: manhattan_accuracy
1356
+ value: 99.62178217821783
1357
+ - type: manhattan_ap
1358
+ value: 85.96697606381338
1359
+ - type: manhattan_f1
1360
+ value: 80.24193548387099
1361
+ - type: manhattan_precision
1362
+ value: 80.89430894308943
1363
+ - type: manhattan_recall
1364
+ value: 79.60000000000001
1365
+ - type: max_accuracy
1366
+ value: 99.7128712871287
1367
+ - type: max_ap
1368
+ value: 91.8391173412632
1369
+ - type: max_f1
1370
+ value: 85.23421588594704
1371
+ - task:
1372
+ type: Clustering
1373
+ dataset:
1374
+ type: mteb/stackexchange-clustering
1375
+ name: MTEB StackExchangeClustering
1376
+ config: default
1377
+ split: test
1378
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
1379
+ metrics:
1380
+ - type: v_measure
1381
+ value: 54.98955943181893
1382
+ - task:
1383
+ type: Clustering
1384
+ dataset:
1385
+ type: mteb/stackexchange-clustering-p2p
1386
+ name: MTEB StackExchangeClusteringP2P
1387
+ config: default
1388
+ split: test
1389
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
1390
+ metrics:
1391
+ - type: v_measure
1392
+ value: 32.72837687387049
1393
+ - task:
1394
+ type: Reranking
1395
+ dataset:
1396
+ type: mteb/stackoverflowdupquestions-reranking
1397
+ name: MTEB StackOverflowDupQuestions
1398
+ config: default
1399
+ split: test
1400
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
1401
+ metrics:
1402
+ - type: map
1403
+ value: 51.02207528482775
1404
+ - type: mrr
1405
+ value: 51.8842044393515
1406
+ - task:
1407
+ type: Summarization
1408
+ dataset:
1409
+ type: mteb/summeval
1410
+ name: MTEB SummEval
1411
+ config: default
1412
+ split: test
1413
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
1414
+ metrics:
1415
+ - type: cos_sim_pearson
1416
+ value: 30.250596893094876
1417
+ - type: cos_sim_spearman
1418
+ value: 30.609457706010158
1419
+ - type: dot_pearson
1420
+ value: 19.739579843052162
1421
+ - type: dot_spearman
1422
+ value: 20.27834051930579
1423
+ - task:
1424
+ type: Retrieval
1425
+ dataset:
1426
+ type: trec-covid
1427
+ name: MTEB TRECCOVID
1428
+ config: default
1429
+ split: test
1430
+ revision: None
1431
+ metrics:
1432
+ - type: map_at_1
1433
+ value: 0.187
1434
+ - type: map_at_10
1435
+ value: 1.239
1436
+ - type: map_at_100
1437
+ value: 6.388000000000001
1438
+ - type: map_at_1000
1439
+ value: 15.507000000000001
1440
+ - type: map_at_3
1441
+ value: 0.5
1442
+ - type: map_at_5
1443
+ value: 0.712
1444
+ - type: mrr_at_1
1445
+ value: 70.0
1446
+ - type: mrr_at_10
1447
+ value: 83.0
1448
+ - type: mrr_at_100
1449
+ value: 83.0
1450
+ - type: mrr_at_1000
1451
+ value: 83.0
1452
+ - type: mrr_at_3
1453
+ value: 81.667
1454
+ - type: mrr_at_5
1455
+ value: 82.667
1456
+ - type: ndcg_at_1
1457
+ value: 65.0
1458
+ - type: ndcg_at_10
1459
+ value: 56.57600000000001
1460
+ - type: ndcg_at_100
1461
+ value: 42.054
1462
+ - type: ndcg_at_1000
1463
+ value: 38.269999999999996
1464
+ - type: ndcg_at_3
1465
+ value: 63.134
1466
+ - type: ndcg_at_5
1467
+ value: 58.792
1468
+ - type: precision_at_1
1469
+ value: 70.0
1470
+ - type: precision_at_10
1471
+ value: 59.8
1472
+ - type: precision_at_100
1473
+ value: 42.5
1474
+ - type: precision_at_1000
1475
+ value: 17.304
1476
+ - type: precision_at_3
1477
+ value: 67.333
1478
+ - type: precision_at_5
1479
+ value: 62.4
1480
+ - type: recall_at_1
1481
+ value: 0.187
1482
+ - type: recall_at_10
1483
+ value: 1.529
1484
+ - type: recall_at_100
1485
+ value: 9.673
1486
+ - type: recall_at_1000
1487
+ value: 35.807
1488
+ - type: recall_at_3
1489
+ value: 0.5459999999999999
1490
+ - type: recall_at_5
1491
+ value: 0.8130000000000001
1492
+ - task:
1493
+ type: Retrieval
1494
+ dataset:
1495
+ type: webis-touche2020
1496
+ name: MTEB Touche2020
1497
+ config: default
1498
+ split: test
1499
+ revision: None
1500
+ metrics:
1501
+ - type: map_at_1
1502
+ value: 1.646
1503
+ - type: map_at_10
1504
+ value: 6.569999999999999
1505
+ - type: map_at_100
1506
+ value: 11.530999999999999
1507
+ - type: map_at_1000
1508
+ value: 13.009
1509
+ - type: map_at_3
1510
+ value: 3.234
1511
+ - type: map_at_5
1512
+ value: 4.956
1513
+ - type: mrr_at_1
1514
+ value: 18.367
1515
+ - type: mrr_at_10
1516
+ value: 35.121
1517
+ - type: mrr_at_100
1518
+ value: 36.142
1519
+ - type: mrr_at_1000
1520
+ value: 36.153
1521
+ - type: mrr_at_3
1522
+ value: 29.252
1523
+ - type: mrr_at_5
1524
+ value: 33.434999999999995
1525
+ - type: ndcg_at_1
1526
+ value: 16.326999999999998
1527
+ - type: ndcg_at_10
1528
+ value: 17.336
1529
+ - type: ndcg_at_100
1530
+ value: 28.925
1531
+ - type: ndcg_at_1000
1532
+ value: 41.346
1533
+ - type: ndcg_at_3
1534
+ value: 16.131999999999998
1535
+ - type: ndcg_at_5
1536
+ value: 18.107
1537
+ - type: precision_at_1
1538
+ value: 18.367
1539
+ - type: precision_at_10
1540
+ value: 16.531000000000002
1541
+ - type: precision_at_100
1542
+ value: 6.449000000000001
1543
+ - type: precision_at_1000
1544
+ value: 1.451
1545
+ - type: precision_at_3
1546
+ value: 17.687
1547
+ - type: precision_at_5
1548
+ value: 20.0
1549
+ - type: recall_at_1
1550
+ value: 1.646
1551
+ - type: recall_at_10
1552
+ value: 12.113
1553
+ - type: recall_at_100
1554
+ value: 40.261
1555
+ - type: recall_at_1000
1556
+ value: 77.878
1557
+ - type: recall_at_3
1558
+ value: 4.181
1559
+ - type: recall_at_5
1560
+ value: 7.744
1561
+ - task:
1562
+ type: Classification
1563
+ dataset:
1564
+ type: mteb/toxic_conversations_50k
1565
+ name: MTEB ToxicConversationsClassification
1566
+ config: default
1567
+ split: test
1568
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
1569
+ metrics:
1570
+ - type: accuracy
1571
+ value: 66.61500000000001
1572
+ - type: ap
1573
+ value: 11.70707762285034
1574
+ - type: f1
1575
+ value: 50.53259935502312
1576
+ - task:
1577
+ type: Classification
1578
+ dataset:
1579
+ type: mteb/tweet_sentiment_extraction
1580
+ name: MTEB TweetSentimentExtractionClassification
1581
+ config: default
1582
+ split: test
1583
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
1584
+ metrics:
1585
+ - type: accuracy
1586
+ value: 54.89247311827958
1587
+ - type: f1
1588
+ value: 55.044186334629586
1589
+ - task:
1590
+ type: Clustering
1591
+ dataset:
1592
+ type: mteb/twentynewsgroups-clustering
1593
+ name: MTEB TwentyNewsgroupsClustering
1594
+ config: default
1595
+ split: test
1596
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
1597
+ metrics:
1598
+ - type: v_measure
1599
+ value: 46.95851882042766
1600
+ - task:
1601
+ type: PairClassification
1602
+ dataset:
1603
+ type: mteb/twittersemeval2015-pairclassification
1604
+ name: MTEB TwitterSemEval2015
1605
+ config: default
1606
+ split: test
1607
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
1608
+ metrics:
1609
+ - type: cos_sim_accuracy
1610
+ value: 84.01978899684092
1611
+ - type: cos_sim_ap
1612
+ value: 68.10404793439619
1613
+ - type: cos_sim_f1
1614
+ value: 63.93145891154821
1615
+ - type: cos_sim_precision
1616
+ value: 58.905937291527685
1617
+ - type: cos_sim_recall
1618
+ value: 69.89445910290237
1619
+ - type: dot_accuracy
1620
+ value: 77.78506288370984
1621
+ - type: dot_ap
1622
+ value: 38.55636213255057
1623
+ - type: dot_f1
1624
+ value: 44.6866485013624
1625
+ - type: dot_precision
1626
+ value: 34.07202216066482
1627
+ - type: dot_recall
1628
+ value: 64.90765171503958
1629
+ - type: euclidean_accuracy
1630
+ value: 82.94093103653812
1631
+ - type: euclidean_ap
1632
+ value: 63.65596102723866
1633
+ - type: euclidean_f1
1634
+ value: 61.444903916322055
1635
+ - type: euclidean_precision
1636
+ value: 56.994584837545126
1637
+ - type: euclidean_recall
1638
+ value: 66.64907651715039
1639
+ - type: manhattan_accuracy
1640
+ value: 82.99457590749239
1641
+ - type: manhattan_ap
1642
+ value: 63.77653539498376
1643
+ - type: manhattan_f1
1644
+ value: 61.48299483235189
1645
+ - type: manhattan_precision
1646
+ value: 56.455528580887226
1647
+ - type: manhattan_recall
1648
+ value: 67.4934036939314
1649
+ - type: max_accuracy
1650
+ value: 84.01978899684092
1651
+ - type: max_ap
1652
+ value: 68.10404793439619
1653
+ - type: max_f1
1654
+ value: 63.93145891154821
1655
+ - task:
1656
+ type: PairClassification
1657
+ dataset:
1658
+ type: mteb/twitterurlcorpus-pairclassification
1659
+ name: MTEB TwitterURLCorpus
1660
+ config: default
1661
+ split: test
1662
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
1663
+ metrics:
1664
+ - type: cos_sim_accuracy
1665
+ value: 87.75177552683665
1666
+ - type: cos_sim_ap
1667
+ value: 83.75899853399007
1668
+ - type: cos_sim_f1
1669
+ value: 76.25022931572188
1670
+ - type: cos_sim_precision
1671
+ value: 72.83241045769958
1672
+ - type: cos_sim_recall
1673
+ value: 80.00461964890668
1674
+ - type: dot_accuracy
1675
+ value: 81.8197694725812
1676
+ - type: dot_ap
1677
+ value: 67.6851675345571
1678
+ - type: dot_f1
1679
+ value: 64.04501820589209
1680
+ - type: dot_precision
1681
+ value: 56.17233770758332
1682
+ - type: dot_recall
1683
+ value: 74.48413920542039
1684
+ - type: euclidean_accuracy
1685
+ value: 83.3003454030349
1686
+ - type: euclidean_ap
1687
+ value: 72.80186670461116
1688
+ - type: euclidean_f1
1689
+ value: 65.38000218078727
1690
+ - type: euclidean_precision
1691
+ value: 61.92082616179002
1692
+ - type: euclidean_recall
1693
+ value: 69.24853711117956
1694
+ - type: manhattan_accuracy
1695
+ value: 83.32169053440447
1696
+ - type: manhattan_ap
1697
+ value: 72.8243559753097
1698
+ - type: manhattan_f1
1699
+ value: 65.45939901157966
1700
+ - type: manhattan_precision
1701
+ value: 61.58284124075205
1702
+ - type: manhattan_recall
1703
+ value: 69.85679088389283
1704
+ - type: max_accuracy
1705
+ value: 87.75177552683665
1706
+ - type: max_ap
1707
+ value: 83.75899853399007
1708
+ - type: max_f1
1709
+ value: 76.25022931572188
1710
  ---
1711
 
1712
  <br><br>
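
The model-index block added above follows the metadata schema read by the MTEB leaderboard: one entry per task, carrying the dataset name, revision, split, and metric values. As a minimal sketch of how such numbers are typically produced (assuming the checkpoint loads through `sentence-transformers` and using the `mteb` evaluation harness; the task selection and output path are illustrative and not part of this commit):

```python
# Minimal sketch (not part of this commit): evaluate the model on two of the
# MTEB tasks that appear in the model-index above. Task choice and the
# output folder are illustrative assumptions.
from mteb import MTEB
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

model_name = "jinaai/jina-embedding-l-en-v1"
model = SentenceTransformer(model_name)  # assumes a sentence-transformers-compatible checkpoint

# Both task names occur in the front matter above:
# "MTEB Banking77Classification" and "MTEB STSBenchmark".
evaluation = MTEB(tasks=["Banking77Classification", "STSBenchmark"])

# Writes one JSON result file per task; the metric values inside are what the
# YAML `metrics` lists record (accuracy, f1, cos_sim_spearman, ...).
results = evaluation.run(model, output_folder=f"results/{model_name}")
print(results)

# The pipeline_tag above is sentence-similarity; the same model object can be
# used directly for that.
emb = model.encode(["How is the weather today?", "What is the current weather like today?"])
print(cos_sim(emb[0], emb[1]))
```

The per-task JSON output then supplies the values that are copied into each task entry of the front matter shown in this diff.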