shrijayan committed in commit 6f8f57f (1 parent: b5c247c)

Upload folder using huggingface_hub
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
README.md ADDED
@@ -0,0 +1,594 @@
---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:378558
- loss:MultipleNegativesRankingLoss
base_model: intfloat/e5-base-v2
widget:
- source_sentence: Is intraoperative ketorolac an effective substitute for fentanyl
    in children undergoing outpatient adenotonsillectomy?
  sentences:
  - Ketorolac showed no advantage over fentanyl in reducing the incidence of PONV
    in children undergoing ADLAT.
  - The patients with IgAN and their first relatives showed significant higher Gal
    deficient IgA1 level than healthy controls, whereas patients spouses were the
    same as healthy controls. It can be suggested that the Gal deficient IgA1 might
    be inherited in Chinese patients with IgAN.
  - Our results indicated that triptolide enhanced and enriched the stemness in the
    PDAC cell lines at a low dose of 12.5 nM, but also resulted in the regression
    of tumors derived from these cells.
- source_sentence: Is task specific fall prevention training effective for warfighters
    with transtibial amputations?
  sentences:
  - These results indicate that task specific fall prevention training is an effective
    rehabilitation method to reduce falls in persons with lower extremity transtibial
    amputations.
  - Don t press on the eye. For pain, give acetaminophen Tylenol . Don t give aspirin
    or ibuprofen Advil, Motrin , because they can increase bleeding.
  - Dermatophytes Trichophyton skin ,hair, ,nail Tri all Three Microsporum skin, hair
    My head on head we have skin and hair Epidermophyton skin, nails
- source_sentence: Left horn of sinus venosus forms
  sentences:
  - Ki 67 expression is predictive of prognosis, and our prognostic model may become
    a useful tool for predicting prognosis in patients with stage I II extranodal
    NK T cell lymphoma, nasal type.
  - Evidence described here suggests that IFN λ is a good candidate inhibitor of viral
    replication in dengue infection. Mechanisms for the cellular and organismal interplay
    between DENV and IFN λ need to be further studied as they could provide insights
    into strategies to treat this disease. Furthermore, we report a novel epithelial
    model to study dengue infection in vitro.
  - Ans. A Coronary sinusRef Netter s Atlas of Human Embryology 2012 ed. pg. 96Heart
    tube embryonic derivativesembryonic structureGives rise to Proximal 1 3rd of bulbus
    cordisPrimitive trabeculated left ventricle Middle 1 3rd of bulbus cordisRight
    and left ventricular outflow tract Distal 1 3rd of bulbus cordis truncus arteriosus
    Ascending aorta and pulmonary trunk Left horn of sinus venosusCoronary sinus Right
    horn of sinus venosusSmooth part of right atrium Right common cardinal nerve and
    right anterior cardinal nerveSVC superior vena cava
- source_sentence: Is implementation of national diabetes retinal screening programme
    associated with a lower proportion of patients referred to ophthalmology?
  sentences:
  - Introduction of a systematic retinal screening programme can reduce the proportion
    of patients referred to the ophthalmology clinic, and use ophthalmology services
    more efficiently.
  - A Obesity Medications for the treatment of obesity can be classified as catecholaminergic
    or serotonergic. Catecholaminergic medications include Amphetamines with high
    abuse potential The Non Amphetamine schedule IV appetite suppressants Phentermine,
    Diethyl propion Mazindol. The September 1997 withdrawal from the market of Flenfluramine
    Defenfluramine has made true serotonergic appetite medications unavailable. The
    SSRI antidepressants, E.g.., Fluoxetine Setraline, also have serotonergic activity
    but are not approved by the FDA for weight loss.
  - A i.e. Protein linked with glycosidic bond
- source_sentence: Does amyloid peptide regulate calcium homoeostasis and arrhythmogenesis
    in pulmonary vein cardiomyocytes?
  sentences:
  - Hydroxy ethyl methaacrylate is a soft, flexible, water absorbing, plastic used
    to make soft contact lenses. It is a polymer of 2 hydroxyethyl methacrylate HEMA
    , a clear liquid component. Hard contact lenses are made from polymethyl methacrylate
    PMMA and Silicon.
  - Beta carotene has become popular in part because it s an antioxidant a substance
    that may protect cells from damage. A number of studies show that people who eat
    lots of fruits and vegetables that are rich in beta carotene and other vitamins
    and minerals have a lower risk of some cancers and heart disease. However, so
    far studies have not found that beta carotene supplements have the same health
    benefits as foods.
  - Aβ 25 35 has direct electrophysiological effects on PV cardiomyocytes.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
model-index:
- name: MPNet base trained on AllNLI triplets
  results:
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: eval dataset
      type: eval-dataset
    metrics:
    - type: cosine_accuracy
      value: 0.9937447168216399
      name: Cosine Accuracy
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: test dataset
      type: test-dataset
    metrics:
    - type: cosine_accuracy
      value: 0.9964285714285714
      name: Cosine Accuracy
---

# MPNet base trained on AllNLI triplets

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/e5-base-v2](https://huggingface.co/intfloat/e5-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) <!-- at revision 1c644c92ad3ba1efdad3f1451a637716616a20e8 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
- **Language:** en
- **License:** apache-2.0

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
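The `Pooling` module above uses masked mean pooling (`pooling_mode_mean_tokens: True`), and `Normalize` rescales each vector to unit length. As a rough, dependency-light sketch of that stage (illustrative only, not the library's actual implementation):

```python
import numpy as np

def mean_pool_and_normalize(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Masked mean pooling over tokens, then L2 normalization.

    token_embeddings: (batch, seq_len, dim) transformer hidden states
    attention_mask:   (batch, seq_len) with 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=1)                   # sum over real tokens only
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # avoid division by zero
    mean = summed / counts
    return mean / np.linalg.norm(mean, axis=1, keepdims=True)        # unit-length vectors

# Toy input: batch of 1, two real tokens plus one padding token, dim 3
emb = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [9.0, 9.0, 9.0]]])
mask = np.array([[1, 1, 0]])
pooled = mean_pool_and_normalize(emb, mask)
print(pooled.shape)  # (1, 3)
```

Because of the final normalization, cosine similarity between two outputs reduces to a plain dot product.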

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'Does amyloid peptide regulate calcium homoeostasis and arrhythmogenesis in pulmonary vein cardiomyocytes?',
    'Aβ 25 35 has direct electrophysiological effects on PV cardiomyocytes.',
    'Beta carotene has become popular in part because it s an antioxidant a substance that may protect cells from damage. A number of studies show that people who eat lots of fruits and vegetables that are rich in beta carotene and other vitamins and minerals have a lower risk of some cancers and heart disease. However, so far studies have not found that beta carotene supplements have the same health benefits as foods.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
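Since the embeddings are unit-length, `model.similarity` with the default cosine function is equivalent to a matrix of dot products. A minimal sketch of ranking candidates against a query under that assumption (the vectors below are hypothetical stand-ins, not real model output):

```python
import numpy as np

def rank_by_cosine(query_vec: np.ndarray, candidate_vecs: np.ndarray):
    """Rank candidates by cosine similarity to the query.

    Assumes all vectors are already L2-normalized, so the dot
    product equals cosine similarity (as this model guarantees).
    """
    scores = candidate_vecs @ query_vec  # (n_candidates,)
    order = np.argsort(-scores)          # best match first
    return order, scores

# Hypothetical unit vectors standing in for real embeddings
query = np.array([1.0, 0.0])
candidates = np.array([
    [0.0, 1.0],                    # orthogonal -> score 0.0
    [np.sqrt(0.5), np.sqrt(0.5)],  # 45 degrees -> ~0.707
    [1.0, 0.0],                    # identical  -> score 1.0
])
order, scores = rank_by_cosine(query, candidates)
print(order)  # [2 1 0]
```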

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Triplet

* Datasets: `eval-dataset` and `test-dataset`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric              | eval-dataset | test-dataset |
|:--------------------|:-------------|:-------------|
| **cosine_accuracy** | **0.9937**   | **0.9964**   |

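The `cosine_accuracy` reported above is the fraction of (anchor, positive, negative) triplets for which the anchor is more similar to its positive than to its negative. A minimal sketch of that computation, assuming L2-normalized embeddings (not the `TripletEvaluator` source itself):

```python
import numpy as np

def triplet_cosine_accuracy(anchors: np.ndarray, positives: np.ndarray, negatives: np.ndarray) -> float:
    """Fraction of triplets where cos(anchor, positive) > cos(anchor, negative).

    All inputs are (n, dim) arrays of L2-normalized embeddings,
    so the row-wise dot product equals cosine similarity.
    """
    pos_sim = np.sum(anchors * positives, axis=1)
    neg_sim = np.sum(anchors * negatives, axis=1)
    return float(np.mean(pos_sim > neg_sim))

# Toy 2-D triplets: the first anchor prefers its positive, the second does not
a = np.array([[1.0, 0.0], [0.0, 1.0]])
p = np.array([[1.0, 0.0], [1.0, 0.0]])
n = np.array([[0.0, 1.0], [0.0, 1.0]])
print(triplet_cosine_accuracy(a, p, n))  # 0.5
```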

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 378,558 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1 | sentence2 | label |
  |:--------|:----------|:----------|:------|
  | type    | string    | string    | float |
  | details | <ul><li>min: 6 tokens</li><li>mean: 24.72 tokens</li><li>max: 147 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 88.11 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 1.0</li><li>mean: 1.0</li><li>max: 1.0</li></ul> |
* Samples:
  | sentence1 | sentence2 | label |
  |:----------|:----------|:------|
  | <code>Does tolbutamide alter glucose transport and metabolism in the embryonic mouse heart?</code> | <code>Tolbutamide stimulates glucose uptake and metabolism in the embryonic heart, as occurs in adult extra pancreatic tissues. Glut 1 and HKI, but not GRP78, are likely involved in tolbutamide induced cardiac dysmorphogenesis.</code> | <code>1.0</code> |
  | <code>Do flk1 cells derived from mouse embryonic stem cells reconstitute hematopoiesis in vivo in SCID mice?</code> | <code>The Flk1 hematopoietic cells derived from ES cells reconstitute hematopoiesis in vivo and may become an alternative donor source for bone marrow transplantation.</code> | <code>1.0</code> |
  | <code>Does systematic aging of degradable nanosuspension ameliorate vibrating mesh nebulizer performance?</code> | <code>Nebulization of purified nanosuspensions resulted in droplet diameters of 7.0 µm. However, electrolyte supplementation and storage, which led to an increase in sample conductivity 10 20 µS cm , were capable of providing smaller droplet diameters during vibrating mesh nebulization 5.0 µm . No relevant change of NP properties i.e. size, morphology, remaining mass and molecular weight of the employed polymer was observed when incubated at 22 C for two weeks.</code> | <code>1.0</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
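With `MultipleNegativesRankingLoss`, each `sentence1` in a batch is scored against every `sentence2`; a cross-entropy over the cosine similarities (scaled by 20.0) pushes each sentence toward its own pair, so all other in-batch pairs act as negatives. A simplified NumPy sketch of that objective (illustrative, not the library code):

```python
import numpy as np

def mnr_loss(emb1: np.ndarray, emb2: np.ndarray, scale: float = 20.0) -> float:
    """In-batch negatives ranking loss over scaled cosine similarities.

    emb1, emb2: (batch, dim) L2-normalized embeddings of paired sentences;
    row i of emb1 should match row i of emb2, and every other row of emb2
    serves as a negative for it.
    """
    sims = scale * (emb1 @ emb2.T)  # (batch, batch) scaled cosine similarities
    # Cross-entropy with the diagonal (the true pairs) as the target class
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

# Perfectly aligned pairs give a much lower loss than mismatched ones
e1 = np.eye(3)
aligned = mnr_loss(e1, np.eye(3))
shuffled = mnr_loss(e1, np.roll(np.eye(3), 1, axis=0))
print(aligned < shuffled)  # True
```

The `scale` of 20.0 sharpens the softmax so near-ties between the positive and a hard negative still produce a useful gradient.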

### Evaluation Dataset

#### Unnamed Dataset

* Size: 47,320 evaluation samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1 | sentence2 | label |
  |:--------|:----------|:----------|:------|
  | type    | string    | string    | float |
  | details | <ul><li>min: 5 tokens</li><li>mean: 24.45 tokens</li><li>max: 253 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 87.68 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 1.0</li><li>mean: 1.0</li><li>max: 1.0</li></ul> |
* Samples:
  | sentence1 | sentence2 | label |
  |:----------|:----------|:------|
  | <code>Does thrombospondin 2 gene silencing in human aortic smooth muscle cells improve cell attachment?</code> | <code>siRNA mediated TSP 2 silencing of human aortic HAoSMCs improved cell attachment but had no effect on cell migration or proliferation. The effect on cell attachment was unrelated to changes in MMP activity.</code> | <code>1.0</code> |
  | <code>What can you do to manage polycythemia vera?</code> | <code>Most people with polycythemia vera take low dose aspirin. There are a lot of ways you can keep yourself comfortable and as healthy as possible Don t smoke or chew tobacco. Tobacco makes blood vessels narrow, which can make blood clots more likely. Get some light exercise, such as walking, to help your circulation and keep your heart healthy. Do leg and ankle exercises to stop clots from forming in the veins of your legs. Your doctor or a physical therapist can show you how. Bathe or shower in cool water if warm water makes you itch. Keep your skin moist with lotion, and try not to scratch.</code> | <code>1.0</code> |
  | <code>Is weekly nab paclitaxel safe and effective in 65 years old patients with metastatic breast cancer a post hoc analysis?</code> | <code>Weekly nab paclitaxel was safe and more efficacious compared with the q3w schedule and with solvent based taxanes in older patients with MBC.</code> | <code>1.0</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `do_predict`: True
- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True
- `load_best_model_at_end`: True
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: True
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
<details><summary>Click to expand</summary>

| Epoch | Step | Training Loss | Validation Loss | eval-dataset_cosine_accuracy | test-dataset_cosine_accuracy |
|:----------:|:--------:|:-------------:|:---------------:|:----------------------------:|:----------------------------:|
| 0 | 0 | - | - | 0.9813 | - |
| 0.0085 | 50 | 1.8471 | - | - | - |
| 0.0169 | 100 | 0.5244 | - | - | - |
| 0.0254 | 150 | 0.2175 | - | - | - |
| 0.0338 | 200 | 0.1392 | - | - | - |
| 0.0423 | 250 | 0.1437 | - | - | - |
| 0.0507 | 300 | 0.142 | - | - | - |
| 0.0592 | 350 | 0.1295 | - | - | - |
| 0.0676 | 400 | 0.1238 | - | - | - |
| 0.0761 | 450 | 0.14 | - | - | - |
| 0.0845 | 500 | 0.1173 | 0.1006 | 0.9931 | - |
| 0.0930 | 550 | 0.1236 | - | - | - |
| 0.1014 | 600 | 0.1127 | - | - | - |
| 0.1099 | 650 | 0.1338 | - | - | - |
| 0.1183 | 700 | 0.1071 | - | - | - |
| 0.1268 | 750 | 0.1149 | - | - | - |
| 0.1352 | 800 | 0.1072 | - | - | - |
| 0.1437 | 850 | 0.1117 | - | - | - |
| 0.1522 | 900 | 0.1087 | - | - | - |
| 0.1606 | 950 | 0.1242 | - | - | - |
| **0.1691** | **1000** | **0.1039** | **0.091** | **0.9965** | **-** |
| 0.1775 | 1050 | 0.1043 | - | - | - |
| 0.1860 | 1100 | 0.1193 | - | - | - |
| 0.1944 | 1150 | 0.1028 | - | - | - |
| 0.2029 | 1200 | 0.1027 | - | - | - |
| 0.2113 | 1250 | 0.1075 | - | - | - |
| 0.2198 | 1300 | 0.1177 | - | - | - |
| 0.2282 | 1350 | 0.0937 | - | - | - |
| 0.2367 | 1400 | 0.1095 | - | - | - |
| 0.2451 | 1450 | 0.1054 | - | - | - |
| 0.2536 | 1500 | 0.1003 | 0.0798 | 0.9958 | - |
| 0.2620 | 1550 | 0.0952 | - | - | - |
| 0.2705 | 1600 | 0.1028 | - | - | - |
| 0.2790 | 1650 | 0.0988 | - | - | - |
| 0.2874 | 1700 | 0.0887 | - | - | - |
| 0.2959 | 1750 | 0.1027 | - | - | - |
| 0.3043 | 1800 | 0.0937 | - | - | - |
| 0.3128 | 1850 | 0.1031 | - | - | - |
| 0.3212 | 1900 | 0.0857 | - | - | - |
| 0.3297 | 1950 | 0.094 | - | - | - |
| 0.3381 | 2000 | 0.1044 | 0.0721 | 0.9954 | - |
| 0.3466 | 2050 | 0.0829 | - | - | - |
| 0.3550 | 2100 | 0.0934 | - | - | - |
| 0.3635 | 2150 | 0.0785 | - | - | - |
| 0.3719 | 2200 | 0.0938 | - | - | - |
| 0.3804 | 2250 | 0.0885 | - | - | - |
| 0.3888 | 2300 | 0.0907 | - | - | - |
| 0.3973 | 2350 | 0.0911 | - | - | - |
| 0.4057 | 2400 | 0.0891 | - | - | - |
| 0.4142 | 2450 | 0.0798 | - | - | - |
| 0.4227 | 2500 | 0.0856 | 0.0655 | 0.9935 | - |
| 0.4311 | 2550 | 0.0925 | - | - | - |
| 0.4396 | 2600 | 0.0778 | - | - | - |
| 0.4480 | 2650 | 0.0871 | - | - | - |
| 0.4565 | 2700 | 0.0769 | - | - | - |
| 0.4649 | 2750 | 0.0815 | - | - | - |
| 0.4734 | 2800 | 0.0697 | - | - | - |
| 0.4818 | 2850 | 0.0714 | - | - | - |
| 0.4903 | 2900 | 0.0788 | - | - | - |
| 0.4987 | 2950 | 0.0772 | - | - | - |
| 0.5072 | 3000 | 0.0825 | 0.0618 | 0.9917 | - |
| 0.5156 | 3050 | 0.0742 | - | - | - |
| 0.5241 | 3100 | 0.0784 | - | - | - |
| 0.5325 | 3150 | 0.0697 | - | - | - |
| 0.5410 | 3200 | 0.0791 | - | - | - |
| 0.5495 | 3250 | 0.0657 | - | - | - |
| 0.5579 | 3300 | 0.0779 | - | - | - |
| 0.5664 | 3350 | 0.0719 | - | - | - |
| 0.5748 | 3400 | 0.0656 | - | - | - |
| 0.5833 | 3450 | 0.0698 | - | - | - |
| 0.5917 | 3500 | 0.0678 | 0.0578 | 0.9903 | - |
| 0.6002 | 3550 | 0.0771 | - | - | - |
| 0.6086 | 3600 | 0.0645 | - | - | - |
| 0.6171 | 3650 | 0.078 | - | - | - |
| 0.6255 | 3700 | 0.064 | - | - | - |
| 0.6340 | 3750 | 0.0691 | - | - | - |
| 0.6424 | 3800 | 0.0634 | - | - | - |
| 0.6509 | 3850 | 0.0732 | - | - | - |
| 0.6593 | 3900 | 0.059 | - | - | - |
| 0.6678 | 3950 | 0.0671 | - | - | - |
| 0.6762 | 4000 | 0.0633 | 0.0552 | 0.9936 | - |
| 0.6847 | 4050 | 0.0732 | - | - | - |
| 0.6932 | 4100 | 0.0593 | - | - | - |
| 0.7016 | 4150 | 0.0639 | - | - | - |
| 0.7101 | 4200 | 0.0672 | - | - | - |
| 0.7185 | 4250 | 0.0604 | - | - | - |
| 0.7270 | 4300 | 0.0666 | - | - | - |
| 0.7354 | 4350 | 0.0594 | - | - | - |
| 0.7439 | 4400 | 0.0783 | - | - | - |
| 0.7523 | 4450 | 0.0654 | - | - | - |
| 0.7608 | 4500 | 0.0596 | 0.0520 | 0.9937 | - |
| 0.7692 | 4550 | 0.0654 | - | - | - |
| 0.7777 | 4600 | 0.0511 | - | - | - |
| 0.7861 | 4650 | 0.0641 | - | - | - |
| 0.7946 | 4700 | 0.0609 | - | - | - |
| 0.8030 | 4750 | 0.0591 | - | - | - |
| 0.8115 | 4800 | 0.0496 | - | - | - |
| 0.8199 | 4850 | 0.0624 | - | - | - |
| 0.8284 | 4900 | 0.0639 | - | - | - |
| 0.8369 | 4950 | 0.056 | - | - | - |
| 0.8453 | 5000 | 0.0641 | 0.0487 | 0.9947 | - |
| 0.8538 | 5050 | 0.0608 | - | - | - |
| 0.8622 | 5100 | 0.0725 | - | - | - |
| 0.8707 | 5150 | 0.055 | - | - | - |
| 0.8791 | 5200 | 0.0556 | - | - | - |
| 0.8876 | 5250 | 0.0489 | - | - | - |
| 0.8960 | 5300 | 0.0513 | - | - | - |
| 0.9045 | 5350 | 0.0493 | - | - | - |
| 0.9129 | 5400 | 0.0574 | - | - | - |
| 0.9214 | 5450 | 0.0665 | - | - | - |
| 0.9298 | 5500 | 0.0588 | 0.0475 | 0.9937 | - |
| 0.9383 | 5550 | 0.0557 | - | - | - |
| 0.9467 | 5600 | 0.0497 | - | - | - |
| 0.9552 | 5650 | 0.0592 | - | - | - |
| 0.9637 | 5700 | 0.0526 | - | - | - |
| 0.9721 | 5750 | 0.0683 | - | - | - |
| 0.9806 | 5800 | 0.0588 | - | - | - |
| 0.9890 | 5850 | 0.0541 | - | - | - |
| 0.9975 | 5900 | 0.0636 | - | - | - |
| 1.0 | 5915 | - | - | - | 0.9964 |

* The bold row denotes the saved checkpoint.
</details>

### Framework Versions
- Python: 3.11.10
- Sentence Transformers: 3.3.0
- Transformers: 4.46.2
- PyTorch: 2.5.1+cu124
- Accelerate: 1.1.1
- Datasets: 3.1.0
- Tokenizers: 0.20.3

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
config.json ADDED
@@ -0,0 +1,26 @@
{
  "_name_or_path": "intfloat/e5-base-v2",
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.46.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 30522
}
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "3.3.0",
    "transformers": "4.46.2",
    "pytorch": "2.5.1+cu124"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
evaluation/mteb_results/no_model_name_available/no_revision_available/NFCorpus.json ADDED
@@ -0,0 +1,158 @@
+ {
+ "dataset_revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814",
+ "evaluation_time": 9.435755968093872,
+ "kg_co2_emissions": null,
+ "mteb_version": "1.19.4",
+ "scores": {
+ "test": [
+ {
+ "hf_subset": "default",
+ "languages": [
+ "eng-Latn"
+ ],
+ "main_score": 0.32998,
+ "map_at_1": 0.05382,
+ "map_at_10": 0.11481,
+ "map_at_100": 0.1513,
+ "map_at_1000": 0.16691,
+ "map_at_20": 0.1299,
+ "map_at_3": 0.08062,
+ "map_at_5": 0.09595,
+ "mrr_at_1": 0.42724458204334365,
+ "mrr_at_10": 0.5175156027323209,
+ "mrr_at_100": 0.5232891043612237,
+ "mrr_at_1000": 0.5237590792840204,
+ "mrr_at_20": 0.5210801677674504,
+ "mrr_at_3": 0.4932920536635707,
+ "mrr_at_5": 0.5078431372549018,
+ "nauc_map_at_1000_diff1": 0.3424621761827222,
+ "nauc_map_at_1000_max": 0.3693102088918577,
+ "nauc_map_at_1000_std": 0.23406888805862447,
+ "nauc_map_at_100_diff1": 0.3664032244991238,
+ "nauc_map_at_100_max": 0.3616164046890892,
+ "nauc_map_at_100_std": 0.19306556860178878,
+ "nauc_map_at_10_diff1": 0.4156009435131605,
+ "nauc_map_at_10_max": 0.29562996580170675,
+ "nauc_map_at_10_std": 0.06704195423653216,
+ "nauc_map_at_1_diff1": 0.5404652621843595,
+ "nauc_map_at_1_max": 0.15666145095222495,
+ "nauc_map_at_1_std": -0.06924305439448407,
+ "nauc_map_at_20_diff1": 0.39116110240291857,
+ "nauc_map_at_20_max": 0.32656228345644944,
+ "nauc_map_at_20_std": 0.11943206233983186,
+ "nauc_map_at_3_diff1": 0.4839472491919537,
+ "nauc_map_at_3_max": 0.21295648818166926,
+ "nauc_map_at_3_std": -0.030806258280614233,
+ "nauc_map_at_5_diff1": 0.4587164068060277,
+ "nauc_map_at_5_max": 0.25959085498123746,
+ "nauc_map_at_5_std": 0.002151856366699112,
+ "nauc_mrr_at_1000_diff1": 0.4143698214465788,
+ "nauc_mrr_at_1000_max": 0.512000879689416,
+ "nauc_mrr_at_1000_std": 0.3428040396879113,
+ "nauc_mrr_at_100_diff1": 0.41439869757870795,
+ "nauc_mrr_at_100_max": 0.5123272431773179,
+ "nauc_mrr_at_100_std": 0.3431494307015444,
+ "nauc_mrr_at_10_diff1": 0.41437744530026277,
+ "nauc_mrr_at_10_max": 0.5099161545430382,
+ "nauc_mrr_at_10_std": 0.33964773082836386,
+ "nauc_mrr_at_1_diff1": 0.4231839574482019,
+ "nauc_mrr_at_1_max": 0.4469665807737527,
+ "nauc_mrr_at_1_std": 0.27085917262801257,
+ "nauc_mrr_at_20_diff1": 0.4152648392708018,
+ "nauc_mrr_at_20_max": 0.5142497177973875,
+ "nauc_mrr_at_20_std": 0.34453128842595215,
+ "nauc_mrr_at_3_diff1": 0.4108552343198126,
+ "nauc_mrr_at_3_max": 0.49298778917009023,
+ "nauc_mrr_at_3_std": 0.32896950968687616,
+ "nauc_mrr_at_5_diff1": 0.41651963935747843,
+ "nauc_mrr_at_5_max": 0.5096629256087836,
+ "nauc_mrr_at_5_std": 0.3328685043125068,
+ "nauc_ndcg_at_1000_diff1": 0.3039131747834975,
+ "nauc_ndcg_at_1000_max": 0.5006861530711514,
+ "nauc_ndcg_at_1000_std": 0.4082629864300754,
+ "nauc_ndcg_at_100_diff1": 0.3127438588402751,
+ "nauc_ndcg_at_100_max": 0.45224046265925877,
+ "nauc_ndcg_at_100_std": 0.3425023156539249,
+ "nauc_ndcg_at_10_diff1": 0.28441439640368743,
+ "nauc_ndcg_at_10_max": 0.43810699899062217,
+ "nauc_ndcg_at_10_std": 0.3355576417713615,
+ "nauc_ndcg_at_1_diff1": 0.4194205464421708,
+ "nauc_ndcg_at_1_max": 0.41498596839428925,
+ "nauc_ndcg_at_1_std": 0.27125785456035756,
+ "nauc_ndcg_at_20_diff1": 0.2710726283110113,
+ "nauc_ndcg_at_20_max": 0.43015508471743263,
+ "nauc_ndcg_at_20_std": 0.34122233366577315,
+ "nauc_ndcg_at_3_diff1": 0.3285342754684657,
+ "nauc_ndcg_at_3_max": 0.42165508775193616,
+ "nauc_ndcg_at_3_std": 0.28577367095585693,
+ "nauc_ndcg_at_5_diff1": 0.311559459754687,
+ "nauc_ndcg_at_5_max": 0.4434562563215353,
+ "nauc_ndcg_at_5_std": 0.30904937508002883,
+ "nauc_precision_at_1000_diff1": -0.11106426370716928,
+ "nauc_precision_at_1000_max": 0.08177717612316253,
+ "nauc_precision_at_1000_std": 0.3025166933389465,
+ "nauc_precision_at_100_diff1": -0.07472866467160653,
+ "nauc_precision_at_100_max": 0.22620520811654854,
+ "nauc_precision_at_100_std": 0.41579299300237504,
+ "nauc_precision_at_10_diff1": 0.07728674949325742,
+ "nauc_precision_at_10_max": 0.41474780473040623,
+ "nauc_precision_at_10_std": 0.42978671580761185,
+ "nauc_precision_at_1_diff1": 0.43133998171448573,
+ "nauc_precision_at_1_max": 0.4416446045088822,
+ "nauc_precision_at_1_std": 0.2753392661341198,
+ "nauc_precision_at_20_diff1": 0.009146810147417889,
+ "nauc_precision_at_20_max": 0.35208630930337087,
+ "nauc_precision_at_20_std": 0.4439061428651268,
+ "nauc_precision_at_3_diff1": 0.23722090687165115,
+ "nauc_precision_at_3_max": 0.439996027141254,
+ "nauc_precision_at_3_std": 0.3288230491536673,
+ "nauc_precision_at_5_diff1": 0.1780802066482304,
+ "nauc_precision_at_5_max": 0.4588297124377219,
+ "nauc_precision_at_5_std": 0.3744491247477144,
+ "nauc_recall_at_1000_diff1": 0.10363785189874622,
+ "nauc_recall_at_1000_max": 0.2681126176662348,
+ "nauc_recall_at_1000_std": 0.27422038039510177,
+ "nauc_recall_at_100_diff1": 0.2639497848749735,
+ "nauc_recall_at_100_max": 0.3059527509909492,
+ "nauc_recall_at_100_std": 0.24325834981106828,
+ "nauc_recall_at_10_diff1": 0.3442826991641897,
+ "nauc_recall_at_10_max": 0.2803843621484127,
+ "nauc_recall_at_10_std": 0.08910309647424257,
+ "nauc_recall_at_1_diff1": 0.5404652621843595,
+ "nauc_recall_at_1_max": 0.15666145095222495,
+ "nauc_recall_at_1_std": -0.06924305439448407,
+ "nauc_recall_at_20_diff1": 0.2913829064591792,
+ "nauc_recall_at_20_max": 0.2899553080148023,
+ "nauc_recall_at_20_std": 0.12012278568819378,
+ "nauc_recall_at_3_diff1": 0.4520497431070753,
+ "nauc_recall_at_3_max": 0.2051619690797763,
+ "nauc_recall_at_3_std": -0.03464997708862103,
+ "nauc_recall_at_5_diff1": 0.418141869398435,
+ "nauc_recall_at_5_max": 0.26027335173262683,
+ "nauc_recall_at_5_std": 0.004168262404352809,
+ "ndcg_at_1": 0.40867,
+ "ndcg_at_10": 0.32998,
+ "ndcg_at_100": 0.3096,
+ "ndcg_at_1000": 0.39921,
+ "ndcg_at_20": 0.31274,
+ "ndcg_at_3": 0.37019,
+ "ndcg_at_5": 0.35708,
+ "precision_at_1": 0.42415,
+ "precision_at_10": 0.25418,
+ "precision_at_100": 0.0839,
+ "precision_at_1000": 0.0212,
+ "precision_at_20": 0.19381,
+ "precision_at_3": 0.35191,
+ "precision_at_5": 0.31703,
+ "recall_at_1": 0.05382,
+ "recall_at_10": 0.15584,
+ "recall_at_100": 0.32328,
+ "recall_at_1000": 0.64829,
+ "recall_at_20": 0.20038,
+ "recall_at_3": 0.08847,
+ "recall_at_5": 0.11553
+ }
+ ]
+ },
+ "task_name": "NFCorpus"
+ }
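The `main_score` in these results is `ndcg_at_10` (0.32998). For orientation, a minimal pure-Python sketch of NDCG@k over graded relevance, not the exact mteb implementation:

```python
import math

def ndcg_at_k(relevances, k):
    """NDCG@k: DCG of the ranking divided by DCG of the ideal ranking.

    `relevances` are graded relevance labels in ranked order.
    Illustrative sketch only, not the mteb/pytrec_eval code path.
    """
    def dcg(rels):
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

print(ndcg_at_k([3, 2, 1, 0], k=10))  # 1.0: ideal ordering
print(ndcg_at_k([3, 2, 0, 1], k=10))  # < 1.0: a relevant doc ranked too low
```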
evaluation/mteb_results/no_model_name_available/no_revision_available/model_meta.json ADDED
@@ -0,0 +1 @@
+ {"name": "no_model_name_available", "revision": "no_revision_available", "release_date": null, "languages": [], "n_parameters": null, "memory_usage": null, "max_tokens": null, "embed_dim": null, "license": null, "open_weights": null, "public_training_data": null, "public_training_code": null, "framework": ["Sentence Transformers"], "reference": null, "similarity_fn_name": "cosine", "use_instructions": null, "zero_shot_benchmarks": null, "loader": null}
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9d22f5ebf1e058390f059b8fb5293a85ad58d8b85527dc44da4f0416459f5a79
+ size 437951328
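This is a Git LFS pointer rather than the weights themselves; the actual file is fetched by its content hash. A small sketch parsing the three key-value lines of the pointer format (the layout follows the spec URL in the first line):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:9d22f5ebf1e058390f059b8fb5293a85ad58d8b85527dc44da4f0416459f5a79
size 437951328
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 437951328 bytes, i.e. the float32 checkpoint
```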
modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+ {
+ "idx": 0,
+ "name": "0",
+ "path": "",
+ "type": "sentence_transformers.models.Transformer"
+ },
+ {
+ "idx": 1,
+ "name": "1",
+ "path": "1_Pooling",
+ "type": "sentence_transformers.models.Pooling"
+ },
+ {
+ "idx": 2,
+ "name": "2",
+ "path": "2_Normalize",
+ "type": "sentence_transformers.models.Normalize"
+ }
+ ]
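The modules run in `idx` order: Transformer produces per-token embeddings, Pooling averages them (mean pooling, per 1_Pooling/config.json), and Normalize scales the result to unit length. A pure-Python sketch of the last two stages on made-up token embeddings (the numbers are illustrative, not real model output):

```python
import math

def mean_pool(token_embeddings, attention_mask):
    """Average the token vectors where the mask is 1 (pooling_mode_mean_tokens)."""
    dim = len(token_embeddings[0])
    totals = [0.0] * dim
    count = 0
    for vec, mask in zip(token_embeddings, attention_mask):
        if mask:
            count += 1
            for i, v in enumerate(vec):
                totals[i] += v
    return [t / count for t in totals]

def normalize(vec):
    """Scale a vector to unit L2 norm (the 2_Normalize module)."""
    norm = math.sqrt(sum(v * v for v in vec))
    return [v / norm for v in vec]

# Hypothetical 2-dim token embeddings for a 3-position input; the last
# position is padding and is masked out of the mean.
tokens = [[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]
mask = [1, 1, 0]
sentence_embedding = normalize(mean_pool(tokens, mask))
print(sentence_embedding)  # unit-length sentence vector
```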
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "max_seq_length": 512,
+ "do_lower_case": false
+ }
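`max_seq_length: 512` means token sequences beyond that length are cut off before encoding. Schematically (real tokenizers also reserve positions for special tokens such as [CLS] and [SEP]):

```python
def truncate(token_ids, max_seq_length=512):
    """Drop tokens beyond the model's maximum sequence length."""
    return token_ids[:max_seq_length]

ids = list(range(600))     # hypothetical over-long token id sequence
print(len(truncate(ids)))  # 512
```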
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
+ {
+ "cls_token": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "mask_token": {
+ "content": "[MASK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "sep_token": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,55 @@
+ {
+ "added_tokens_decoder": {
+ "0": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "101": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "102": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "103": {
+ "content": "[MASK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "clean_up_tokenization_spaces": true,
+ "cls_token": "[CLS]",
+ "do_lower_case": true,
+ "mask_token": "[MASK]",
+ "model_max_length": 512,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "strip_accents": null,
+ "tokenize_chinese_chars": true,
+ "tokenizer_class": "BertTokenizer",
+ "unk_token": "[UNK]"
+ }
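The special tokens declared above frame every input as `[CLS] … [SEP]`. A schematic sketch using a hypothetical lowercase-and-split tokenizer as a stand-in for the real WordPiece vocabulary (`do_lower_case` is `true` in this file):

```python
def encode(text, do_lower_case=True):
    """Schematic BERT-style framing: lowercase, split on whitespace,
    wrap with [CLS]/[SEP]. A stand-in for WordPiece, not the real tokenizer."""
    if do_lower_case:
        text = text.lower()
    return ["[CLS]"] + text.split() + ["[SEP]"]

print(encode("Query: How are embeddings pooled?"))
# ['[CLS]', 'query:', 'how', 'are', 'embeddings', 'pooled?', '[SEP]']
```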
vocab.txt ADDED
The diff for this file is too large to render. See raw diff