---
base_model: microsoft/deberta-v3-small
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
- cosine_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
- dot_accuracy
- dot_accuracy_threshold
- dot_f1
- dot_f1_threshold
- dot_precision
- dot_recall
- dot_ap
- manhattan_accuracy
- manhattan_accuracy_threshold
- manhattan_f1
- manhattan_f1_threshold
- manhattan_precision
- manhattan_recall
- manhattan_ap
- euclidean_accuracy
- euclidean_accuracy_threshold
- euclidean_f1
- euclidean_f1_threshold
- euclidean_precision
- euclidean_recall
- euclidean_ap
- max_accuracy
- max_accuracy_threshold
- max_f1
- max_f1_threshold
- max_precision
- max_recall
- max_ap
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:32500
- loss:GISTEmbedLoss
widget:
- source_sentence: phase changes do not change
sentences:
- >-
The major Atlantic slave trading nations, ordered by trade volume, were
the Portuguese, the British, the Spanish, the French, the Dutch, and the
Danish. Several had established outposts on the African coast where they
purchased slaves from local African leaders.
- >-
phase changes do not change mass. Particles have mass, but mass is
energy.
phase changes do not change energy
- >-
According to the U.S. Census Bureau , the county is a total area of ,
which has land and ( 0.2 % ) is water .
- source_sentence: what jobs can you get with a bachelor degree in anthropology?
sentences:
- >-
To determine the atomic weight of an element, you should add up protons
and neutrons.
- >-
['Paleontologist*', 'Archaeologist*', 'University Professor*', 'Market
Research Analyst*', 'Primatologist.', 'Forensic Scientist*', 'Medical
Anthropologist.', 'Museum Technician.']
- >-
The wingspan flies , the moth comes depending on the location from July
to August .
- source_sentence: Identify different forms of energy (e.g., light, sound, heat).
sentences:
- >-
`` Irreplaceable '' '' remained on the chart for thirty weeks , and was
certified double-platinum by the Recording Industry Association of
America ( RIAA ) , denoting sales of two million downloads , and had
sold over 3,139,000 paid digital downloads in the US as of October 2012
, according to Nielsen SoundScan . ''
- >-
On Rotten Tomatoes , the film has a rating of 63 % , based on 87 reviews
, with an average rating of 5.9/10 .
- Heat, light, and sound are all different forms of energy.
- source_sentence: what is so small it can only be seen with an electron microscope?
sentences:
- >-
Viruses are so small that they can be seen only with an electron
microscope.. Where most viruses are DNA, HIV is an RNA virus.
HIV is so small it can only be seen with an electron microscope
- >-
The development of modern lasers has opened many doors to both research
and applications. A laser beam was used to measure the distance from the
Earth to the moon. Lasers are important components of CD players. As the
image above illustrates, lasers can provide precise focusing of beams to
selectively destroy cancer cells in patients. The ability of a laser to
focus precisely is due to high-quality crystals that help give rise to
the laser beam. A variety of techniques are used to manufacture pure
crystals for use in lasers.
- >-
Discussion for (a) This value is the net work done on the package. The
person actually does more work than this, because friction opposes the
motion. Friction does negative work and removes some of the energy the
person expends and converts it to thermal energy. The net work equals
the sum of the work done by each individual force. Strategy and Concept
for (b) The forces acting on the package are gravity, the normal force,
the force of friction, and the applied force. The normal force and force
of gravity are each perpendicular to the displacement, and therefore do
no work. Solution for (b) The applied force does work.
- source_sentence: what aspects of your environment may relate to the epidemic of obesity
sentences:
- >-
Jan Kromkamp ( born August 17 , 1980 in Makkinga , Netherlands ) is a
Dutch footballer .
- >-
When chemicals in solution react, the proper way of writing the chemical
formulas of the dissolved ionic compounds is in terms of the dissociated
ions, not the complete ionic formula. A complete ionic equation is a
chemical equation in which the dissolved ionic compounds are written as
separated ions. Solubility rules are very useful in determining which
ionic compounds are dissolved and which are not. For example, when
NaCl(aq) reacts with AgNO3(aq) in a double-replacement reaction to
precipitate AgCl(s) and form NaNO3(aq), the complete ionic equation
includes NaCl, AgNO3, and NaNO3 written as separated ions:.
- >-
Genetic changes in human populations occur too slowly to be responsible
for the obesity epidemic. Nevertheless, the variation in how people
respond to the environment that promotes physical inactivity and intake
of high-calorie foods suggests that genes do play a role in the
development of obesity.
model-index:
- name: SentenceTransformer based on microsoft/deberta-v3-small
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test
type: sts-test
metrics:
- type: pearson_cosine
value: 0.5377382226514003
name: Pearson Cosine
- type: spearman_cosine
value: 0.5410237309359288
name: Spearman Cosine
- type: pearson_manhattan
value: 0.5464293120330461
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.5401021234588343
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.5469897917607747
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.5409800984560722
name: Spearman Euclidean
- type: pearson_dot
value: 0.5376496659087263
name: Pearson Dot
- type: spearman_dot
value: 0.5408811086658744
name: Spearman Dot
- type: pearson_max
value: 0.5469897917607747
name: Pearson Max
- type: spearman_max
value: 0.5410237309359288
name: Spearman Max
- task:
type: binary-classification
name: Binary Classification
dataset:
name: allNLI dev
type: allNLI-dev
metrics:
- type: cosine_accuracy
value: 0.68359375
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.9088386297225952
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.5350553505535056
name: Cosine F1
- type: cosine_f1_threshold
value: 0.8140230178833008
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.39295392953929537
name: Cosine Precision
- type: cosine_recall
value: 0.838150289017341
name: Cosine Recall
- type: cosine_ap
value: 0.48873606015680937
name: Cosine Ap
- type: dot_accuracy
value: 0.68359375
name: Dot Accuracy
- type: dot_accuracy_threshold
value: 699.0950927734375
name: Dot Accuracy Threshold
- type: dot_f1
value: 0.5350553505535056
name: Dot F1
- type: dot_f1_threshold
value: 625.3240356445312
name: Dot F1 Threshold
- type: dot_precision
value: 0.39295392953929537
name: Dot Precision
- type: dot_recall
value: 0.838150289017341
name: Dot Recall
- type: dot_ap
value: 0.48885724782911755
name: Dot Ap
- type: manhattan_accuracy
value: 0.68359375
name: Manhattan Accuracy
- type: manhattan_accuracy_threshold
value: 256.45477294921875
name: Manhattan Accuracy Threshold
- type: manhattan_f1
value: 0.5396145610278372
name: Manhattan F1
- type: manhattan_f1_threshold
value: 339.225830078125
name: Manhattan F1 Threshold
- type: manhattan_precision
value: 0.42857142857142855
name: Manhattan Precision
- type: manhattan_recall
value: 0.7283236994219653
name: Manhattan Recall
- type: manhattan_ap
value: 0.4920209563997524
name: Manhattan Ap
- type: euclidean_accuracy
value: 0.68359375
name: Euclidean Accuracy
- type: euclidean_accuracy_threshold
value: 11.834823608398438
name: Euclidean Accuracy Threshold
- type: euclidean_f1
value: 0.5350553505535056
name: Euclidean F1
- type: euclidean_f1_threshold
value: 16.90357780456543
name: Euclidean F1 Threshold
- type: euclidean_precision
value: 0.39295392953929537
name: Euclidean Precision
- type: euclidean_recall
value: 0.838150289017341
name: Euclidean Recall
- type: euclidean_ap
value: 0.4887203371983184
name: Euclidean Ap
- type: max_accuracy
value: 0.68359375
name: Max Accuracy
- type: max_accuracy_threshold
value: 699.0950927734375
name: Max Accuracy Threshold
- type: max_f1
value: 0.5396145610278372
name: Max F1
- type: max_f1_threshold
value: 625.3240356445312
name: Max F1 Threshold
- type: max_precision
value: 0.42857142857142855
name: Max Precision
- type: max_recall
value: 0.838150289017341
name: Max Recall
- type: max_ap
value: 0.4920209563997524
name: Max Ap
- task:
type: binary-classification
name: Binary Classification
dataset:
name: Qnli dev
type: Qnli-dev
metrics:
- type: cosine_accuracy
value: 0.693359375
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.8319265842437744
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.685337726523888
name: Cosine F1
- type: cosine_f1_threshold
value: 0.74552983045578
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.5606469002695418
name: Cosine Precision
- type: cosine_recall
value: 0.8813559322033898
name: Cosine Recall
- type: cosine_ap
value: 0.6873625888187367
name: Cosine Ap
- type: dot_accuracy
value: 0.693359375
name: Dot Accuracy
- type: dot_accuracy_threshold
value: 639.0776977539062
name: Dot Accuracy Threshold
- type: dot_f1
value: 0.685337726523888
name: Dot F1
- type: dot_f1_threshold
value: 572.7136840820312
name: Dot F1 Threshold
- type: dot_precision
value: 0.5606469002695418
name: Dot Precision
- type: dot_recall
value: 0.8813559322033898
name: Dot Recall
- type: dot_ap
value: 0.6878718449643791
name: Dot Ap
- type: manhattan_accuracy
value: 0.69921875
name: Manhattan Accuracy
- type: manhattan_accuracy_threshold
value: 362.1485900878906
name: Manhattan Accuracy Threshold
- type: manhattan_f1
value: 0.6857142857142857
name: Manhattan F1
- type: manhattan_f1_threshold
value: 430.38519287109375
name: Manhattan F1 Threshold
- type: manhattan_precision
value: 0.5682451253481894
name: Manhattan Precision
- type: manhattan_recall
value: 0.864406779661017
name: Manhattan Recall
- type: manhattan_ap
value: 0.6874910715870401
name: Manhattan Ap
- type: euclidean_accuracy
value: 0.693359375
name: Euclidean Accuracy
- type: euclidean_accuracy_threshold
value: 16.06937026977539
name: Euclidean Accuracy Threshold
- type: euclidean_f1
value: 0.685337726523888
name: Euclidean F1
- type: euclidean_f1_threshold
value: 19.772865295410156
name: Euclidean F1 Threshold
- type: euclidean_precision
value: 0.5606469002695418
name: Euclidean Precision
- type: euclidean_recall
value: 0.8813559322033898
name: Euclidean Recall
- type: euclidean_ap
value: 0.6873686687008952
name: Euclidean Ap
- type: max_accuracy
value: 0.69921875
name: Max Accuracy
- type: max_accuracy_threshold
value: 639.0776977539062
name: Max Accuracy Threshold
- type: max_f1
value: 0.6857142857142857
name: Max F1
- type: max_f1_threshold
value: 572.7136840820312
name: Max F1 Threshold
- type: max_precision
value: 0.5682451253481894
name: Max Precision
- type: max_recall
value: 0.8813559322033898
name: Max Recall
- type: max_ap
value: 0.6878718449643791
name: Max Ap
---

# SentenceTransformer based on microsoft/deberta-v3-small

This is a sentence-transformers model finetuned from [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details

### Model Description
- Model Type: Sentence Transformer
- Base model: microsoft/deberta-v3-small
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
### Model Sources

- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture

```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model
(1): AdvancedWeightedPooling(
(alpha_dropout_layer): Dropout(p=0.05, inplace=False)
(gate_dropout_layer): Dropout(p=0.0, inplace=False)
(linear_cls_Qpj): Linear(in_features=768, out_features=768, bias=True)
(linear_attnOut): Linear(in_features=768, out_features=768, bias=True)
(mha): MultiheadAttention(
(out_proj): NonDynamicallyQuantizableLinear(in_features=768, out_features=768, bias=True)
)
(layernorm_output): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(layernorm_weightedPooing): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(layernorm_attnOut): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
)
)
```
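As a quick sanity check, the loaded model should report the 512-token sequence limit and 768-dimensional output listed above. A minimal sketch, assuming the custom `AdvancedWeightedPooling` module ships with the checkpoint and loads through the standard `SentenceTransformer` loader:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp")

print(model.max_seq_length)          # 512
print(model.encode("a test").shape)  # (768,)
```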
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp")
# Run inference
sentences = [
'what aspects of your environment may relate to the epidemic of obesity',
'Genetic changes in human populations occur too slowly to be responsible for the obesity epidemic. Nevertheless, the variation in how people respond to the environment that promotes physical inactivity and intake of high-calorie foods suggests that genes do play a role in the development of obesity.',
'When chemicals in solution react, the proper way of writing the chemical formulas of the dissolved ionic compounds is in terms of the dissociated ions, not the complete ionic formula. A complete ionic equation is a chemical equation in which the dissolved ionic compounds are written as separated ions. Solubility rules are very useful in determining which ionic compounds are dissolved and which are not. For example, when NaCl(aq) reacts with AgNO3(aq) in a double-replacement reaction to precipitate AgCl(s) and form NaNO3(aq), the complete ionic equation includes NaCl, AgNO3, and NaNO3 written as separated ions:.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
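The same embeddings can also drive retrieval directly. A small sketch, reusing `model` and `sentences` from above, that ranks the two passages against the question with `sentence_transformers.util.semantic_search` (the ranking step is illustrative and not part of the original card):

```python
from sentence_transformers import util

# Embed the question and the two candidate passages
query_embedding = model.encode(sentences[0])
corpus_embeddings = model.encode(sentences[1:])

# Rank passages by cosine similarity; returns [[{'corpus_id': ..., 'score': ...}, ...]]
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
print(hits[0])
```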
## Evaluation

### Metrics

#### Semantic Similarity

- Dataset: `sts-test`
- Evaluated with `EmbeddingSimilarityEvaluator`
Metric | Value |
---|---|
pearson_cosine | 0.5377 |
spearman_cosine | 0.541 |
pearson_manhattan | 0.5464 |
spearman_manhattan | 0.5401 |
pearson_euclidean | 0.547 |
spearman_euclidean | 0.541 |
pearson_dot | 0.5376 |
spearman_dot | 0.5409 |
pearson_max | 0.547 |
spearman_max | 0.541 |
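
These figures come from `EmbeddingSimilarityEvaluator`, which correlates the model's similarity scores with human judgments. A minimal sketch of a comparable evaluation; the exact `sts-test` split is not specified in this card, so the `sentence-transformers/stsb` test split below is an assumption:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp")

# STS benchmark test split; similarity scores are normalized to [0, 1]
stsb = load_dataset("sentence-transformers/stsb", split="test")

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=stsb["sentence1"],
    sentences2=stsb["sentence2"],
    scores=stsb["score"],
    name="sts-test",
)
print(evaluator(model))  # {'sts-test_pearson_cosine': ..., 'sts-test_spearman_cosine': ..., ...}
```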
#### Binary Classification

- Dataset: `allNLI-dev`
- Evaluated with `BinaryClassificationEvaluator`
Metric | Value |
---|---|
cosine_accuracy | 0.6836 |
cosine_accuracy_threshold | 0.9088 |
cosine_f1 | 0.5351 |
cosine_f1_threshold | 0.814 |
cosine_precision | 0.393 |
cosine_recall | 0.8382 |
cosine_ap | 0.4887 |
dot_accuracy | 0.6836 |
dot_accuracy_threshold | 699.0951 |
dot_f1 | 0.5351 |
dot_f1_threshold | 625.324 |
dot_precision | 0.393 |
dot_recall | 0.8382 |
dot_ap | 0.4889 |
manhattan_accuracy | 0.6836 |
manhattan_accuracy_threshold | 256.4548 |
manhattan_f1 | 0.5396 |
manhattan_f1_threshold | 339.2258 |
manhattan_precision | 0.4286 |
manhattan_recall | 0.7283 |
manhattan_ap | 0.492 |
euclidean_accuracy | 0.6836 |
euclidean_accuracy_threshold | 11.8348 |
euclidean_f1 | 0.5351 |
euclidean_f1_threshold | 16.9036 |
euclidean_precision | 0.393 |
euclidean_recall | 0.8382 |
euclidean_ap | 0.4887 |
max_accuracy | 0.6836 |
max_accuracy_threshold | 699.0951 |
max_f1 | 0.5396 |
max_f1_threshold | 625.324 |
max_precision | 0.4286 |
max_recall | 0.8382 |
max_ap | 0.492 |
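
The accuracy and F1 thresholds above are selected by `BinaryClassificationEvaluator` itself while sweeping each similarity function. A sketch of the setup, reusing `model` from above; the actual allNLI dev pairs are not published in this card, so the two pairs below are placeholders:

```python
from sentence_transformers.evaluation import BinaryClassificationEvaluator

# Placeholder pairs standing in for the card's allNLI dev set (label 1 = positive pair)
evaluator = BinaryClassificationEvaluator(
    sentences1=["A man is eating food.", "A plane is taking off."],
    sentences2=["A man is eating a meal.", "A dog runs through a field."],
    labels=[1, 0],
    name="allNLI-dev",
)
print(evaluator(model))  # accuracy, F1, precision, recall, and AP per similarity function
```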
#### Binary Classification

- Dataset: `Qnli-dev`
- Evaluated with `BinaryClassificationEvaluator`
Metric | Value |
---|---|
cosine_accuracy | 0.6934 |
cosine_accuracy_threshold | 0.8319 |
cosine_f1 | 0.6853 |
cosine_f1_threshold | 0.7455 |
cosine_precision | 0.5606 |
cosine_recall | 0.8814 |
cosine_ap | 0.6874 |
dot_accuracy | 0.6934 |
dot_accuracy_threshold | 639.0777 |
dot_f1 | 0.6853 |
dot_f1_threshold | 572.7137 |
dot_precision | 0.5606 |
dot_recall | 0.8814 |
dot_ap | 0.6879 |
manhattan_accuracy | 0.6992 |
manhattan_accuracy_threshold | 362.1486 |
manhattan_f1 | 0.6857 |
manhattan_f1_threshold | 430.3852 |
manhattan_precision | 0.5682 |
manhattan_recall | 0.8644 |
manhattan_ap | 0.6875 |
euclidean_accuracy | 0.6934 |
euclidean_accuracy_threshold | 16.0694 |
euclidean_f1 | 0.6853 |
euclidean_f1_threshold | 19.7729 |
euclidean_precision | 0.5606 |
euclidean_recall | 0.8814 |
euclidean_ap | 0.6874 |
max_accuracy | 0.6992 |
max_accuracy_threshold | 639.0777 |
max_f1 | 0.6857 |
max_f1_threshold | 572.7137 |
max_precision | 0.5682 |
max_recall | 0.8814 |
max_ap | 0.6879 |
## Training Details

### Training Dataset

#### Unnamed Dataset

- Size: 32,500 training samples
- Columns: `sentence1` and `sentence2`
- Approximate statistics based on the first 1000 samples:

| | sentence1 | sentence2 |
|---|---|---|
| type | string | string |
| details | min: 4 tokens, mean: 29.39 tokens, max: 323 tokens | min: 2 tokens, mean: 54.42 tokens, max: 423 tokens |
- Samples:

| sentence1 | sentence2 |
|---|---|
| In which London road is Harrod’s department store? | Harrods, Brompton Road, London |
| e. in solids the atoms are closely locked in position and can only vibrate, in liquids the atoms and molecules are more loosely connected and can collide with and move past one another, while in gases the atoms or molecules are free to move independently, colliding frequently. | Within a substance, atoms that collide frequently and move independently of one another are most likely in a gas |
| Joe Cole was unable to join West Bromwich Albion . | On 16th October Joe Cole took a long hard look at himself realising that he would never get the opportunity to join West Bromwich Albion and joined Coventry City instead. |
- Loss: `GISTEmbedLoss` with these parameters:

  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.025}
  ```
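For reference, a minimal sketch of constructing `GISTEmbedLoss` with a guide encoder as configured above. The card only prints the guide's architecture (a BERT model with CLS pooling and normalization) without naming the checkpoint, so `BAAI/bge-base-en-v1.5` is an assumption that matches that architecture:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import GISTEmbedLoss

model = SentenceTransformer("microsoft/deberta-v3-small")  # base model, before the custom pooling head
guide = SentenceTransformer("BAAI/bge-base-en-v1.5")       # assumed guide checkpoint

# The guide scores in-batch pairs so that likely false negatives are excluded from the loss
loss = GISTEmbedLoss(model, guide, temperature=0.025)
```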
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 256
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 3.3333333333333337e-06}
- `warmup_ratio`: 0.33
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates
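A sketch of these settings expressed as `SentenceTransformerTrainingArguments`; the `output_dir` is an assumption, and the epoch count comes from the full list below:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # assumption: not stated in the card
    num_train_epochs=3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=256,
    learning_rate=5e-5,
    lr_scheduler_type="cosine_with_min_lr",
    lr_scheduler_kwargs={"num_cycles": 0.5, "min_lr": 3.3333333333333337e-06},
    warmup_ratio=0.33,
    eval_strategy="steps",
    fp16=True,
    save_safetensors=False,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```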
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 256
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 3.3333333333333337e-06}
- `warmup_ratio`: 0.33
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
### Training Logs

<details><summary>Click to expand</summary>

Epoch | Step | Training Loss | sts-test_spearman_cosine | allNLI-dev_max_ap | Qnli-dev_max_ap |
---|---|---|---|---|---|
0.0010 | 1 | 6.0688 | - | - | - |
0.0020 | 2 | 7.5576 | - | - | - |
0.0030 | 3 | 4.6849 | - | - | - |
0.0039 | 4 | 5.4503 | - | - | - |
0.0049 | 5 | 5.6057 | - | - | - |
0.0059 | 6 | 6.3049 | - | - | - |
0.0069 | 7 | 6.8336 | - | - | - |
0.0079 | 8 | 5.0777 | - | - | - |
0.0089 | 9 | 4.8358 | - | - | - |
0.0098 | 10 | 4.641 | - | - | - |
0.0108 | 11 | 4.828 | - | - | - |
0.0118 | 12 | 5.2269 | - | - | - |
0.0128 | 13 | 5.6772 | - | - | - |
0.0138 | 14 | 5.1422 | - | - | - |
0.0148 | 15 | 6.2469 | - | - | - |
0.0157 | 16 | 4.6802 | - | - | - |
0.0167 | 17 | 4.5492 | - | - | - |
0.0177 | 18 | 4.8062 | - | - | - |
0.0187 | 19 | 7.5141 | - | - | - |
0.0197 | 20 | 5.5202 | - | - | - |
0.0207 | 21 | 6.5025 | - | - | - |
0.0217 | 22 | 7.318 | - | - | - |
0.0226 | 23 | 4.6458 | - | - | - |
0.0236 | 24 | 4.6191 | - | - | - |
0.0246 | 25 | 4.3159 | - | - | - |
0.0256 | 26 | 6.3677 | - | - | - |
0.0266 | 27 | 5.6052 | - | - | - |
0.0276 | 28 | 4.196 | - | - | - |
0.0285 | 29 | 4.4802 | - | - | - |
0.0295 | 30 | 4.9193 | - | - | - |
0.0305 | 31 | 4.0996 | - | - | - |
0.0315 | 32 | 5.6307 | - | - | - |
0.0325 | 33 | 4.5745 | - | - | - |
0.0335 | 34 | 4.4514 | - | - | - |
0.0344 | 35 | 4.0617 | - | - | - |
0.0354 | 36 | 5.0298 | - | - | - |
0.0364 | 37 | 3.9815 | - | - | - |
0.0374 | 38 | 4.0871 | - | - | - |
0.0384 | 39 | 4.2378 | - | - | - |
0.0394 | 40 | 3.8226 | - | - | - |
0.0404 | 41 | 4.3519 | - | - | - |
0.0413 | 42 | 3.6345 | - | - | - |
0.0423 | 43 | 5.0829 | - | - | - |
0.0433 | 44 | 4.6701 | - | - | - |
0.0443 | 45 | 4.1371 | - | - | - |
0.0453 | 46 | 4.2418 | - | - | - |
0.0463 | 47 | 4.4766 | - | - | - |
0.0472 | 48 | 4.4797 | - | - | - |
0.0482 | 49 | 3.8471 | - | - | - |
0.0492 | 50 | 4.3194 | - | - | - |
0.0502 | 51 | 3.9426 | - | - | - |
0.0512 | 52 | 3.5333 | - | - | - |
0.0522 | 53 | 4.2426 | - | - | - |
0.0531 | 54 | 3.9816 | - | - | - |
0.0541 | 55 | 3.663 | - | - | - |
0.0551 | 56 | 3.9057 | - | - | - |
0.0561 | 57 | 4.0345 | - | - | - |
0.0571 | 58 | 3.5233 | - | - | - |
0.0581 | 59 | 3.7999 | - | - | - |
0.0591 | 60 | 3.1885 | - | - | - |
0.0600 | 61 | 3.6013 | - | - | - |
0.0610 | 62 | 3.392 | - | - | - |
0.0620 | 63 | 3.3814 | - | - | - |
0.0630 | 64 | 4.0428 | - | - | - |
0.0640 | 65 | 3.7825 | - | - | - |
0.0650 | 66 | 3.4181 | - | - | - |
0.0659 | 67 | 3.7793 | - | - | - |
0.0669 | 68 | 3.8344 | - | - | - |
0.0679 | 69 | 3.2165 | - | - | - |
0.0689 | 70 | 3.3811 | - | - | - |
0.0699 | 71 | 3.5984 | - | - | - |
0.0709 | 72 | 3.8583 | - | - | - |
0.0719 | 73 | 3.296 | - | - | - |
0.0728 | 74 | 2.7661 | - | - | - |
0.0738 | 75 | 2.9805 | - | - | - |
0.0748 | 76 | 2.566 | - | - | - |
0.0758 | 77 | 3.258 | - | - | - |
0.0768 | 78 | 3.3804 | - | - | - |
0.0778 | 79 | 2.8828 | - | - | - |
0.0787 | 80 | 3.1077 | - | - | - |
0.0797 | 81 | 2.9441 | - | - | - |
0.0807 | 82 | 2.9465 | - | - | - |
0.0817 | 83 | 2.7088 | - | - | - |
0.0827 | 84 | 2.9215 | - | - | - |
0.0837 | 85 | 3.4698 | - | - | - |
0.0846 | 86 | 2.2414 | - | - | - |
0.0856 | 87 | 3.1601 | - | - | - |
0.0866 | 88 | 2.7714 | - | - | - |
0.0876 | 89 | 3.0311 | - | - | - |
0.0886 | 90 | 3.0336 | - | - | - |
0.0896 | 91 | 1.9358 | - | - | - |
0.0906 | 92 | 2.6031 | - | - | - |
0.0915 | 93 | 2.7515 | - | - | - |
0.0925 | 94 | 2.8496 | - | - | - |
0.0935 | 95 | 1.8015 | - | - | - |
0.0945 | 96 | 2.8138 | - | - | - |
0.0955 | 97 | 2.0597 | - | - | - |
0.0965 | 98 | 2.1053 | - | - | - |
0.0974 | 99 | 2.6785 | - | - | - |
0.0984 | 100 | 2.588 | - | - | - |
0.0994 | 101 | 2.0099 | - | - | - |
0.1004 | 102 | 2.7947 | - | - | - |
0.1014 | 103 | 2.3274 | - | - | - |
0.1024 | 104 | 2.2545 | - | - | - |
0.1033 | 105 | 2.4575 | - | - | - |
0.1043 | 106 | 2.4413 | - | - | - |
0.1053 | 107 | 2.3185 | - | - | - |
0.1063 | 108 | 2.1577 | - | - | - |
0.1073 | 109 | 2.1278 | - | - | - |
0.1083 | 110 | 2.0967 | - | - | - |
0.1093 | 111 | 2.6142 | - | - | - |
0.1102 | 112 | 1.8553 | - | - | - |
0.1112 | 113 | 2.1523 | - | - | - |
0.1122 | 114 | 2.1726 | - | - | - |
0.1132 | 115 | 1.8564 | - | - | - |
0.1142 | 116 | 1.8413 | - | - | - |
0.1152 | 117 | 2.0441 | - | - | - |
0.1161 | 118 | 2.2159 | - | - | - |
0.1171 | 119 | 2.6779 | - | - | - |
0.1181 | 120 | 2.2976 | - | - | - |
0.1191 | 121 | 1.9407 | - | - | - |
0.1201 | 122 | 1.9019 | - | - | - |
0.1211 | 123 | 2.2149 | - | - | - |
0.1220 | 124 | 1.6823 | - | - | - |
0.1230 | 125 | 1.8402 | - | - | - |
0.1240 | 126 | 1.6914 | - | - | - |
0.125 | 127 | 2.1626 | - | - | - |
0.1260 | 128 | 1.6414 | - | - | - |
0.1270 | 129 | 2.2043 | - | - | - |
0.1280 | 130 | 1.9987 | - | - | - |
0.1289 | 131 | 1.8868 | - | - | - |
0.1299 | 132 | 1.8262 | - | - | - |
0.1309 | 133 | 2.0404 | - | - | - |
0.1319 | 134 | 1.9134 | - | - | - |
0.1329 | 135 | 2.3725 | - | - | - |
0.1339 | 136 | 1.4127 | - | - | - |
0.1348 | 137 | 1.6876 | - | - | - |
0.1358 | 138 | 1.8376 | - | - | - |
0.1368 | 139 | 1.6992 | - | - | - |
0.1378 | 140 | 1.5032 | - | - | - |
0.1388 | 141 | 2.0334 | - | - | - |
0.1398 | 142 | 2.3581 | - | - | - |
0.1407 | 143 | 1.4236 | - | - | - |
0.1417 | 144 | 2.202 | - | - | - |
0.1427 | 145 | 1.7654 | - | - | - |
0.1437 | 146 | 1.5748 | - | - | - |
0.1447 | 147 | 1.7996 | - | - | - |
0.1457 | 148 | 1.7517 | - | - | - |
0.1467 | 149 | 1.8933 | - | - | - |
0.1476 | 150 | 1.2836 | - | - | - |
0.1486 | 151 | 1.7145 | - | - | - |
0.1496 | 152 | 1.6499 | - | - | - |
0.1506 | 153 | 1.8273 | 0.4057 | 0.4389 | 0.6725 |
0.1516 | 154 | 2.2859 | - | - | - |
0.1526 | 155 | 1.0833 | - | - | - |
0.1535 | 156 | 1.6829 | - | - | - |
0.1545 | 157 | 2.1464 | - | - | - |
0.1555 | 158 | 1.745 | - | - | - |
0.1565 | 159 | 1.7319 | - | - | - |
0.1575 | 160 | 1.6968 | - | - | - |
0.1585 | 161 | 1.7401 | - | - | - |
0.1594 | 162 | 1.729 | - | - | - |
0.1604 | 163 | 2.0782 | - | - | - |
0.1614 | 164 | 2.6545 | - | - | - |
0.1624 | 165 | 1.4045 | - | - | - |
0.1634 | 166 | 1.2937 | - | - | - |
0.1644 | 167 | 1.1171 | - | - | - |
0.1654 | 168 | 1.3537 | - | - | - |
0.1663 | 169 | 1.7028 | - | - | - |
0.1673 | 170 | 1.4143 | - | - | - |
0.1683 | 171 | 1.8648 | - | - | - |
0.1693 | 172 | 1.6768 | - | - | - |
0.1703 | 173 | 1.9528 | - | - | - |
0.1713 | 174 | 1.1718 | - | - | - |
0.1722 | 175 | 1.8176 | - | - | - |
0.1732 | 176 | 0.8439 | - | - | - |
0.1742 | 177 | 1.5092 | - | - | - |
0.1752 | 178 | 1.1947 | - | - | - |
0.1762 | 179 | 1.6395 | - | - | - |
0.1772 | 180 | 1.4394 | - | - | - |
0.1781 | 181 | 1.7548 | - | - | - |
0.1791 | 182 | 1.1181 | - | - | - |
0.1801 | 183 | 1.0271 | - | - | - |
0.1811 | 184 | 2.3108 | - | - | - |
0.1821 | 185 | 2.1242 | - | - | - |
0.1831 | 186 | 1.9822 | - | - | - |
0.1841 | 187 | 2.3605 | - | - | - |
0.1850 | 188 | 1.5251 | - | - | - |
0.1860 | 189 | 1.2351 | - | - | - |
0.1870 | 190 | 1.5859 | - | - | - |
0.1880 | 191 | 1.8056 | - | - | - |
0.1890 | 192 | 1.349 | - | - | - |
0.1900 | 193 | 0.893 | - | - | - |
0.1909 | 194 | 1.5122 | - | - | - |
0.1919 | 195 | 1.3875 | - | - | - |
0.1929 | 196 | 1.29 | - | - | - |
0.1939 | 197 | 2.2931 | - | - | - |
0.1949 | 198 | 1.2663 | - | - | - |
0.1959 | 199 | 1.9712 | - | - | - |
0.1969 | 200 | 2.3307 | - | - | - |
0.1978 | 201 | 1.6544 | - | - | - |
0.1988 | 202 | 1.638 | - | - | - |
0.1998 | 203 | 1.3412 | - | - | - |
0.2008 | 204 | 1.4454 | - | - | - |
0.2018 | 205 | 1.5437 | - | - | - |
0.2028 | 206 | 1.4921 | - | - | - |
0.2037 | 207 | 1.4298 | - | - | - |
0.2047 | 208 | 1.6174 | - | - | - |
0.2057 | 209 | 1.4137 | - | - | - |
0.2067 | 210 | 1.5652 | - | - | - |
0.2077 | 211 | 1.1631 | - | - | - |
0.2087 | 212 | 1.2351 | - | - | - |
0.2096 | 213 | 1.7537 | - | - | - |
0.2106 | 214 | 1.3186 | - | - | - |
0.2116 | 215 | 1.2258 | - | - | - |
0.2126 | 216 | 0.7695 | - | - | - |
0.2136 | 217 | 1.2775 | - | - | - |
0.2146 | 218 | 1.6795 | - | - | - |
0.2156 | 219 | 1.2862 | - | - | - |
0.2165 | 220 | 1.1723 | - | - | - |
0.2175 | 221 | 1.3322 | - | - | - |
0.2185 | 222 | 1.7564 | - | - | - |
0.2195 | 223 | 1.1071 | - | - | - |
0.2205 | 224 | 1.2011 | - | - | - |
0.2215 | 225 | 1.2303 | - | - | - |
0.2224 | 226 | 1.212 | - | - | - |
0.2234 | 227 | 1.0117 | - | - | - |
0.2244 | 228 | 1.1907 | - | - | - |
0.2254 | 229 | 2.1293 | - | - | - |
0.2264 | 230 | 1.3063 | - | - | - |
0.2274 | 231 | 1.2841 | - | - | - |
0.2283 | 232 | 1.3778 | - | - | - |
0.2293 | 233 | 1.2242 | - | - | - |
0.2303 | 234 | 0.9227 | - | - | - |
0.2313 | 235 | 1.2221 | - | - | - |
0.2323 | 236 | 2.1041 | - | - | - |
0.2333 | 237 | 1.3341 | - | - | - |
0.2343 | 238 | 1.0876 | - | - | - |
0.2352 | 239 | 1.3328 | - | - | - |
0.2362 | 240 | 1.2958 | - | - | - |
0.2372 | 241 | 1.1522 | - | - | - |
0.2382 | 242 | 1.7942 | - | - | - |
0.2392 | 243 | 1.1325 | - | - | - |
0.2402 | 244 | 1.6466 | - | - | - |
0.2411 | 245 | 1.4608 | - | - | - |
0.2421 | 246 | 0.6375 | - | - | - |
0.2431 | 247 | 2.0177 | - | - | - |
0.2441 | 248 | 1.2069 | - | - | - |
0.2451 | 249 | 0.7639 | - | - | - |
0.2461 | 250 | 1.3465 | - | - | - |
0.2470 | 251 | 1.064 | - | - | - |
0.2480 | 252 | 1.3757 | - | - | - |
0.2490 | 253 | 1.612 | - | - | - |
0.25 | 254 | 0.7917 | - | - | - |
0.2510 | 255 | 1.5515 | - | - | - |
0.2520 | 256 | 0.799 | - | - | - |
0.2530 | 257 | 0.9882 | - | - | - |
0.2539 | 258 | 1.1814 | - | - | - |
0.2549 | 259 | 0.6394 | - | - | - |
0.2559 | 260 | 1.4756 | - | - | - |
0.2569 | 261 | 0.5338 | - | - | - |
0.2579 | 262 | 0.9779 | - | - | - |
0.2589 | 263 | 1.5307 | - | - | - |
0.2598 | 264 | 1.1213 | - | - | - |
0.2608 | 265 | 0.9482 | - | - | - |
0.2618 | 266 | 0.9599 | - | - | - |
0.2628 | 267 | 1.4455 | - | - | - |
0.2638 | 268 | 1.6496 | - | - | - |
0.2648 | 269 | 0.7402 | - | - | - |
0.2657 | 270 | 0.7835 | - | - | - |
0.2667 | 271 | 0.7821 | - | - | - |
0.2677 | 272 | 1.5422 | - | - | - |
0.2687 | 273 | 1.0995 | - | - | - |
0.2697 | 274 | 1.378 | - | - | - |
0.2707 | 275 | 1.3562 | - | - | - |
0.2717 | 276 | 0.7376 | - | - | - |
0.2726 | 277 | 1.1678 | - | - | - |
0.2736 | 278 | 1.2989 | - | - | - |
0.2746 | 279 | 1.9559 | - | - | - |
0.2756 | 280 | 1.1237 | - | - | - |
0.2766 | 281 | 0.952 | - | - | - |
0.2776 | 282 | 1.6629 | - | - | - |
0.2785 | 283 | 1.871 | - | - | - |
0.2795 | 284 | 1.5946 | - | - | - |
0.2805 | 285 | 1.4456 | - | - | - |
0.2815 | 286 | 1.4085 | - | - | - |
0.2825 | 287 | 1.1394 | - | - | - |
0.2835 | 288 | 1.0315 | - | - | - |
0.2844 | 289 | 1.488 | - | - | - |
0.2854 | 290 | 1.4006 | - | - | - |
0.2864 | 291 | 0.9237 | - | - | - |
0.2874 | 292 | 1.163 | - | - | - |
0.2884 | 293 | 1.7037 | - | - | - |
0.2894 | 294 | 0.8715 | - | - | - |
0.2904 | 295 | 1.2101 | - | - | - |
0.2913 | 296 | 1.1179 | - | - | - |
0.2923 | 297 | 1.3986 | - | - | - |
0.2933 | 298 | 1.7068 | - | - | - |
0.2943 | 299 | 0.8695 | - | - | - |
0.2953 | 300 | 1.3778 | - | - | - |
0.2963 | 301 | 1.2834 | - | - | - |
0.2972 | 302 | 0.8123 | - | - | - |
0.2982 | 303 | 1.6521 | - | - | - |
0.2992 | 304 | 1.1064 | - | - | - |
0.3002 | 305 | 0.9578 | - | - | - |
0.3012 | 306 | 0.9254 | 0.4888 | 0.4789 | 0.7040 |
0.3022 | 307 | 0.7541 | - | - | - |
0.3031 | 308 | 0.7324 | - | - | - |
0.3041 | 309 | 0.5974 | - | - | - |
0.3051 | 310 | 1.1481 | - | - | - |
0.3061 | 311 | 1.6179 | - | - | - |
0.3071 | 312 | 1.4641 | - | - | - |
0.3081 | 313 | 1.7185 | - | - | - |
0.3091 | 314 | 0.9328 | - | - | - |
0.3100 | 315 | 0.742 | - | - | - |
0.3110 | 316 | 1.4173 | - | - | - |
0.3120 | 317 | 0.7267 | - | - | - |
0.3130 | 318 | 0.9494 | - | - | - |
0.3140 | 319 | 1.5111 | - | - | - |
0.3150 | 320 | 1.6949 | - | - | - |
0.3159 | 321 | 1.7562 | - | - | - |
0.3169 | 322 | 1.2532 | - | - | - |
0.3179 | 323 | 1.1086 | - | - | - |
0.3189 | 324 | 0.7377 | - | - | - |
0.3199 | 325 | 1.085 | - | - | - |
0.3209 | 326 | 0.7767 | - | - | - |
0.3219 | 327 | 1.4441 | - | - | - |
0.3228 | 328 | 0.8146 | - | - | - |
0.3238 | 329 | 0.7403 | - | - | - |
0.3248 | 330 | 0.8476 | - | - | - |
0.3258 | 331 | 0.7323 | - | - | - |
0.3268 | 332 | 1.2241 | - | - | - |
0.3278 | 333 | 1.5065 | - | - | - |
0.3287 | 334 | 0.5259 | - | - | - |
0.3297 | 335 | 1.3103 | - | - | - |
0.3307 | 336 | 0.8655 | - | - | - |
0.3317 | 337 | 0.7575 | - | - | - |
0.3327 | 338 | 1.968 | - | - | - |
0.3337 | 339 | 1.317 | - | - | - |
0.3346 | 340 | 1.1972 | - | - | - |
0.3356 | 341 | 1.6323 | - | - | - |
0.3366 | 342 | 1.0469 | - | - | - |
0.3376 | 343 | 1.3349 | - | - | - |
0.3386 | 344 | 0.9544 | - | - | - |
0.3396 | 345 | 1.1894 | - | - | - |
0.3406 | 346 | 0.7717 | - | - | - |
0.3415 | 347 | 1.2563 | - | - | - |
0.3425 | 348 | 1.2437 | - | - | - |
0.3435 | 349 | 0.7806 | - | - | - |
0.3445 | 350 | 0.8303 | - | - | - |
0.3455 | 351 | 1.0926 | - | - | - |
0.3465 | 352 | 0.6654 | - | - | - |
0.3474 | 353 | 1.1087 | - | - | - |
0.3484 | 354 | 1.1525 | - | - | - |
0.3494 | 355 | 1.1127 | - | - | - |
0.3504 | 356 | 1.4267 | - | - | - |
0.3514 | 357 | 0.6148 | - | - | - |
0.3524 | 358 | 1.0123 | - | - | - |
0.3533 | 359 | 1.9682 | - | - | - |
0.3543 | 360 | 0.8487 | - | - | - |
0.3553 | 361 | 1.0412 | - | - | - |
0.3563 | 362 | 1.0902 | - | - | - |
0.3573 | 363 | 0.9606 | - | - | - |
0.3583 | 364 | 0.9206 | - | - | - |
0.3593 | 365 | 1.4727 | - | - | - |
0.3602 | 366 | 0.9379 | - | - | - |
0.3612 | 367 | 0.8387 | - | - | - |
0.3622 | 368 | 0.9692 | - | - | - |
0.3632 | 369 | 1.6298 | - | - | - |
0.3642 | 370 | 1.0882 | - | - | - |
0.3652 | 371 | 1.1558 | - | - | - |
0.3661 | 372 | 0.9546 | - | - | - |
0.3671 | 373 | 1.0124 | - | - | - |
0.3681 | 374 | 1.3916 | - | - | - |
0.3691 | 375 | 0.527 | - | - | - |
0.3701 | 376 | 0.6387 | - | - | - |
0.3711 | 377 | 1.1445 | - | - | - |
0.3720 | 378 | 1.3309 | - | - | - |
0.3730 | 379 | 1.5888 | - | - | - |
0.3740 | 380 | 1.4422 | - | - | - |
0.375 | 381 | 1.7044 | - | - | - |
0.3760 | 382 | 0.7913 | - | - | - |
0.3770 | 383 | 1.3241 | - | - | - |
0.3780 | 384 | 0.6473 | - | - | - |
0.3789 | 385 | 1.221 | - | - | - |
0.3799 | 386 | 0.7773 | - | - | - |
0.3809 | 387 | 1.054 | - | - | - |
0.3819 | 388 | 0.9862 | - | - | - |
0.3829 | 389 | 0.9684 | - | - | - |
0.3839 | 390 | 1.3244 | - | - | - |
0.3848 | 391 | 1.1787 | - | - | - |
0.3858 | 392 | 1.4698 | - | - | - |
0.3868 | 393 | 1.0961 | - | - | - |
0.3878 | 394 | 1.1364 | - | - | - |
0.3888 | 395 | 0.9368 | - | - | - |
0.3898 | 396 | 1.1731 | - | - | - |
0.3907 | 397 | 0.8686 | - | - | - |
0.3917 | 398 | 0.7481 | - | - | - |
0.3927 | 399 | 0.7261 | - | - | - |
0.3937 | 400 | 1.2062 | - | - | - |
0.3947 | 401 | 0.7462 | - | - | - |
0.3957 | 402 | 1.0318 | - | - | - |
0.3967 | 403 | 1.105 | - | - | - |
0.3976 | 404 | 1.009 | - | - | - |
0.3986 | 405 | 0.5941 | - | - | - |
0.3996 | 406 | 1.7972 | - | - | - |
0.4006 | 407 | 1.0544 | - | - | - |
0.4016 | 408 | 1.3912 | - | - | - |
0.4026 | 409 | 0.8305 | - | - | - |
0.4035 | 410 | 0.8688 | - | - | - |
0.4045 | 411 | 1.0069 | - | - | - |
0.4055 | 412 | 1.3141 | - | - | - |
0.4065 | 413 | 1.1042 | - | - | - |
0.4075 | 414 | 1.1011 | - | - | - |
0.4085 | 415 | 1.1192 | - | - | - |
0.4094 | 416 | 1.5957 | - | - | - |
0.4104 | 417 | 1.164 | - | - | - |
0.4114 | 418 | 0.6425 | - | - | - |
0.4124 | 419 | 0.6068 | - | - | - |
0.4134 | 420 | 0.9275 | - | - | - |
0.4144 | 421 | 0.8836 | - | - | - |
0.4154 | 422 | 1.2115 | - | - | - |
0.4163 | 423 | 0.8367 | - | - | - |
0.4173 | 424 | 1.0595 | - | - | - |
0.4183 | 425 | 0.826 | - | - | - |
0.4193 | 426 | 0.707 | - | - | - |
0.4203 | 427 | 0.6235 | - | - | - |
0.4213 | 428 | 0.7719 | - | - | - |
0.4222 | 429 | 1.0862 | - | - | - |
0.4232 | 430 | 0.9311 | - | - | - |
0.4242 | 431 | 1.2339 | - | - | - |
0.4252 | 432 | 0.9891 | - | - | - |
0.4262 | 433 | 1.8443 | - | - | - |
0.4272 | 434 | 1.1799 | - | - | - |
0.4281 | 435 | 0.759 | - | - | - |
0.4291 | 436 | 1.1002 | - | - | - |
0.4301 | 437 | 0.9141 | - | - | - |
0.4311 | 438 | 0.5467 | - | - | - |
0.4321 | 439 | 0.7476 | - | - | - |
0.4331 | 440 | 1.14 | - | - | - |
0.4341 | 441 | 1.1504 | - | - | - |
0.4350 | 442 | 1.26 | - | - | - |
0.4360 | 443 | 1.0311 | - | - | - |
0.4370 | 444 | 1.0646 | - | - | - |
0.4380 | 445 | 0.8687 | - | - | - |
0.4390 | 446 | 0.6839 | - | - | - |
0.4400 | 447 | 1.1376 | - | - | - |
0.4409 | 448 | 0.9759 | - | - | - |
0.4419 | 449 | 0.7971 | - | - | - |
0.4429 | 450 | 0.9708 | - | - | - |
0.4439 | 451 | 0.8217 | - | - | - |
0.4449 | 452 | 1.3728 | - | - | - |
0.4459 | 453 | 0.9119 | - | - | - |
0.4469 | 454 | 1.012 | - | - | - |
0.4478 | 455 | 1.3738 | - | - | - |
0.4488 | 456 | 0.8219 | - | - | - |
0.4498 | 457 | 1.2558 | - | - | - |
0.4508 | 458 | 0.6247 | - | - | - |
0.4518 | 459 | 0.7295 | 0.5410 | 0.4920 | 0.6879 |
0.4528 | 460 | 0.8154 | - | - | - |
0.4537 | 461 | 1.1392 | - | - | - |
0.4547 | 462 | 0.8618 | - | - | - |
0.4557 | 463 | 0.9669 | - | - | - |
0.4567 | 464 | 0.8804 | - | - | - |
0.4577 | 465 | 0.8479 | - | - | - |
0.4587 | 466 | 0.6296 | - | - | - |
0.4596 | 467 | 0.8449 | - | - | - |
0.4606 | 468 | 0.9772 | - | - | - |
0.4616 | 469 | 0.6424 | - | - | - |
0.4626 | 470 | 0.9169 | - | - | - |
0.4636 | 471 | 0.7599 | - | - | - |
0.4646 | 472 | 0.8943 | - | - | - |
0.4656 | 473 | 0.9475 | - | - | - |
0.4665 | 474 | 1.4518 | - | - | - |
0.4675 | 475 | 1.274 | - | - | - |
0.4685 | 476 | 0.7306 | - | - | - |
0.4695 | 477 | 0.9238 | - | - | - |
0.4705 | 478 | 0.6593 | - | - | - |
0.4715 | 479 | 1.0183 | - | - | - |
0.4724 | 480 | 1.2577 | - | - | - |
0.4734 | 481 | 0.8738 | - | - | - |
0.4744 | 482 | 1.1416 | - | - | - |
0.4754 | 483 | 0.7135 | - | - | - |
0.4764 | 484 | 1.2587 | - | - | - |
0.4774 | 485 | 0.8823 | - | - | - |
0.4783 | 486 | 0.8423 | - | - | - |
0.4793 | 487 | 0.7704 | - | - | - |
0.4803 | 488 | 0.7049 | - | - | - |
0.4813 | 489 | 1.1893 | - | - | - |
0.4823 | 490 | 1.3985 | - | - | - |
0.4833 | 491 | 1.3567 | - | - | - |
0.4843 | 492 | 1.2573 | - | - | - |
0.4852 | 493 | 0.7671 | - | - | - |
0.4862 | 494 | 0.5425 | - | - | - |
0.4872 | 495 | 0.9372 | - | - | - |
0.4882 | 496 | 0.799 | - | - | - |
0.4892 | 497 | 0.9548 | - | - | - |
0.4902 | 498 | 1.0855 | - | - | - |
0.4911 | 499 | 1.0465 | - | - | - |
0.4921 | 500 | 1.1004 | - | - | - |
0.4931 | 501 | 0.6392 | - | - | - |
0.4941 | 502 | 0.7102 | - | - | - |
0.4951 | 503 | 1.3242 | - | - | - |
0.4961 | 504 | 0.6861 | - | - | - |
0.4970 | 505 | 0.9291 | - | - | - |
0.4980 | 506 | 0.8592 | - | - | - |
0.4990 | 507 | 0.9462 | - | - | - |
0.5 | 508 | 1.0167 | - | - | - |
0.5010 | 509 | 1.0118 | - | - | - |
0.5020 | 510 | 0.6741 | - | - | - |
0.5030 | 511 | 1.4578 | - | - | - |
0.5039 | 512 | 1.2959 | - | - | - |
0.5049 | 513 | 0.8533 | - | - | - |
0.5059 | 514 | 0.6685 | - | - | - |
0.5069 | 515 | 1.1556 | - | - | - |
0.5079 | 516 | 0.8177 | - | - | - |
0.5089 | 517 | 0.6296 | - | - | - |
0.5098 | 518 | 0.8407 | - | - | - |
0.5108 | 519 | 0.6987 | - | - | - |
0.5118 | 520 | 0.9888 | - | - | - |
0.5128 | 521 | 0.8938 | - | - | - |
0.5138 | 522 | 0.582 | - | - | - |
0.5148 | 523 | 0.6596 | - | - | - |
0.5157 | 524 | 0.6029 | - | - | - |
0.5167 | 525 | 0.9806 | - | - | - |
0.5177 | 526 | 0.9463 | - | - | - |
0.5187 | 527 | 0.7088 | - | - | - |
0.5197 | 528 | 0.7525 | - | - | - |
0.5207 | 529 | 0.7625 | - | - | - |
0.5217 | 530 | 0.8271 | - | - | - |
0.5226 | 531 | 0.6129 | - | - | - |
0.5236 | 532 | 1.1563 | - | - | - |
0.5246 | 533 | 0.8131 | - | - | - |
0.5256 | 534 | 0.5363 | - | - | - |
0.5266 | 535 | 0.8819 | - | - | - |
0.5276 | 536 | 0.9772 | - | - | - |
0.5285 | 537 | 1.2102 | - | - | - |
0.5295 | 538 | 1.1234 | - | - | - |
0.5305 | 539 | 1.1857 | - | - | - |
0.5315 | 540 | 0.7873 | - | - | - |
0.5325 | 541 | 0.5034 | - | - | - |
0.5335 | 542 | 1.3305 | - | - | - |
0.5344 | 543 | 1.1727 | - | - | - |
0.5354 | 544 | 1.2825 | - | - | - |
0.5364 | 545 | 1.0446 | - | - | - |
0.5374 | 546 | 0.9838 | - | - | - |
0.5384 | 547 | 1.2194 | - | - | - |
0.5394 | 548 | 0.7709 | - | - | - |
0.5404 | 549 | 0.748 | - | - | - |
0.5413 | 550 | 1.0948 | - | - | - |
0.5423 | 551 | 0.915 | - | - | - |
0.5433 | 552 | 1.537 | - | - | - |
0.5443 | 553 | 0.3239 | - | - | - |
0.5453 | 554 | 0.9592 | - | - | - |
0.5463 | 555 | 0.7737 | - | - | - |
0.5472 | 556 | 0.613 | - | - | - |
0.5482 | 557 | 1.3646 | - | - | - |
0.5492 | 558 | 0.6659 | - | - | - |
0.5502 | 559 | 0.5207 | - | - | - |
0.5512 | 560 | 0.9467 | - | - | - |
0.5522 | 561 | 0.5692 | - | - | - |
0.5531 | 562 | 1.5855 | - | - | - |
0.5541 | 563 | 0.8855 | - | - | - |
0.5551 | 564 | 1.1829 | - | - | - |
0.5561 | 565 | 0.978 | - | - | - |
0.5571 | 566 | 1.1818 | - | - | - |
0.5581 | 567 | 0.701 | - | - | - |
0.5591 | 568 | 1.0226 | - | - | - |
0.5600 | 569 | 0.5937 | - | - | - |
0.5610 | 570 | 0.8095 | - | - | - |
0.5620 | 571 | 1.174 | - | - | - |
0.5630 | 572 | 0.96 | - | - | - |
0.5640 | 573 | 0.8339 | - | - | - |
0.5650 | 574 | 0.717 | - | - | - |
0.5659 | 575 | 0.5938 | - | - | - |
0.5669 | 576 | 0.6501 | - | - | - |
0.5679 | 577 | 0.7003 | - | - | - |
0.5689 | 578 | 0.5525 | - | - | - |
0.5699 | 579 | 0.7003 | - | - | - |
0.5709 | 580 | 1.059 | - | - | - |
0.5719 | 581 | 0.8625 | - | - | - |
0.5728 | 582 | 0.5862 | - | - | - |
0.5738 | 583 | 0.9162 | - | - | - |
0.5748 | 584 | 0.926 | - | - | - |
0.5758 | 585 | 1.2729 | - | - | - |
0.5768 | 586 | 0.8935 | - | - | - |
0.5778 | 587 | 0.541 | - | - | - |
0.5787 | 588 | 1.1455 | - | - | - |
0.5797 | 589 | 0.7306 | - | - | - |
0.5807 | 590 | 0.9088 | - | - | - |
0.5817 | 591 | 0.9166 | - | - | - |
0.5827 | 592 | 0.8679 | - | - | - |
0.5837 | 593 | 0.9329 | - | - | - |
0.5846 | 594 | 1.1201 | - | - | - |
0.5856 | 595 | 0.6418 | - | - | - |
0.5866 | 596 | 1.145 | - | - | - |
0.5876 | 597 | 1.4041 | - | - | - |
0.5886 | 598 | 0.6954 | - | - | - |
0.5896 | 599 | 0.4567 | - | - | - |
0.5906 | 600 | 1.1305 | - | - | - |
0.5915 | 601 | 0.8077 | - | - | - |
0.5925 | 602 | 0.6143 | - | - | - |
0.5935 | 603 | 1.3139 | - | - | - |
0.5945 | 604 | 0.7694 | - | - | - |
0.5955 | 605 | 0.9622 | - | - | - |
0.5965 | 606 | 0.91 | - | - | - |
0.5974 | 607 | 1.3125 | - | - | - |
0.5984 | 608 | 1.0153 | - | - | - |
0.5994 | 609 | 0.8468 | - | - | - |
0.6004 | 610 | 1.1026 | - | - | - |
</details>

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.5.0+cu121
- Accelerate: 0.34.2
- Datasets: 3.0.2
- Tokenizers: 0.19.1
## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```

#### GISTEmbedLoss

```bibtex
@misc{solatorio2024gistembed,
title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
author={Aivin V. Solatorio},
year={2024},
eprint={2402.16829},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```