metadata
base_model: microsoft/deberta-v3-small
library_name: sentence-transformers
metrics:
  - pearson_cosine
  - spearman_cosine
  - pearson_manhattan
  - spearman_manhattan
  - pearson_euclidean
  - spearman_euclidean
  - pearson_dot
  - spearman_dot
  - pearson_max
  - spearman_max
  - cosine_accuracy
  - cosine_accuracy_threshold
  - cosine_f1
  - cosine_f1_threshold
  - cosine_precision
  - cosine_recall
  - cosine_ap
  - dot_accuracy
  - dot_accuracy_threshold
  - dot_f1
  - dot_f1_threshold
  - dot_precision
  - dot_recall
  - dot_ap
  - manhattan_accuracy
  - manhattan_accuracy_threshold
  - manhattan_f1
  - manhattan_f1_threshold
  - manhattan_precision
  - manhattan_recall
  - manhattan_ap
  - euclidean_accuracy
  - euclidean_accuracy_threshold
  - euclidean_f1
  - euclidean_f1_threshold
  - euclidean_precision
  - euclidean_recall
  - euclidean_ap
  - max_accuracy
  - max_accuracy_threshold
  - max_f1
  - max_f1_threshold
  - max_precision
  - max_recall
  - max_ap
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:32500
  - loss:GISTEmbedLoss
widget:
  - source_sentence: phase changes do not change
    sentences:
      - >-
        The major Atlantic slave trading nations, ordered by trade volume, were
        the Portuguese, the British, the Spanish, the French, the Dutch, and the
        Danish. Several had established outposts on the African coast where they
        purchased slaves from local African leaders.
      - >-
        phase changes do not change mass. Particles have mass, but mass is
        energy. 
         phase changes do not change  energy
      - >-
        According to the U.S. Census Bureau , the county is a total area of ,
        which has land and ( 0.2 % ) is water .
  - source_sentence: what jobs can you get with a bachelor degree in anthropology?
    sentences:
      - >-
        To determine the atomic weight of an element, you should add up protons
        and neutrons.
      - >-
        ['Paleontologist*', 'Archaeologist*', 'University Professor*', 'Market
        Research Analyst*', 'Primatologist.', 'Forensic Scientist*', 'Medical
        Anthropologist.', 'Museum Technician.']
      - >-
        The wingspan flies , the moth comes depending on the location from July
        to August .
  - source_sentence: Identify different forms of energy (e.g., light, sound, heat).
    sentences:
      - >-
        `` Irreplaceable '' '' remained on the chart for thirty weeks , and was
        certified double-platinum by the Recording Industry Association of
        America ( RIAA ) , denoting sales of two million downloads , and had
        sold over 3,139,000 paid digital downloads in the US as of October 2012
        , according to Nielsen SoundScan . ''
      - >-
        On Rotten Tomatoes , the film has a rating of 63 % , based on 87 reviews
        , with an average rating of 5.9/10 .
      - Heat, light, and sound are all different forms of energy.
  - source_sentence: what is so small it can only be seen with an electron microscope?
    sentences:
      - >-
        Viruses are so small that they can be seen only with an electron
        microscope.. Where most viruses are DNA, HIV is an RNA virus. 
         HIV is so small it can only be seen with an electron microscope
      - >-
        The development of modern lasers has opened many doors to both research
        and applications. A laser beam was used to measure the distance from the
        Earth to the moon. Lasers are important components of CD players. As the
        image above illustrates, lasers can provide precise focusing of beams to
        selectively destroy cancer cells in patients. The ability of a laser to
        focus precisely is due to high-quality crystals that help give rise to
        the laser beam. A variety of techniques are used to manufacture pure
        crystals for use in lasers.
      - >-
        Discussion for (a) This value is the net work done on the package. The
        person actually does more work than this, because friction opposes the
        motion. Friction does negative work and removes some of the energy the
        person expends and converts it to thermal energy. The net work equals
        the sum of the work done by each individual force. Strategy and Concept
        for (b) The forces acting on the package are gravity, the normal force,
        the force of friction, and the applied force. The normal force and force
        of gravity are each perpendicular to the displacement, and therefore do
        no work. Solution for (b) The applied force does work.
  - source_sentence: what aspects of your environment may relate to the epidemic of obesity
    sentences:
      - >-
        Jan Kromkamp ( born August 17 , 1980 in Makkinga , Netherlands ) is a
        Dutch footballer .
      - >-
        When chemicals in solution react, the proper way of writing the chemical
        formulas of the dissolved ionic compounds is in terms of the dissociated
        ions, not the complete ionic formula. A complete ionic equation is a
        chemical equation in which the dissolved ionic compounds are written as
        separated ions. Solubility rules are very useful in determining which
        ionic compounds are dissolved and which are not. For example, when
        NaCl(aq) reacts with AgNO3(aq) in a double-replacement reaction to
        precipitate AgCl(s) and form NaNO3(aq), the complete ionic equation
        includes NaCl, AgNO3, and NaNO3 written as separated ions:.
      - >-
        Genetic changes in human populations occur too slowly to be responsible
        for the obesity epidemic. Nevertheless, the variation in how people
        respond to the environment that promotes physical inactivity and intake
        of high-calorie foods suggests that genes do play a role in the
        development of obesity.
model-index:
  - name: SentenceTransformer based on microsoft/deberta-v3-small
    results:
      - task:
          type: semantic-similarity
          name: Semantic Similarity
        dataset:
          name: sts test
          type: sts-test
        metrics:
          - type: pearson_cosine
            value: 0.6764054032395426
            name: Pearson Cosine
          - type: spearman_cosine
            value: 0.6929200286447658
            name: Spearman Cosine
          - type: pearson_manhattan
            value: 0.700820653979197
            name: Pearson Manhattan
          - type: spearman_manhattan
            value: 0.6972355270345225
            name: Spearman Manhattan
          - type: pearson_euclidean
            value: 0.6974498133985872
            name: Pearson Euclidean
          - type: spearman_euclidean
            value: 0.6929374816367083
            name: Spearman Euclidean
          - type: pearson_dot
            value: 0.676046362499036
            name: Pearson Dot
          - type: spearman_dot
            value: 0.692721109287881
            name: Spearman Dot
          - type: pearson_max
            value: 0.700820653979197
            name: Pearson Max
          - type: spearman_max
            value: 0.6972355270345225
            name: Spearman Max
      - task:
          type: binary-classification
          name: Binary Classification
        dataset:
          name: allNLI dev
          type: allNLI-dev
        metrics:
          - type: cosine_accuracy
            value: 0.697265625
            name: Cosine Accuracy
          - type: cosine_accuracy_threshold
            value: 0.9135359525680542
            name: Cosine Accuracy Threshold
          - type: cosine_f1
            value: 0.5375854214123007
            name: Cosine F1
          - type: cosine_f1_threshold
            value: 0.8361715078353882
            name: Cosine F1 Threshold
          - type: cosine_precision
            value: 0.44360902255639095
            name: Cosine Precision
          - type: cosine_recall
            value: 0.6820809248554913
            name: Cosine Recall
          - type: cosine_ap
            value: 0.5215030688158616
            name: Cosine Ap
          - type: dot_accuracy
            value: 0.697265625
            name: Dot Accuracy
          - type: dot_accuracy_threshold
            value: 702.328125
            name: Dot Accuracy Threshold
          - type: dot_f1
            value: 0.5375854214123007
            name: Dot F1
          - type: dot_f1_threshold
            value: 642.5977783203125
            name: Dot F1 Threshold
          - type: dot_precision
            value: 0.44360902255639095
            name: Dot Precision
          - type: dot_recall
            value: 0.6820809248554913
            name: Dot Recall
          - type: dot_ap
            value: 0.5215320356910601
            name: Dot Ap
          - type: manhattan_accuracy
            value: 0.705078125
            name: Manhattan Accuracy
          - type: manhattan_accuracy_threshold
            value: 246.65103149414062
            name: Manhattan Accuracy Threshold
          - type: manhattan_f1
            value: 0.5364705882352941
            name: Manhattan F1
          - type: manhattan_f1_threshold
            value: 332.9952392578125
            name: Manhattan F1 Threshold
          - type: manhattan_precision
            value: 0.4523809523809524
            name: Manhattan Precision
          - type: manhattan_recall
            value: 0.6589595375722543
            name: Manhattan Recall
          - type: manhattan_ap
            value: 0.5262241457144412
            name: Manhattan Ap
          - type: euclidean_accuracy
            value: 0.697265625
            name: Euclidean Accuracy
          - type: euclidean_accuracy_threshold
            value: 11.530282020568848
            name: Euclidean Accuracy Threshold
          - type: euclidean_f1
            value: 0.5375854214123007
            name: Euclidean F1
          - type: euclidean_f1_threshold
            value: 15.868341445922852
            name: Euclidean F1 Threshold
          - type: euclidean_precision
            value: 0.44360902255639095
            name: Euclidean Precision
          - type: euclidean_recall
            value: 0.6820809248554913
            name: Euclidean Recall
          - type: euclidean_ap
            value: 0.5215763814962693
            name: Euclidean Ap
          - type: max_accuracy
            value: 0.705078125
            name: Max Accuracy
          - type: max_accuracy_threshold
            value: 702.328125
            name: Max Accuracy Threshold
          - type: max_f1
            value: 0.5375854214123007
            name: Max F1
          - type: max_f1_threshold
            value: 642.5977783203125
            name: Max F1 Threshold
          - type: max_precision
            value: 0.4523809523809524
            name: Max Precision
          - type: max_recall
            value: 0.6820809248554913
            name: Max Recall
          - type: max_ap
            value: 0.5262241457144412
            name: Max Ap
      - task:
          type: binary-classification
          name: Binary Classification
        dataset:
          name: Qnli dev
          type: Qnli-dev
        metrics:
          - type: cosine_accuracy
            value: 0.6796875
            name: Cosine Accuracy
          - type: cosine_accuracy_threshold
            value: 0.8366684317588806
            name: Cosine Accuracy Threshold
          - type: cosine_f1
            value: 0.6749611197511665
            name: Cosine F1
          - type: cosine_f1_threshold
            value: 0.7199565172195435
            name: Cosine F1 Threshold
          - type: cosine_precision
            value: 0.5331695331695332
            name: Cosine Precision
          - type: cosine_recall
            value: 0.9194915254237288
            name: Cosine Recall
          - type: cosine_ap
            value: 0.7025711036957332
            name: Cosine Ap
          - type: dot_accuracy
            value: 0.6796875
            name: Dot Accuracy
          - type: dot_accuracy_threshold
            value: 643.223388671875
            name: Dot Accuracy Threshold
          - type: dot_f1
            value: 0.673913043478261
            name: Dot F1
          - type: dot_f1_threshold
            value: 552.9798583984375
            name: Dot F1 Threshold
          - type: dot_precision
            value: 0.5318627450980392
            name: Dot Precision
          - type: dot_recall
            value: 0.9194915254237288
            name: Dot Recall
          - type: dot_ap
            value: 0.7028877754827524
            name: Dot Ap
          - type: manhattan_accuracy
            value: 0.677734375
            name: Manhattan Accuracy
          - type: manhattan_accuracy_threshold
            value: 335.95025634765625
            name: Manhattan Accuracy Threshold
          - type: manhattan_f1
            value: 0.6774716369529984
            name: Manhattan F1
          - type: manhattan_f1_threshold
            value: 431.2140808105469
            name: Manhattan F1 Threshold
          - type: manhattan_precision
            value: 0.5485564304461942
            name: Manhattan Precision
          - type: manhattan_recall
            value: 0.885593220338983
            name: Manhattan Recall
          - type: manhattan_ap
            value: 0.7026678687600421
            name: Manhattan Ap
          - type: euclidean_accuracy
            value: 0.6796875
            name: Euclidean Accuracy
          - type: euclidean_accuracy_threshold
            value: 15.844499588012695
            name: Euclidean Accuracy Threshold
          - type: euclidean_f1
            value: 0.6749611197511665
            name: Euclidean F1
          - type: euclidean_f1_threshold
            value: 20.740978240966797
            name: Euclidean F1 Threshold
          - type: euclidean_precision
            value: 0.5331695331695332
            name: Euclidean Precision
          - type: euclidean_recall
            value: 0.9194915254237288
            name: Euclidean Recall
          - type: euclidean_ap
            value: 0.7026118478627446
            name: Euclidean Ap
          - type: max_accuracy
            value: 0.6796875
            name: Max Accuracy
          - type: max_accuracy_threshold
            value: 643.223388671875
            name: Max Accuracy Threshold
          - type: max_f1
            value: 0.6774716369529984
            name: Max F1
          - type: max_f1_threshold
            value: 552.9798583984375
            name: Max F1 Threshold
          - type: max_precision
            value: 0.5485564304461942
            name: Max Precision
          - type: max_recall
            value: 0.9194915254237288
            name: Max Recall
          - type: max_ap
            value: 0.7028877754827524
            name: Max Ap

SentenceTransformer based on microsoft/deberta-v3-small

This is a sentence-transformers model finetuned from microsoft/deberta-v3-small. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: microsoft/deberta-v3-small
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
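
For reference, the cosine similarity used to compare two embeddings $u, v \in \mathbb{R}^{768}$ is

$$\operatorname{cos\_sim}(u, v) = \frac{u \cdot v}{\lVert u \rVert \, \lVert v \rVert} \in [-1, 1].$$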

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model 
  (1): AdvancedWeightedPooling(
    (alpha_dropout_layer): Dropout(p=0.05, inplace=False)
    (gate_dropout_layer): Dropout(p=0.0, inplace=False)
    (linear_cls_Qpj): Linear(in_features=768, out_features=768, bias=True)
    (linear_attnOut): Linear(in_features=768, out_features=768, bias=True)
    (mha): MultiheadAttention(
      (out_proj): NonDynamicallyQuantizableLinear(in_features=768, out_features=768, bias=True)
    )
    (layernorm_output): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
    (layernorm_weightedPooing): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
    (layernorm_attnOut): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
  )
)

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp")
# Run inference
sentences = [
    'what aspects of your environment may relate to the epidemic of obesity',
    'Genetic changes in human populations occur too slowly to be responsible for the obesity epidemic. Nevertheless, the variation in how people respond to the environment that promotes physical inactivity and intake of high-calorie foods suggests that genes do play a role in the development of obesity.',
    'When chemicals in solution react, the proper way of writing the chemical formulas of the dissolved ionic compounds is in terms of the dissociated ions, not the complete ionic formula. A complete ionic equation is a chemical equation in which the dissolved ionic compounds are written as separated ions. Solubility rules are very useful in determining which ionic compounds are dissolved and which are not. For example, when NaCl(aq) reacts with AgNO3(aq) in a double-replacement reaction to precipitate AgCl(s) and form NaNO3(aq), the complete ionic equation includes NaCl, AgNO3, and NaNO3 written as separated ions:.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
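
Beyond pairwise similarity scoring, the same checkpoint can be used for the semantic-search use case mentioned above. The following is a minimal sketch; the corpus and query sentences are made up for illustration.

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp")

# Hypothetical corpus and query, purely for illustration
corpus = [
    "Heat, light, and sound are all different forms of energy.",
    "Viruses are so small that they can be seen only with an electron microscope.",
    "Harrods is a department store on Brompton Road, London.",
]
query = "Which store is found on Brompton Road?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Cosine-similarity search; higher score = more similar
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=3)[0]
for hit in hits:
    print(f"{hit['score']:.4f}  {corpus[hit['corpus_id']]}")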

Evaluation

Metrics

Semantic Similarity

  • Dataset: sts-test

Metric Value
pearson_cosine 0.6764
spearman_cosine 0.6929
pearson_manhattan 0.7008
spearman_manhattan 0.6972
pearson_euclidean 0.6974
spearman_euclidean 0.6929
pearson_dot 0.676
spearman_dot 0.6927
pearson_max 0.7008
spearman_max 0.6972
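
These values are Pearson and Spearman correlations between the model's similarity scores (cosine, Manhattan, Euclidean, dot product) and human-annotated gold scores. A minimal sketch of the computation, assuming a small list of STS-style pairs with gold scores (the pairs below are made up):

from scipy.stats import pearsonr, spearmanr
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp")

# Hypothetical (sentence1, sentence2, gold score) triples
pairs = [
    ("A man is playing a guitar.", "A person plays an instrument.", 4.2),
    ("A dog runs in the park.", "A dog is outside.", 3.5),
    ("A dog runs in the park.", "The stock market fell today.", 0.1),
]

emb1 = model.encode([p[0] for p in pairs])
emb2 = model.encode([p[1] for p in pairs])
gold = [p[2] for p in pairs]

cos_scores = [util.cos_sim(a, b).item() for a, b in zip(emb1, emb2)]
print("Pearson cosine: ", pearsonr(cos_scores, gold)[0])
print("Spearman cosine:", spearmanr(cos_scores, gold)[0])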

Binary Classification

  • Dataset: allNLI-dev

Metric Value
cosine_accuracy 0.6973
cosine_accuracy_threshold 0.9135
cosine_f1 0.5376
cosine_f1_threshold 0.8362
cosine_precision 0.4436
cosine_recall 0.6821
cosine_ap 0.5215
dot_accuracy 0.6973
dot_accuracy_threshold 702.3281
dot_f1 0.5376
dot_f1_threshold 642.5978
dot_precision 0.4436
dot_recall 0.6821
dot_ap 0.5215
manhattan_accuracy 0.7051
manhattan_accuracy_threshold 246.651
manhattan_f1 0.5365
manhattan_f1_threshold 332.9952
manhattan_precision 0.4524
manhattan_recall 0.659
manhattan_ap 0.5262
euclidean_accuracy 0.6973
euclidean_accuracy_threshold 11.5303
euclidean_f1 0.5376
euclidean_f1_threshold 15.8683
euclidean_precision 0.4436
euclidean_recall 0.6821
euclidean_ap 0.5216
max_accuracy 0.7051
max_accuracy_threshold 702.3281
max_f1 0.5376
max_f1_threshold 642.5978
max_precision 0.4524
max_recall 0.6821
max_ap 0.5262
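
The *_accuracy_threshold and *_f1_threshold values are the decision thresholds that maximized accuracy and F1 on this dev set. A hedged sketch of how the tuned cosine threshold could be applied at inference time (the function name and example pair are illustrative):

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp")

def predict_pair(sentence_a: str, sentence_b: str, threshold: float = 0.9135) -> bool:
    """Label a pair as positive when cosine similarity reaches the tuned threshold."""
    emb = model.encode([sentence_a, sentence_b], convert_to_tensor=True)
    return util.cos_sim(emb[0], emb[1]).item() >= threshold

print(predict_pair("A man is sleeping.", "A person is asleep."))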

Binary Classification

  • Dataset: Qnli-dev

Metric Value
cosine_accuracy 0.6797
cosine_accuracy_threshold 0.8367
cosine_f1 0.675
cosine_f1_threshold 0.72
cosine_precision 0.5332
cosine_recall 0.9195
cosine_ap 0.7026
dot_accuracy 0.6797
dot_accuracy_threshold 643.2234
dot_f1 0.6739
dot_f1_threshold 552.9799
dot_precision 0.5319
dot_recall 0.9195
dot_ap 0.7029
manhattan_accuracy 0.6777
manhattan_accuracy_threshold 335.9503
manhattan_f1 0.6775
manhattan_f1_threshold 431.2141
manhattan_precision 0.5486
manhattan_recall 0.8856
manhattan_ap 0.7027
euclidean_accuracy 0.6797
euclidean_accuracy_threshold 15.8445
euclidean_f1 0.675
euclidean_f1_threshold 20.741
euclidean_precision 0.5332
euclidean_recall 0.9195
euclidean_ap 0.7026
max_accuracy 0.6797
max_accuracy_threshold 643.2234
max_f1 0.6775
max_f1_threshold 552.9799
max_precision 0.5486
max_recall 0.9195
max_ap 0.7029

Training Details

Training Dataset

Unnamed Dataset

  • Size: 32,500 training samples
  • Columns: sentence1 and sentence2
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string, min 4 tokens, mean 29.39 tokens, max 323 tokens
    • sentence2: string, min 2 tokens, mean 54.42 tokens, max 423 tokens
  • Samples:
    • sentence1: In which London road is Harrod’s department store?
      sentence2: Harrods, Brompton Road, London
    • sentence1: e. in solids the atoms are closely locked in position and can only vibrate, in liquids the atoms and molecules are more loosely connected and can collide with and move past one another, while in gases the atoms or molecules are free to move independently, colliding frequently.
      sentence2: Within a substance, atoms that collide frequently and move independently of one another are most likely in a gas
    • sentence1: Joe Cole was unable to join West Bromwich Albion .
      sentence2: On 16th October Joe Cole took a long hard look at himself realising that he would never get the opportunity to join West Bromwich Albion and joined Coventry City instead.
  • Loss: GISTEmbedLoss with these parameters:
    {'guide': SentenceTransformer(
      (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
      (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
      (2): Normalize()
    ), 'temperature': 0.025}
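
    A minimal sketch of setting up this loss with the Sentence Transformers API. The card does not name the guide checkpoint; BAAI/bge-base-en-v1.5 is used below only because it matches the printed guide architecture (BERT backbone, CLS pooling, Normalize), and the model is loaded as a plain base checkpoint for brevity rather than with the custom pooling shown above:

    from sentence_transformers import SentenceTransformer, losses

    model = SentenceTransformer("microsoft/deberta-v3-small")  # base model, loaded without the custom pooling for brevity
    guide = SentenceTransformer("BAAI/bge-base-en-v1.5")       # illustrative guide; the actual guide is not named in the card

    # GISTEmbedLoss uses the guide model to filter misleading in-batch negatives
    loss = losses.GISTEmbedLoss(model, guide, temperature=0.025)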
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 256
  • lr_scheduler_type: cosine_with_min_lr
  • lr_scheduler_kwargs: {'num_cycles': 0.5, 'min_lr': 3.3333333333333337e-06}
  • warmup_ratio: 0.33
  • save_safetensors: False
  • fp16: True
  • push_to_hub: True
  • hub_model_id: bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp
  • hub_strategy: all_checkpoints
  • batch_sampler: no_duplicates
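
For reference, a hedged sketch of how these non-default values map onto SentenceTransformerTrainingArguments (argument names follow the current Sentence Transformers trainer API; everything not listed falls back to its default):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # illustrative; the actual output directory is not reported in the card
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=256,
    lr_scheduler_type="cosine_with_min_lr",
    lr_scheduler_kwargs={"num_cycles": 0.5, "min_lr": 3.3333333333333337e-06},
    warmup_ratio=0.33,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)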

All Hyperparameters

Click to expand
  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 256
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: cosine_with_min_lr
  • lr_scheduler_kwargs: {'num_cycles': 0.5, 'min_lr': 3.3333333333333337e-06}
  • warmup_ratio: 0.33
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: False
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: True
  • resume_from_checkpoint: None
  • hub_model_id: bobox/DeBERTa3-s-CustomPoolin-toytest4-step1-checkpoints-tmp
  • hub_strategy: all_checkpoints
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • eval_use_gather_object: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Click to expand
Epoch Step Training Loss sts-test_spearman_cosine allNLI-dev_max_ap Qnli-dev_max_ap
0.0010 1 6.0688 - - -
0.0020 2 7.5576 - - -
0.0030 3 4.6849 - - -
0.0039 4 5.4503 - - -
0.0049 5 5.6057 - - -
0.0059 6 6.3049 - - -
0.0069 7 6.8336 - - -
0.0079 8 5.0777 - - -
0.0089 9 4.8358 - - -
0.0098 10 4.641 - - -
0.0108 11 4.828 - - -
0.0118 12 5.2269 - - -
0.0128 13 5.6772 - - -
0.0138 14 5.1422 - - -
0.0148 15 6.2469 - - -
0.0157 16 4.6802 - - -
0.0167 17 4.5492 - - -
0.0177 18 4.8062 - - -
0.0187 19 7.5141 - - -
0.0197 20 5.5202 - - -
0.0207 21 6.5025 - - -
0.0217 22 7.318 - - -
0.0226 23 4.6458 - - -
0.0236 24 4.6191 - - -
0.0246 25 4.3159 - - -
0.0256 26 6.3677 - - -
0.0266 27 5.6052 - - -
0.0276 28 4.196 - - -
0.0285 29 4.4802 - - -
0.0295 30 4.9193 - - -
0.0305 31 4.0996 - - -
0.0315 32 5.6307 - - -
0.0325 33 4.5745 - - -
0.0335 34 4.4514 - - -
0.0344 35 4.0617 - - -
0.0354 36 5.0298 - - -
0.0364 37 3.9815 - - -
0.0374 38 4.0871 - - -
0.0384 39 4.2378 - - -
0.0394 40 3.8226 - - -
0.0404 41 4.3519 - - -
0.0413 42 3.6345 - - -
0.0423 43 5.0829 - - -
0.0433 44 4.6701 - - -
0.0443 45 4.1371 - - -
0.0453 46 4.2418 - - -
0.0463 47 4.4766 - - -
0.0472 48 4.4797 - - -
0.0482 49 3.8471 - - -
0.0492 50 4.3194 - - -
0.0502 51 3.9426 - - -
0.0512 52 3.5333 - - -
0.0522 53 4.2426 - - -
0.0531 54 3.9816 - - -
0.0541 55 3.663 - - -
0.0551 56 3.9057 - - -
0.0561 57 4.0345 - - -
0.0571 58 3.5233 - - -
0.0581 59 3.7999 - - -
0.0591 60 3.1885 - - -
0.0600 61 3.6013 - - -
0.0610 62 3.392 - - -
0.0620 63 3.3814 - - -
0.0630 64 4.0428 - - -
0.0640 65 3.7825 - - -
0.0650 66 3.4181 - - -
0.0659 67 3.7793 - - -
0.0669 68 3.8344 - - -
0.0679 69 3.2165 - - -
0.0689 70 3.3811 - - -
0.0699 71 3.5984 - - -
0.0709 72 3.8583 - - -
0.0719 73 3.296 - - -
0.0728 74 2.7661 - - -
0.0738 75 2.9805 - - -
0.0748 76 2.566 - - -
0.0758 77 3.258 - - -
0.0768 78 3.3804 - - -
0.0778 79 2.8828 - - -
0.0787 80 3.1077 - - -
0.0797 81 2.9441 - - -
0.0807 82 2.9465 - - -
0.0817 83 2.7088 - - -
0.0827 84 2.9215 - - -
0.0837 85 3.4698 - - -
0.0846 86 2.2414 - - -
0.0856 87 3.1601 - - -
0.0866 88 2.7714 - - -
0.0876 89 3.0311 - - -
0.0886 90 3.0336 - - -
0.0896 91 1.9358 - - -
0.0906 92 2.6031 - - -
0.0915 93 2.7515 - - -
0.0925 94 2.8496 - - -
0.0935 95 1.8015 - - -
0.0945 96 2.8138 - - -
0.0955 97 2.0597 - - -
0.0965 98 2.1053 - - -
0.0974 99 2.6785 - - -
0.0984 100 2.588 - - -
0.0994 101 2.0099 - - -
0.1004 102 2.7947 - - -
0.1014 103 2.3274 - - -
0.1024 104 2.2545 - - -
0.1033 105 2.4575 - - -
0.1043 106 2.4413 - - -
0.1053 107 2.3185 - - -
0.1063 108 2.1577 - - -
0.1073 109 2.1278 - - -
0.1083 110 2.0967 - - -
0.1093 111 2.6142 - - -
0.1102 112 1.8553 - - -
0.1112 113 2.1523 - - -
0.1122 114 2.1726 - - -
0.1132 115 1.8564 - - -
0.1142 116 1.8413 - - -
0.1152 117 2.0441 - - -
0.1161 118 2.2159 - - -
0.1171 119 2.6779 - - -
0.1181 120 2.2976 - - -
0.1191 121 1.9407 - - -
0.1201 122 1.9019 - - -
0.1211 123 2.2149 - - -
0.1220 124 1.6823 - - -
0.1230 125 1.8402 - - -
0.1240 126 1.6914 - - -
0.125 127 2.1626 - - -
0.1260 128 1.6414 - - -
0.1270 129 2.2043 - - -
0.1280 130 1.9987 - - -
0.1289 131 1.8868 - - -
0.1299 132 1.8262 - - -
0.1309 133 2.0404 - - -
0.1319 134 1.9134 - - -
0.1329 135 2.3725 - - -
0.1339 136 1.4127 - - -
0.1348 137 1.6876 - - -
0.1358 138 1.8376 - - -
0.1368 139 1.6992 - - -
0.1378 140 1.5032 - - -
0.1388 141 2.0334 - - -
0.1398 142 2.3581 - - -
0.1407 143 1.4236 - - -
0.1417 144 2.202 - - -
0.1427 145 1.7654 - - -
0.1437 146 1.5748 - - -
0.1447 147 1.7996 - - -
0.1457 148 1.7517 - - -
0.1467 149 1.8933 - - -
0.1476 150 1.2836 - - -
0.1486 151 1.7145 - - -
0.1496 152 1.6499 - - -
0.1506 153 1.8273 0.4057 0.4389 0.6725
0.1516 154 2.2859 - - -
0.1526 155 1.0833 - - -
0.1535 156 1.6829 - - -
0.1545 157 2.1464 - - -
0.1555 158 1.745 - - -
0.1565 159 1.7319 - - -
0.1575 160 1.6968 - - -
0.1585 161 1.7401 - - -
0.1594 162 1.729 - - -
0.1604 163 2.0782 - - -
0.1614 164 2.6545 - - -
0.1624 165 1.4045 - - -
0.1634 166 1.2937 - - -
0.1644 167 1.1171 - - -
0.1654 168 1.3537 - - -
0.1663 169 1.7028 - - -
0.1673 170 1.4143 - - -
0.1683 171 1.8648 - - -
0.1693 172 1.6768 - - -
0.1703 173 1.9528 - - -
0.1713 174 1.1718 - - -
0.1722 175 1.8176 - - -
0.1732 176 0.8439 - - -
0.1742 177 1.5092 - - -
0.1752 178 1.1947 - - -
0.1762 179 1.6395 - - -
0.1772 180 1.4394 - - -
0.1781 181 1.7548 - - -
0.1791 182 1.1181 - - -
0.1801 183 1.0271 - - -
0.1811 184 2.3108 - - -
0.1821 185 2.1242 - - -
0.1831 186 1.9822 - - -
0.1841 187 2.3605 - - -
0.1850 188 1.5251 - - -
0.1860 189 1.2351 - - -
0.1870 190 1.5859 - - -
0.1880 191 1.8056 - - -
0.1890 192 1.349 - - -
0.1900 193 0.893 - - -
0.1909 194 1.5122 - - -
0.1919 195 1.3875 - - -
0.1929 196 1.29 - - -
0.1939 197 2.2931 - - -
0.1949 198 1.2663 - - -
0.1959 199 1.9712 - - -
0.1969 200 2.3307 - - -
0.1978 201 1.6544 - - -
0.1988 202 1.638 - - -
0.1998 203 1.3412 - - -
0.2008 204 1.4454 - - -
0.2018 205 1.5437 - - -
0.2028 206 1.4921 - - -
0.2037 207 1.4298 - - -
0.2047 208 1.6174 - - -
0.2057 209 1.4137 - - -
0.2067 210 1.5652 - - -
0.2077 211 1.1631 - - -
0.2087 212 1.2351 - - -
0.2096 213 1.7537 - - -
0.2106 214 1.3186 - - -
0.2116 215 1.2258 - - -
0.2126 216 0.7695 - - -
0.2136 217 1.2775 - - -
0.2146 218 1.6795 - - -
0.2156 219 1.2862 - - -
0.2165 220 1.1723 - - -
0.2175 221 1.3322 - - -
0.2185 222 1.7564 - - -
0.2195 223 1.1071 - - -
0.2205 224 1.2011 - - -
0.2215 225 1.2303 - - -
0.2224 226 1.212 - - -
0.2234 227 1.0117 - - -
0.2244 228 1.1907 - - -
0.2254 229 2.1293 - - -
0.2264 230 1.3063 - - -
0.2274 231 1.2841 - - -
0.2283 232 1.3778 - - -
0.2293 233 1.2242 - - -
0.2303 234 0.9227 - - -
0.2313 235 1.2221 - - -
0.2323 236 2.1041 - - -
0.2333 237 1.3341 - - -
0.2343 238 1.0876 - - -
0.2352 239 1.3328 - - -
0.2362 240 1.2958 - - -
0.2372 241 1.1522 - - -
0.2382 242 1.7942 - - -
0.2392 243 1.1325 - - -
0.2402 244 1.6466 - - -
0.2411 245 1.4608 - - -
0.2421 246 0.6375 - - -
0.2431 247 2.0177 - - -
0.2441 248 1.2069 - - -
0.2451 249 0.7639 - - -
0.2461 250 1.3465 - - -
0.2470 251 1.064 - - -
0.2480 252 1.3757 - - -
0.2490 253 1.612 - - -
0.25 254 0.7917 - - -
0.2510 255 1.5515 - - -
0.2520 256 0.799 - - -
0.2530 257 0.9882 - - -
0.2539 258 1.1814 - - -
0.2549 259 0.6394 - - -
0.2559 260 1.4756 - - -
0.2569 261 0.5338 - - -
0.2579 262 0.9779 - - -
0.2589 263 1.5307 - - -
0.2598 264 1.1213 - - -
0.2608 265 0.9482 - - -
0.2618 266 0.9599 - - -
0.2628 267 1.4455 - - -
0.2638 268 1.6496 - - -
0.2648 269 0.7402 - - -
0.2657 270 0.7835 - - -
0.2667 271 0.7821 - - -
0.2677 272 1.5422 - - -
0.2687 273 1.0995 - - -
0.2697 274 1.378 - - -
0.2707 275 1.3562 - - -
0.2717 276 0.7376 - - -
0.2726 277 1.1678 - - -
0.2736 278 1.2989 - - -
0.2746 279 1.9559 - - -
0.2756 280 1.1237 - - -
0.2766 281 0.952 - - -
0.2776 282 1.6629 - - -
0.2785 283 1.871 - - -
0.2795 284 1.5946 - - -
0.2805 285 1.4456 - - -
0.2815 286 1.4085 - - -
0.2825 287 1.1394 - - -
0.2835 288 1.0315 - - -
0.2844 289 1.488 - - -
0.2854 290 1.4006 - - -
0.2864 291 0.9237 - - -
0.2874 292 1.163 - - -
0.2884 293 1.7037 - - -
0.2894 294 0.8715 - - -
0.2904 295 1.2101 - - -
0.2913 296 1.1179 - - -
0.2923 297 1.3986 - - -
0.2933 298 1.7068 - - -
0.2943 299 0.8695 - - -
0.2953 300 1.3778 - - -
0.2963 301 1.2834 - - -
0.2972 302 0.8123 - - -
0.2982 303 1.6521 - - -
0.2992 304 1.1064 - - -
0.3002 305 0.9578 - - -
0.3012 306 0.9254 0.4888 0.4789 0.7040
0.3022 307 0.7541 - - -
0.3031 308 0.7324 - - -
0.3041 309 0.5974 - - -
0.3051 310 1.1481 - - -
0.3061 311 1.6179 - - -
0.3071 312 1.4641 - - -
0.3081 313 1.7185 - - -
0.3091 314 0.9328 - - -
0.3100 315 0.742 - - -
0.3110 316 1.4173 - - -
0.3120 317 0.7267 - - -
0.3130 318 0.9494 - - -
0.3140 319 1.5111 - - -
0.3150 320 1.6949 - - -
0.3159 321 1.7562 - - -
0.3169 322 1.2532 - - -
0.3179 323 1.1086 - - -
0.3189 324 0.7377 - - -
0.3199 325 1.085 - - -
0.3209 326 0.7767 - - -
0.3219 327 1.4441 - - -
0.3228 328 0.8146 - - -
0.3238 329 0.7403 - - -
0.3248 330 0.8476 - - -
0.3258 331 0.7323 - - -
0.3268 332 1.2241 - - -
0.3278 333 1.5065 - - -
0.3287 334 0.5259 - - -
0.3297 335 1.3103 - - -
0.3307 336 0.8655 - - -
0.3317 337 0.7575 - - -
0.3327 338 1.968 - - -
0.3337 339 1.317 - - -
0.3346 340 1.1972 - - -
0.3356 341 1.6323 - - -
0.3366 342 1.0469 - - -
0.3376 343 1.3349 - - -
0.3386 344 0.9544 - - -
0.3396 345 1.1894 - - -
0.3406 346 0.7717 - - -
0.3415 347 1.2563 - - -
0.3425 348 1.2437 - - -
0.3435 349 0.7806 - - -
0.3445 350 0.8303 - - -
0.3455 351 1.0926 - - -
0.3465 352 0.6654 - - -
0.3474 353 1.1087 - - -
0.3484 354 1.1525 - - -
0.3494 355 1.1127 - - -
0.3504 356 1.4267 - - -
0.3514 357 0.6148 - - -
0.3524 358 1.0123 - - -
0.3533 359 1.9682 - - -
0.3543 360 0.8487 - - -
0.3553 361 1.0412 - - -
0.3563 362 1.0902 - - -
0.3573 363 0.9606 - - -
0.3583 364 0.9206 - - -
0.3593 365 1.4727 - - -
0.3602 366 0.9379 - - -
0.3612 367 0.8387 - - -
0.3622 368 0.9692 - - -
0.3632 369 1.6298 - - -
0.3642 370 1.0882 - - -
0.3652 371 1.1558 - - -
0.3661 372 0.9546 - - -
0.3671 373 1.0124 - - -
0.3681 374 1.3916 - - -
0.3691 375 0.527 - - -
0.3701 376 0.6387 - - -
0.3711 377 1.1445 - - -
0.3720 378 1.3309 - - -
0.3730 379 1.5888 - - -
0.3740 380 1.4422 - - -
0.375 381 1.7044 - - -
0.3760 382 0.7913 - - -
0.3770 383 1.3241 - - -
0.3780 384 0.6473 - - -
0.3789 385 1.221 - - -
0.3799 386 0.7773 - - -
0.3809 387 1.054 - - -
0.3819 388 0.9862 - - -
0.3829 389 0.9684 - - -
0.3839 390 1.3244 - - -
0.3848 391 1.1787 - - -
0.3858 392 1.4698 - - -
0.3868 393 1.0961 - - -
0.3878 394 1.1364 - - -
0.3888 395 0.9368 - - -
0.3898 396 1.1731 - - -
0.3907 397 0.8686 - - -
0.3917 398 0.7481 - - -
0.3927 399 0.7261 - - -
0.3937 400 1.2062 - - -
0.3947 401 0.7462 - - -
0.3957 402 1.0318 - - -
0.3967 403 1.105 - - -
0.3976 404 1.009 - - -
0.3986 405 0.5941 - - -
0.3996 406 1.7972 - - -
0.4006 407 1.0544 - - -
0.4016 408 1.3912 - - -
0.4026 409 0.8305 - - -
0.4035 410 0.8688 - - -
0.4045 411 1.0069 - - -
0.4055 412 1.3141 - - -
0.4065 413 1.1042 - - -
0.4075 414 1.1011 - - -
0.4085 415 1.1192 - - -
0.4094 416 1.5957 - - -
0.4104 417 1.164 - - -
0.4114 418 0.6425 - - -
0.4124 419 0.6068 - - -
0.4134 420 0.9275 - - -
0.4144 421 0.8836 - - -
0.4154 422 1.2115 - - -
0.4163 423 0.8367 - - -
0.4173 424 1.0595 - - -
0.4183 425 0.826 - - -
0.4193 426 0.707 - - -
0.4203 427 0.6235 - - -
0.4213 428 0.7719 - - -
0.4222 429 1.0862 - - -
0.4232 430 0.9311 - - -
0.4242 431 1.2339 - - -
0.4252 432 0.9891 - - -
0.4262 433 1.8443 - - -
0.4272 434 1.1799 - - -
0.4281 435 0.759 - - -
0.4291 436 1.1002 - - -
0.4301 437 0.9141 - - -
0.4311 438 0.5467 - - -
0.4321 439 0.7476 - - -
0.4331 440 1.14 - - -
0.4341 441 1.1504 - - -
0.4350 442 1.26 - - -
0.4360 443 1.0311 - - -
0.4370 444 1.0646 - - -
0.4380 445 0.8687 - - -
0.4390 446 0.6839 - - -
0.4400 447 1.1376 - - -
0.4409 448 0.9759 - - -
0.4419 449 0.7971 - - -
0.4429 450 0.9708 - - -
0.4439 451 0.8217 - - -
0.4449 452 1.3728 - - -
0.4459 453 0.9119 - - -
0.4469 454 1.012 - - -
0.4478 455 1.3738 - - -
0.4488 456 0.8219 - - -
0.4498 457 1.2558 - - -
0.4508 458 0.6247 - - -
0.4518 459 0.7295 0.5410 0.4920 0.6879
0.4528 460 0.8154 - - -
0.4537 461 1.1392 - - -
0.4547 462 0.8618 - - -
0.4557 463 0.9669 - - -
0.4567 464 0.8804 - - -
0.4577 465 0.8479 - - -
0.4587 466 0.6296 - - -
0.4596 467 0.8449 - - -
0.4606 468 0.9772 - - -
0.4616 469 0.6424 - - -
0.4626 470 0.9169 - - -
0.4636 471 0.7599 - - -
0.4646 472 0.8943 - - -
0.4656 473 0.9475 - - -
0.4665 474 1.4518 - - -
0.4675 475 1.274 - - -
0.4685 476 0.7306 - - -
0.4695 477 0.9238 - - -
0.4705 478 0.6593 - - -
0.4715 479 1.0183 - - -
0.4724 480 1.2577 - - -
0.4734 481 0.8738 - - -
0.4744 482 1.1416 - - -
0.4754 483 0.7135 - - -
0.4764 484 1.2587 - - -
0.4774 485 0.8823 - - -
0.4783 486 0.8423 - - -
0.4793 487 0.7704 - - -
0.4803 488 0.7049 - - -
0.4813 489 1.1893 - - -
0.4823 490 1.3985 - - -
0.4833 491 1.3567 - - -
0.4843 492 1.2573 - - -
0.4852 493 0.7671 - - -
0.4862 494 0.5425 - - -
0.4872 495 0.9372 - - -
0.4882 496 0.799 - - -
0.4892 497 0.9548 - - -
0.4902 498 1.0855 - - -
0.4911 499 1.0465 - - -
0.4921 500 1.1004 - - -
0.4931 501 0.6392 - - -
0.4941 502 0.7102 - - -
0.4951 503 1.3242 - - -
0.4961 504 0.6861 - - -
0.4970 505 0.9291 - - -
0.4980 506 0.8592 - - -
0.4990 507 0.9462 - - -
0.5 508 1.0167 - - -
0.5010 509 1.0118 - - -
0.5020 510 0.6741 - - -
0.5030 511 1.4578 - - -
0.5039 512 1.2959 - - -
0.5049 513 0.8533 - - -
0.5059 514 0.6685 - - -
0.5069 515 1.1556 - - -
0.5079 516 0.8177 - - -
0.5089 517 0.6296 - - -
0.5098 518 0.8407 - - -
0.5108 519 0.6987 - - -
0.5118 520 0.9888 - - -
0.5128 521 0.8938 - - -
0.5138 522 0.582 - - -
0.5148 523 0.6596 - - -
0.5157 524 0.6029 - - -
0.5167 525 0.9806 - - -
0.5177 526 0.9463 - - -
0.5187 527 0.7088 - - -
0.5197 528 0.7525 - - -
0.5207 529 0.7625 - - -
0.5217 530 0.8271 - - -
0.5226 531 0.6129 - - -
0.5236 532 1.1563 - - -
0.5246 533 0.8131 - - -
0.5256 534 0.5363 - - -
0.5266 535 0.8819 - - -
0.5276 536 0.9772 - - -
0.5285 537 1.2102 - - -
0.5295 538 1.1234 - - -
0.5305 539 1.1857 - - -
0.5315 540 0.7873 - - -
0.5325 541 0.5034 - - -
0.5335 542 1.3305 - - -
0.5344 543 1.1727 - - -
0.5354 544 1.2825 - - -
0.5364 545 1.0446 - - -
0.5374 546 0.9838 - - -
0.5384 547 1.2194 - - -
0.5394 548 0.7709 - - -
0.5404 549 0.748 - - -
0.5413 550 1.0948 - - -
0.5423 551 0.915 - - -
0.5433 552 1.537 - - -
0.5443 553 0.3239 - - -
0.5453 554 0.9592 - - -
0.5463 555 0.7737 - - -
0.5472 556 0.613 - - -
0.5482 557 1.3646 - - -
0.5492 558 0.6659 - - -
0.5502 559 0.5207 - - -
0.5512 560 0.9467 - - -
0.5522 561 0.5692 - - -
0.5531 562 1.5855 - - -
0.5541 563 0.8855 - - -
0.5551 564 1.1829 - - -
0.5561 565 0.978 - - -
0.5571 566 1.1818 - - -
0.5581 567 0.701 - - -
0.5591 568 1.0226 - - -
0.5600 569 0.5937 - - -
0.5610 570 0.8095 - - -
0.5620 571 1.174 - - -
0.5630 572 0.96 - - -
0.5640 573 0.8339 - - -
0.5650 574 0.717 - - -
0.5659 575 0.5938 - - -
0.5669 576 0.6501 - - -
0.5679 577 0.7003 - - -
0.5689 578 0.5525 - - -
0.5699 579 0.7003 - - -
0.5709 580 1.059 - - -
0.5719 581 0.8625 - - -
0.5728 582 0.5862 - - -
0.5738 583 0.9162 - - -
0.5748 584 0.926 - - -
0.5758 585 1.2729 - - -
0.5768 586 0.8935 - - -
0.5778 587 0.541 - - -
0.5787 588 1.1455 - - -
0.5797 589 0.7306 - - -
0.5807 590 0.9088 - - -
0.5817 591 0.9166 - - -
0.5827 592 0.8679 - - -
0.5837 593 0.9329 - - -
0.5846 594 1.1201 - - -
0.5856 595 0.6418 - - -
0.5866 596 1.145 - - -
0.5876 597 1.4041 - - -
0.5886 598 0.6954 - - -
0.5896 599 0.4567 - - -
0.5906 600 1.1305 - - -
0.5915 601 0.8077 - - -
0.5925 602 0.6143 - - -
0.5935 603 1.3139 - - -
0.5945 604 0.7694 - - -
0.5955 605 0.9622 - - -
0.5965 606 0.91 - - -
0.5974 607 1.3125 - - -
0.5984 608 1.0153 - - -
0.5994 609 0.8468 - - -
0.6004 610 1.1026 - - -
0.6014 611 0.8291 - - -
0.6024 612 0.7235 0.5680 0.5175 0.6900
0.6033 613 0.9613 - - -
0.6043 614 0.7124 - - -
0.6053 615 1.0719 - - -
0.6063 616 0.7233 - - -
0.6073 617 1.6863 - - -
0.6083 618 0.8665 - - -
0.6093 619 1.6432 - - -
0.6102 620 0.771 - - -
0.6112 621 0.6755 - - -
0.6122 622 0.6809 - - -
0.6132 623 0.5626 - - -
0.6142 624 0.6287 - - -
0.6152 625 0.5478 - - -
0.6161 626 0.9155 - - -
0.6171 627 1.8947 - - -
0.6181 628 1.1943 - - -
0.6191 629 1.2465 - - -
0.6201 630 0.4045 - - -
0.6211 631 0.5688 - - -
0.6220 632 0.8682 - - -
0.6230 633 0.901 - - -
0.6240 634 0.8978 - - -
0.625 635 0.6592 - - -
0.6260 636 0.6828 - - -
0.6270 637 0.3995 - - -
0.6280 638 0.8283 - - -
0.6289 639 0.5938 - - -
0.6299 640 1.256 - - -
0.6309 641 0.7801 - - -
0.6319 642 0.8975 - - -
0.6329 643 1.255 - - -
0.6339 644 1.1252 - - -
0.6348 645 1.2394 - - -
0.6358 646 0.9579 - - -
0.6368 647 0.8664 - - -
0.6378 648 0.4087 - - -
0.6388 649 0.4084 - - -
0.6398 650 0.6214 - - -
0.6407 651 0.7237 - - -
0.6417 652 0.6294 - - -
0.6427 653 0.8264 - - -
0.6437 654 0.8258 - - -
0.6447 655 0.6865 - - -
0.6457 656 0.7408 - - -
0.6467 657 0.6196 - - -
0.6476 658 0.5659 - - -
0.6486 659 0.6353 - - -
0.6496 660 0.8432 - - -
0.6506 661 0.6573 - - -
0.6516 662 0.4918 - - -
0.6526 663 0.8571 - - -
0.6535 664 0.9105 - - -
0.6545 665 0.7499 - - -
0.6555 666 0.5277 - - -
0.6565 667 1.3046 - - -
0.6575 668 0.6112 - - -
0.6585 669 0.754 - - -
0.6594 670 1.3611 - - -
0.6604 671 1.3654 - - -
0.6614 672 0.6479 - - -
0.6624 673 0.9422 - - -
0.6634 674 0.5007 - - -
0.6644 675 0.4789 - - -
0.6654 676 0.3962 - - -
0.6663 677 0.8648 - - -
0.6673 678 0.8241 - - -
0.6683 679 0.7923 - - -
0.6693 680 0.5799 - - -
0.6703 681 0.8422 - - -
0.6713 682 0.5638 - - -
0.6722 683 0.9627 - - -
0.6732 684 0.7585 - - -
0.6742 685 0.488 - - -
0.6752 686 1.0991 - - -
0.6762 687 0.4808 - - -
0.6772 688 1.3717 - - -
0.6781 689 0.8225 - - -
0.6791 690 0.9377 - - -
0.6801 691 0.8016 - - -
0.6811 692 0.6257 - - -
0.6821 693 1.0941 - - -
0.6831 694 1.0878 - - -
0.6841 695 0.9231 - - -
0.6850 696 1.3284 - - -
0.6860 697 0.7274 - - -
0.6870 698 0.7851 - - -
0.6880 699 0.6807 - - -
0.6890 700 1.1592 - - -
0.6900 701 0.6347 - - -
0.6909 702 1.1326 - - -
0.6919 703 1.3797 - - -
0.6929 704 0.7782 - - -
0.6939 705 1.1746 - - -
0.6949 706 0.7624 - - -
0.6959 707 0.7216 - - -
0.6969 708 0.4666 - - -
0.6978 709 0.9398 - - -
0.6988 710 0.9532 - - -
0.6998 711 0.8285 - - -
0.7008 712 0.7042 - - -
0.7018 713 1.0057 - - -
0.7028 714 0.6696 - - -
0.7037 715 0.5701 - - -
0.7047 716 0.7894 - - -
0.7057 717 0.9327 - - -
0.7067 718 1.0283 - - -
0.7077 719 0.6419 - - -
0.7087 720 0.7716 - - -
0.7096 721 0.5968 - - -
0.7106 722 0.6547 - - -
0.7116 723 0.8344 - - -
0.7126 724 0.5448 - - -
0.7136 725 0.6075 - - -
0.7146 726 0.6142 - - -
0.7156 727 0.681 - - -
0.7165 728 0.6137 - - -
0.7175 729 1.4516 - - -
0.7185 730 0.9083 - - -
0.7195 731 0.6369 - - -
0.7205 732 0.8102 - - -
0.7215 733 0.7808 - - -
0.7224 734 0.512 - - -
0.7234 735 1.3589 - - -
0.7244 736 0.6555 - - -
0.7254 737 0.8732 - - -
0.7264 738 0.7591 - - -
0.7274 739 0.8237 - - -
0.7283 740 1.1272 - - -
0.7293 741 0.5353 - - -
0.7303 742 1.1713 - - -
0.7313 743 0.7034 - - -
0.7323 744 0.5118 - - -
0.7333 745 0.7816 - - -
0.7343 746 0.7935 - - -
0.7352 747 1.0736 - - -
0.7362 748 0.6528 - - -
0.7372 749 0.5556 - - -
0.7382 750 0.3339 - - -
0.7392 751 1.0371 - - -
0.7402 752 1.3273 - - -
0.7411 753 0.5307 - - -
0.7421 754 0.908 - - -
0.7431 755 0.5248 - - -
0.7441 756 0.6128 - - -
0.7451 757 1.5065 - - -
0.7461 758 1.2071 - - -
0.7470 759 0.9983 - - -
0.7480 760 1.0378 - - -
0.7490 761 1.0732 - - -
0.75 762 0.8046 - - -
0.7510 763 1.1666 - - -
0.7520 764 0.7042 - - -
0.7530 765 1.4759 0.5730 0.5105 0.6982
0.7539 766 0.5595 - - -
0.7549 767 0.3646 - - -
0.7559 768 0.6791 - - -
0.7569 769 0.6889 - - -
0.7579 770 0.9715 - - -
0.7589 771 1.2218 - - -
0.7598 772 0.3957 - - -
0.7608 773 1.5635 - - -
0.7618 774 0.9081 - - -
0.7628 775 0.8633 - - -
0.7638 776 0.9482 - - -
0.7648 777 0.8021 - - -
0.7657 778 0.9083 - - -
0.7667 779 0.9599 - - -
0.7677 780 0.798 - - -
0.7687 781 0.5817 - - -
0.7697 782 0.5287 - - -
0.7707 783 1.0251 - - -
0.7717 784 0.3995 - - -
0.7726 785 0.8753 - - -
0.7736 786 1.315 - - -
0.7746 787 0.8053 - - -
0.7756 788 0.6032 - - -
0.7766 789 0.8059 - - -
0.7776 790 0.9387 - - -
0.7785 791 0.8321 - - -
0.7795 792 0.5788 - - -
0.7805 793 0.7874 - - -
0.7815 794 0.5145 - - -
0.7825 795 1.5847 - - -
0.7835 796 1.1473 - - -
0.7844 797 0.968 - - -
0.7854 798 1.2691 - - -
0.7864 799 0.7046 - - -
0.7874 800 0.9358 - - -
0.7884 801 0.3945 - - -
0.7894 802 0.861 - - -
0.7904 803 1.0736 - - -
0.7913 804 1.0279 - - -
0.7923 805 0.6902 - - -
0.7933 806 0.7344 - - -
0.7943 807 0.8184 - - -
0.7953 808 0.8121 - - -
0.7963 809 0.3772 - - -
0.7972 810 0.2367 - - -
0.7982 811 0.5584 - - -
0.7992 812 0.3465 - - -
0.8002 813 0.9229 - - -
0.8012 814 0.7351 - - -
0.8022 815 0.9024 - - -
0.8031 816 0.4958 - - -
0.8041 817 0.4321 - - -
0.8051 818 0.6945 - - -
0.8061 819 0.6415 - - -
0.8071 820 0.5615 - - -
0.8081 821 0.9732 - - -
0.8091 822 1.018 - - -
0.8100 823 1.1392 - - -
0.8110 824 0.8539 - - -
0.8120 825 0.7081 - - -
0.8130 826 1.3176 - - -
0.8140 827 0.7387 - - -
0.8150 828 1.0975 - - -
0.8159 829 0.5883 - - -
0.8169 830 0.9677 - - -
0.8179 831 0.3869 - - -
0.8189 832 0.9083 - - -
0.8199 833 0.6939 - - -
0.8209 834 0.5046 - - -
0.8219 835 0.3594 - - -
0.8228 836 0.4609 - - -
0.8238 837 0.8132 - - -
0.8248 838 0.4493 - - -
0.8258 839 0.8566 - - -
0.8268 840 0.9958 - - -
0.8278 841 0.936 - - -
0.8287 842 1.4087 - - -
0.8297 843 1.2214 - - -
0.8307 844 0.7135 - - -
0.8317 845 1.1881 - - -
0.8327 846 1.1807 - - -
0.8337 847 0.5896 - - -
0.8346 848 0.8483 - - -
0.8356 849 0.5409 - - -
0.8366 850 0.8286 - - -
0.8376 851 1.136 - - -
0.8386 852 0.819 - - -
0.8396 853 0.3029 - - -
0.8406 854 0.5538 - - -
0.8415 855 0.4572 - - -
0.8425 856 1.1516 - - -
0.8435 857 0.4768 - - -
0.8445 858 0.5357 - - -
0.8455 859 0.8879 - - -
0.8465 860 0.9121 - - -
0.8474 861 1.052 - - -
0.8484 862 0.8009 - - -
0.8494 863 0.2474 - - -
0.8504 864 0.8517 - - -
0.8514 865 0.5678 - - -
0.8524 866 0.4882 - - -
0.8533 867 0.638 - - -
0.8543 868 0.8012 - - -
0.8553 869 0.4172 - - -
0.8563 870 0.6727 - - -
0.8573 871 0.6383 - - -
0.8583 872 1.0728 - - -
0.8593 873 1.3336 - - -
0.8602 874 1.217 - - -
0.8612 875 0.8018 - - -
0.8622 876 1.2326 - - -
0.8632 877 1.0898 - - -
0.8642 878 0.7192 - - -
0.8652 879 0.7779 - - -
0.8661 880 0.9291 - - -
0.8671 881 0.6333 - - -
0.8681 882 0.9028 - - -
0.8691 883 0.9904 - - -
0.8701 884 0.5775 - - -
0.8711 885 0.9783 - - -
0.8720 886 1.111 - - -
0.8730 887 1.0398 - - -
0.8740 888 0.742 - - -
0.875 889 0.7606 - - -
0.8760 890 0.6941 - - -
0.8770 891 1.0219 - - -
0.8780 892 0.8949 - - -
0.8789 893 1.0013 - - -
0.8799 894 1.1122 - - -
0.8809 895 1.1711 - - -
0.8819 896 0.7457 - - -
0.8829 897 0.5431 - - -
0.8839 898 0.8159 - - -
0.8848 899 1.6812 - - -
0.8858 900 0.8163 - - -
0.8868 901 0.9315 - - -
0.8878 902 0.883 - - -
0.8888 903 0.8014 - - -
0.8898 904 1.4277 - - -
0.8907 905 0.7507 - - -
0.8917 906 0.9757 - - -
0.8927 907 0.6366 - - -
0.8937 908 0.7567 - - -
0.8947 909 0.7853 - - -
0.8957 910 0.7611 - - -
0.8967 911 0.8417 - - -
0.8976 912 0.5825 - - -
0.8986 913 1.0816 - - -
0.8996 914 1.1768 - - -
0.9006 915 0.4894 - - -
0.9016 916 0.8739 - - -
0.9026 917 1.0047 - - -
0.9035 918 0.7733 0.6115 0.5123 0.7069
0.9045 919 1.1274 - - -
0.9055 920 1.5414 - - -
0.9065 921 1.0839 - - -
0.9075 922 0.5004 - - -
0.9085 923 0.8282 - - -
0.9094 924 0.8041 - - -
0.9104 925 0.6129 - - -
0.9114 926 0.8697 - - -
0.9124 927 0.7138 - - -
0.9134 928 1.0387 - - -
0.9144 929 0.6024 - - -
0.9154 930 0.6641 - - -
0.9163 931 0.8028 - - -
0.9173 932 0.8097 - - -
0.9183 933 0.6162 - - -
0.9193 934 0.4264 - - -
0.9203 935 0.3983 - - -
0.9213 936 0.7519 - - -
0.9222 937 1.0682 - - -
0.9232 938 0.6805 - - -
0.9242 939 1.1305 - - -
0.9252 940 1.5805 - - -
0.9262 941 0.7031 - - -
0.9272 942 0.6847 - - -
0.9281 943 0.3777 - - -
0.9291 944 0.7438 - - -
0.9301 945 1.1754 - - -
0.9311 946 0.6602 - - -
0.9321 947 1.2893 - - -
0.9331 948 0.8315 - - -
0.9341 949 1.0961 - - -
0.9350 950 0.7657 - - -
0.9360 951 0.8915 - - -
0.9370 952 0.6784 - - -
0.9380 953 0.6772 - - -
0.9390 954 0.9102 - - -
0.9400 955 0.8083 - - -
0.9409 956 0.5966 - - -
0.9419 957 1.06 - - -
0.9429 958 1.0412 - - -
0.9439 959 0.9467 - - -
0.9449 960 0.847 - - -
0.9459 961 0.875 - - -
0.9469 962 0.8303 - - -
0.9478 963 0.8641 - - -
0.9488 964 0.6343 - - -
0.9498 965 0.7566 - - -
0.9508 966 0.9767 - - -
0.9518 967 0.7518 - - -
0.9528 968 0.5381 - - -
0.9537 969 0.9959 - - -
0.9547 970 0.8374 - - -
0.9557 971 0.5459 - - -
0.9567 972 0.6633 - - -
0.9577 973 0.9526 - - -
0.9587 974 0.8487 - - -
0.9596 975 0.6572 - - -
0.9606 976 0.8039 - - -
0.9616 977 0.6213 - - -
0.9626 978 0.5483 - - -
0.9636 979 0.516 - - -
0.9646 980 0.8891 - - -
0.9656 981 0.7904 - - -
0.9665 982 1.2282 - - -
0.9675 983 1.0609 - - -
0.9685 984 0.8063 - - -
0.9695 985 0.5294 - - -
0.9705 986 0.7394 - - -
0.9715 987 0.5749 - - -
0.9724 988 0.9125 - - -
0.9734 989 0.961 - - -
0.9744 990 0.5947 - - -
0.9754 991 0.6246 - - -
0.9764 992 0.6492 - - -
0.9774 993 1.0508 - - -
0.9783 994 0.7613 - - -
0.9793 995 0.706 - - -
0.9803 996 0.9458 - - -
0.9813 997 0.836 - - -
0.9823 998 0.7472 - - -
0.9833 999 0.9071 - - -
0.9843 1000 1.0935 - - -
0.9852 1001 0.5822 - - -
0.9862 1002 0.6596 - - -
0.9872 1003 0.5536 - - -
0.9882 1004 1.0741 - - -
0.9892 1005 0.4586 - - -
0.9902 1006 0.9633 - - -
0.9911 1007 0.8739 - - -
0.9921 1008 0.665 - - -
0.9931 1009 0.9486 - - -
0.9941 1010 0.5323 - - -
0.9951 1011 0.6006 - - -
0.9961 1012 0.4568 - - -
0.9970 1013 0.7544 - - -
0.9980 1014 1.2437 - - -
0.9990 1015 0.8985 - - -
1.0 1016 0.3179 - - -
1.0010 1017 1.2519 - - -
1.0020 1018 0.9775 - - -
1.0030 1019 0.8967 - - -
1.0039 1020 0.5918 - - -
1.0049 1021 0.5798 - - -
1.0059 1022 0.5934 - - -
1.0069 1023 0.6872 - - -
1.0079 1024 0.9535 - - -
1.0089 1025 0.4967 - - -
1.0098 1026 1.116 - - -
1.0108 1027 1.0446 - - -
1.0118 1028 0.5189 - - -
1.0128 1029 0.8272 - - -
1.0138 1030 0.4901 - - -
1.0148 1031 0.8926 - - -
1.0157 1032 0.5543 - - -
1.0167 1033 0.2579 - - -
1.0177 1034 0.865 - - -
1.0187 1035 1.0559 - - -
1.0197 1036 0.714 - - -
1.0207 1037 0.5185 - - -
1.0217 1038 1.3209 - - -
1.0226 1039 0.3696 - - -
1.0236 1040 0.4839 - - -
1.0246 1041 0.7694 - - -
1.0256 1042 1.3417 - - -
1.0266 1043 0.8292 - - -
1.0276 1044 0.3835 - - -
1.0285 1045 0.3755 - - -
1.0295 1046 0.6396 - - -
1.0305 1047 0.364 - - -
1.0315 1048 0.9232 - - -
1.0325 1049 0.6706 - - -
1.0335 1050 0.8329 - - -
1.0344 1051 0.4093 - - -
1.0354 1052 0.7251 - - -
1.0364 1053 0.9254 - - -
1.0374 1054 0.6365 - - -
1.0384 1055 0.7136 - - -
1.0394 1056 0.6278 - - -
1.0404 1057 0.6769 - - -
1.0413 1058 0.7704 - - -
1.0423 1059 0.7701 - - -
1.0433 1060 0.9639 - - -
1.0443 1061 0.7518 - - -
1.0453 1062 0.7474 - - -
1.0463 1063 1.2196 - - -
1.0472 1064 1.1476 - - -
1.0482 1065 0.7139 - - -
1.0492 1066 1.0979 - - -
1.0502 1067 0.7113 - - -
1.0512 1068 0.6796 - - -
1.0522 1069 1.1551 - - -
1.0531 1070 0.7947 - - -
1.0541 1071 0.6783 0.6381 0.4883 0.7054
1.0551 1072 0.7295 - - -
1.0561 1073 0.4529 - - -
1.0571 1074 0.2625 - - -
1.0581 1075 1.2557 - - -
1.0591 1076 0.4826 - - -
1.0600 1077 1.2309 - - -
1.0610 1078 0.4545 - - -
1.0620 1079 0.6658 - - -
1.0630 1080 0.8651 - - -
1.0640 1081 1.0702 - - -
1.0650 1082 1.0225 - - -
1.0659 1083 0.6952 - - -
1.0669 1084 1.0918 - - -
1.0679 1085 0.5222 - - -
1.0689 1086 1.1311 - - -
1.0699 1087 0.6464 - - -
1.0709 1088 0.801 - - -
1.0719 1089 0.3743 - - -
1.0728 1090 0.6204 - - -
1.0738 1091 0.5369 - - -
1.0748 1092 0.6895 - - -
1.0758 1093 1.2127 - - -
1.0768 1094 0.8397 - - -
1.0778 1095 0.3694 - - -
1.0787 1096 0.7576 - - -
1.0797 1097 0.9494 - - -
1.0807 1098 0.8477 - - -
1.0817 1099 0.4869 - - -
1.0827 1100 0.6753 - - -
1.0837 1101 0.7351 - - -
1.0846 1102 0.8257 - - -
1.0856 1103 1.144 - - -
1.0866 1104 0.5526 - - -
1.0876 1105 0.6686 - - -
1.0886 1106 0.7683 - - -
1.0896 1107 0.3337 - - -
1.0906 1108 0.8686 - - -
1.0915 1109 0.66 - - -
1.0925 1110 0.9329 - - -
1.0935 1111 0.5129 - - -
1.0945 1112 0.8907 - - -
1.0955 1113 0.6305 - - -
1.0965 1114 0.3746 - - -
1.0974 1115 0.6791 - - -
1.0984 1116 1.4172 - - -
1.0994 1117 0.5184 - - -
1.1004 1118 0.4803 - - -
1.1014 1119 0.647 - - -
1.1024 1120 0.3867 - - -
1.1033 1121 0.4965 - - -
1.1043 1122 0.9508 - - -
1.1053 1123 0.709 - - -
1.1063 1124 0.9625 - - -
1.1073 1125 0.7724 - - -
1.1083 1126 0.4318 - - -
1.1093 1127 0.6653 - - -
1.1102 1128 0.781 - - -
1.1112 1129 0.6985 - - -
1.1122 1130 1.2949 - - -
1.1132 1131 0.6845 - - -
1.1142 1132 0.7022 - - -
1.1152 1133 0.8645 - - -
1.1161 1134 0.5331 - - -
1.1171 1135 0.5636 - - -
1.1181 1136 0.4511 - - -
1.1191 1137 0.605 - - -
1.1201 1138 0.6367 - - -
1.1211 1139 1.1988 - - -
1.1220 1140 0.5973 - - -
1.1230 1141 0.8703 - - -
1.1240 1142 0.48 - - -
1.125 1143 0.8903 - - -
1.1260 1144 0.6978 - - -
1.1270 1145 0.8008 - - -
1.1280 1146 0.9583 - - -
1.1289 1147 0.5823 - - -
1.1299 1148 0.974 - - -
1.1309 1149 1.1149 - - -
1.1319 1150 0.5558 - - -
1.1329 1151 0.9106 - - -
1.1339 1152 0.671 - - -
1.1348 1153 0.9424 - - -
1.1358 1154 0.6605 - - -
1.1368 1155 0.536 - - -
1.1378 1156 0.38 - - -
1.1388 1157 0.6255 - - -
1.1398 1158 1.435 - - -
1.1407 1159 0.5903 - - -
1.1417 1160 0.9517 - - -
1.1427 1161 0.4636 - - -
1.1437 1162 0.3721 - - -
1.1447 1163 0.6832 - - -
1.1457 1164 0.5893 - - -
1.1467 1165 0.6226 - - -
1.1476 1166 0.5715 - - -
1.1486 1167 1.0958 - - -
1.1496 1168 0.689 - - -
1.1506 1169 0.6443 - - -
1.1516 1170 0.9278 - - -
1.1526 1171 0.7009 - - -
1.1535 1172 0.6254 - - -
1.1545 1173 0.7537 - - -
1.1555 1174 0.7559 - - -
1.1565 1175 0.8148 - - -
1.1575 1176 0.2994 - - -
1.1585 1177 0.2148 - - -
1.1594 1178 1.0492 - - -
1.1604 1179 0.5191 - - -
1.1614 1180 1.3508 - - -
1.1624 1181 1.0396 - - -
1.1634 1182 0.5003 - - -
1.1644 1183 0.7736 - - -
1.1654 1184 0.3263 - - -
1.1663 1185 1.2059 - - -
1.1673 1186 1.3514 - - -
1.1683 1187 0.9086 - - -
1.1693 1188 0.7245 - - -
1.1703 1189 0.717 - - -
1.1713 1190 0.6741 - - -
1.1722 1191 1.2695 - - -
1.1732 1192 0.406 - - -
1.1742 1193 0.5408 - - -
1.1752 1194 0.4256 - - -
1.1762 1195 0.4827 - - -
1.1772 1196 0.5621 - - -
1.1781 1197 0.6734 - - -
1.1791 1198 0.4685 - - -
1.1801 1199 0.859 - - -
1.1811 1200 1.4388 - - -
1.1821 1201 1.1846 - - -
1.1831 1202 0.6824 - - -
1.1841 1203 1.2064 - - -
1.1850 1204 0.5668 - - -
1.1860 1205 0.5286 - - -
1.1870 1206 0.6316 - - -
1.1880 1207 0.8011 - - -
1.1890 1208 0.66 - - -
1.1900 1209 0.3735 - - -
1.1909 1210 0.7461 - - -
1.1919 1211 0.5508 - - -
1.1929 1212 0.6379 - - -
1.1939 1213 0.5204 - - -
1.1949 1214 0.7522 - - -
1.1959 1215 0.9415 - - -
1.1969 1216 0.6351 - - -
1.1978 1217 0.262 - - -
1.1988 1218 0.9048 - - -
1.1998 1219 0.6562 - - -
1.2008 1220 0.4502 - - -
1.2018 1221 0.7093 - - -
1.2028 1222 0.3464 - - -
1.2037 1223 0.7775 - - -
1.2047 1224 0.8652 0.6533 0.4990 0.6691
1.2057 1225 0.3028 - - -
1.2067 1226 0.5766 - - -
1.2077 1227 0.6506 - - -
1.2087 1228 0.7059 - - -
1.2096 1229 0.6542 - - -
1.2106 1230 0.6426 - - -
1.2116 1231 0.5838 - - -
1.2126 1232 0.198 - - -
1.2136 1233 0.5243 - - -
1.2146 1234 0.9086 - - -
1.2156 1235 0.769 - - -
1.2165 1236 0.5532 - - -
1.2175 1237 0.8527 - - -
1.2185 1238 0.3036 - - -
1.2195 1239 0.5552 - - -
1.2205 1240 0.6249 - - -
1.2215 1241 0.44 - - -
1.2224 1242 0.7128 - - -
1.2234 1243 0.3144 - - -
1.2244 1244 0.6626 - - -
1.2254 1245 1.4887 - - -
1.2264 1246 0.5861 - - -
1.2274 1247 0.5891 - - -
1.2283 1248 0.4475 - - -
1.2293 1249 0.6157 - - -
1.2303 1250 0.7397 - - -
1.2313 1251 0.3894 - - -
1.2323 1252 0.925 - - -
1.2333 1253 0.5181 - - -
1.2343 1254 0.8689 - - -
1.2352 1255 0.6174 - - -
1.2362 1256 0.891 - - -
1.2372 1257 0.2898 - - -
1.2382 1258 0.7739 - - -
1.2392 1259 0.3945 - - -
1.2402 1260 0.7109 - - -
1.2411 1261 0.7379 - - -
1.2421 1262 0.4401 - - -
1.2431 1263 1.2032 - - -
1.2441 1264 0.344 - - -
1.2451 1265 0.3768 - - -
1.2461 1266 0.5961 - - -
1.2470 1267 0.9748 - - -
1.2480 1268 0.3169 - - -
1.2490 1269 1.0254 - - -
1.25 1270 0.2742 - - -
1.2510 1271 0.7498 - - -
1.2520 1272 0.2447 - - -
1.2530 1273 0.424 - - -
1.2539 1274 0.5414 - - -
1.2549 1275 0.5313 - - -
1.2559 1276 0.8376 - - -
1.2569 1277 0.2828 - - -
1.2579 1278 0.3722 - - -
1.2589 1279 1.2595 - - -
1.2598 1280 0.6573 - - -
1.2608 1281 0.6975 - - -
1.2618 1282 0.4378 - - -
1.2628 1283 0.8803 - - -
1.2638 1284 0.7752 - - -
1.2648 1285 0.5218 - - -
1.2657 1286 0.3086 - - -
1.2667 1287 0.6541 - - -
1.2677 1288 1.1188 - - -
1.2687 1289 0.6281 - - -
1.2697 1290 0.9196 - - -
1.2707 1291 0.7273 - - -
1.2717 1292 0.6554 - - -
1.2726 1293 0.5857 - - -
1.2736 1294 0.7422 - - -
1.2746 1295 1.1301 - - -
1.2756 1296 0.8172 - - -
1.2766 1297 0.3944 - - -
1.2776 1298 0.7966 - - -
1.2785 1299 0.9283 - - -
1.2795 1300 0.7327 - - -
1.2805 1301 0.4587 - - -
1.2815 1302 0.5131 - - -
1.2825 1303 0.6072 - - -
1.2835 1304 0.6406 - - -
1.2844 1305 0.9822 - - -
1.2854 1306 0.7562 - - -
1.2864 1307 0.7944 - - -
1.2874 1308 0.9248 - - -
1.2884 1309 1.5515 - - -
1.2894 1310 0.5007 - - -
1.2904 1311 0.6497 - - -
1.2913 1312 0.6082 - - -
1.2923 1313 1.0414 - - -
1.2933 1314 0.9682 - - -
1.2943 1315 0.661 - - -
1.2953 1316 0.7459 - - -
1.2963 1317 0.732 - - -
1.2972 1318 0.2937 - - -
1.2982 1319 0.5876 - - -
1.2992 1320 0.5333 - - -
1.3002 1321 0.3876 - - -
1.3012 1322 0.3384 - - -
1.3022 1323 0.3552 - - -
1.3031 1324 0.3188 - - -
1.3041 1325 0.369 - - -
1.3051 1326 0.6256 - - -
1.3061 1327 1.1555 - - -
1.3071 1328 1.1316 - - -
1.3081 1329 0.7225 - - -
1.3091 1330 0.5129 - - -
1.3100 1331 0.2852 - - -
1.3110 1332 0.8099 - - -
1.3120 1333 0.3671 - - -
1.3130 1334 0.6209 - - -
1.3140 1335 0.583 - - -
1.3150 1336 1.0555 - - -
1.3159 1337 0.9641 - - -
1.3169 1338 0.7796 - - -
1.3179 1339 0.2689 - - -
1.3189 1340 0.3622 - - -
1.3199 1341 0.5872 - - -
1.3209 1342 0.7294 - - -
1.3219 1343 0.659 - - -
1.3228 1344 0.2835 - - -
1.3238 1345 0.3833 - - -
1.3248 1346 0.3645 - - -
1.3258 1347 0.2585 - - -
1.3268 1348 0.4054 - - -
1.3278 1349 0.8016 - - -
1.3287 1350 0.3908 - - -
1.3297 1351 0.626 - - -
1.3307 1352 0.3939 - - -
1.3317 1353 0.5573 - - -
1.3327 1354 0.8803 - - -
1.3337 1355 1.4102 - - -
1.3346 1356 0.7607 - - -
1.3356 1357 0.5631 - - -
1.3366 1358 0.3118 - - -
1.3376 1359 0.7924 - - -
1.3386 1360 0.4304 - - -
1.3396 1361 0.5641 - - -
1.3406 1362 0.3679 - - -
1.3415 1363 0.4429 - - -
1.3425 1364 0.7248 - - -
1.3435 1365 0.4378 - - -
1.3445 1366 0.7427 - - -
1.3455 1367 0.7427 - - -
1.3465 1368 0.2993 - - -
1.3474 1369 0.5475 - - -
1.3484 1370 0.8495 - - -
1.3494 1371 0.6147 - - -
1.3504 1372 0.67 - - -
1.3514 1373 0.3701 - - -
1.3524 1374 0.2513 - - -
1.3533 1375 0.9747 - - -
1.3543 1376 0.5877 - - -
1.3553 1377 0.8615 0.6700 0.4979 0.7109
1.3563 1378 0.6374 - - -
1.3573 1379 0.3782 - - -
1.3583 1380 0.1647 - - -
1.3593 1381 1.1326 - - -
1.3602 1382 0.6177 - - -
1.3612 1383 0.4708 - - -
1.3622 1384 0.75 - - -
1.3632 1385 0.985 - - -
1.3642 1386 0.5775 - - -
1.3652 1387 0.738 - - -
1.3661 1388 0.6335 - - -
1.3671 1389 0.3397 - - -
1.3681 1390 0.5236 - - -
1.3691 1391 0.3185 - - -
1.3701 1392 0.4717 - - -
1.3711 1393 0.6093 - - -
1.3720 1394 0.3521 - - -
1.3730 1395 1.0617 - - -
1.3740 1396 0.5693 - - -
1.3750 1397 0.8272 - - -
1.3760 1398 0.3983 - - -
1.3770 1399 1.1205 - - -
1.3780 1400 0.2393 - - -
1.3789 1401 0.465 - - -
1.3799 1402 0.457 - - -
1.3809 1403 0.921 - - -
1.3819 1404 0.4724 - - -
1.3829 1405 0.598 - - -
1.3839 1406 0.8975 - - -
1.3848 1407 0.6914 - - -
1.3858 1408 0.6997 - - -
1.3868 1409 0.6738 - - -
1.3878 1410 0.3945 - - -
1.3888 1411 0.5746 - - -
1.3898 1412 0.3389 - - -
1.3907 1413 0.5674 - - -
1.3917 1414 0.4476 - - -
1.3927 1415 0.4308 - - -
1.3937 1416 1.2597 - - -
1.3947 1417 0.6179 - - -
1.3957 1418 0.4679 - - -
1.3967 1419 0.5284 - - -
1.3976 1420 1.0125 - - -
1.3986 1421 0.4006 - - -
1.3996 1422 0.7757 - - -
1.4006 1423 0.6065 - - -
1.4016 1424 0.5428 - - -
1.4026 1425 0.32 - - -
1.4035 1426 0.5435 - - -
1.4045 1427 0.3498 - - -
1.4055 1428 0.5562 - - -
1.4065 1429 0.7056 - - -
1.4075 1430 0.3809 - - -
1.4085 1431 0.9798 - - -
1.4094 1432 1.0254 - - -
1.4104 1433 0.7483 - - -
1.4114 1434 0.3944 - - -
1.4124 1435 0.5626 - - -
1.4134 1436 0.4553 - - -
1.4144 1437 0.4601 - - -
1.4154 1438 0.734 - - -
1.4163 1439 0.3654 - - -
1.4173 1440 0.8148 - - -
1.4183 1441 0.4998 - - -
1.4193 1442 0.4505 - - -
1.4203 1443 0.4764 - - -
1.4213 1444 0.4486 - - -
1.4222 1445 0.3752 - - -
1.4232 1446 0.428 - - -
1.4242 1447 0.3955 - - -
1.4252 1448 0.6988 - - -
1.4262 1449 0.7233 - - -
1.4272 1450 1.2082 - - -
1.4281 1451 0.4325 - - -
1.4291 1452 0.8255 - - -
1.4301 1453 0.6323 - - -
1.4311 1454 0.2209 - - -
1.4321 1455 0.4348 - - -
1.4331 1456 0.6173 - - -
1.4341 1457 0.5231 - - -
1.4350 1458 0.9232 - - -
1.4360 1459 0.6264 - - -
1.4370 1460 0.5722 - - -
1.4380 1461 0.5074 - - -
1.4390 1462 0.3808 - - -
1.4400 1463 0.5676 - - -
1.4409 1464 0.4172 - - -
1.4419 1465 0.4354 - - -
1.4429 1466 0.6341 - - -
1.4439 1467 0.5613 - - -
1.4449 1468 0.6067 - - -
1.4459 1469 0.5303 - - -
1.4469 1470 0.7982 - - -
1.4478 1471 0.8807 - - -
1.4488 1472 0.3024 - - -
1.4498 1473 0.3424 - - -
1.4508 1474 0.5359 - - -
1.4518 1475 0.315 - - -
1.4528 1476 0.8719 - - -
1.4537 1477 0.6459 - - -
1.4547 1478 0.2313 - - -
1.4557 1479 0.7386 - - -
1.4567 1480 0.7812 - - -
1.4577 1481 0.3368 - - -
1.4587 1482 0.7843 - - -
1.4596 1483 0.3082 - - -
1.4606 1484 0.465 - - -
1.4616 1485 0.2456 - - -
1.4626 1486 0.3764 - - -
1.4636 1487 0.2452 - - -
1.4646 1488 0.3893 - - -
1.4656 1489 0.5935 - - -
1.4665 1490 0.6697 - - -
1.4675 1491 0.4737 - - -
1.4685 1492 0.5339 - - -
1.4695 1493 0.1327 - - -
1.4705 1494 0.7115 - - -
1.4715 1495 1.0257 - - -
1.4724 1496 0.6444 - - -
1.4734 1497 0.2697 - - -
1.4744 1498 0.7672 - - -
1.4754 1499 0.6761 - - -
1.4764 1500 1.3402 - - -
1.4774 1501 0.7029 - - -
1.4783 1502 0.6435 - - -
1.4793 1503 0.5042 - - -
1.4803 1504 0.3626 - - -
1.4813 1505 0.6935 - - -
1.4823 1506 0.7811 - - -
1.4833 1507 0.4106 - - -
1.4843 1508 0.7377 - - -
1.4852 1509 0.4939 - - -
1.4862 1510 0.1617 - - -
1.4872 1511 0.6942 - - -
1.4882 1512 0.7478 - - -
1.4892 1513 0.4271 - - -
1.4902 1514 0.5239 - - -
1.4911 1515 0.8726 - - -
1.4921 1516 0.6608 - - -
1.4931 1517 0.4105 - - -
1.4941 1518 0.5149 - - -
1.4951 1519 0.7481 - - -
1.4961 1520 0.4546 - - -
1.4970 1521 0.5731 - - -
1.4980 1522 0.8964 - - -
1.4990 1523 0.882 - - -
1.5000 1524 0.5065 - - -
1.5010 1525 0.6484 - - -
1.5020 1526 0.4722 - - -
1.5030 1527 0.8768 - - -
1.5039 1528 0.8905 - - -
1.5049 1529 0.4441 - - -
1.5059 1530 0.4498 0.6835 0.5259 0.7011
1.5069 1531 0.7657 - - -
1.5079 1532 0.544 - - -
1.5089 1533 0.35 - - -
1.5098 1534 0.3211 - - -
1.5108 1535 0.4795 - - -
1.5118 1536 0.4776 - - -
1.5128 1537 0.4259 - - -
1.5138 1538 0.1436 - - -
1.5148 1539 0.5271 - - -
1.5157 1540 0.4446 - - -
1.5167 1541 0.3741 - - -
1.5177 1542 0.5234 - - -
1.5187 1543 0.3489 - - -
1.5197 1544 0.3911 - - -
1.5207 1545 0.5522 - - -
1.5217 1546 0.2651 - - -
1.5226 1547 0.4473 - - -
1.5236 1548 0.4625 - - -
1.5246 1549 0.6767 - - -
1.5256 1550 0.4444 - - -
1.5266 1551 0.3958 - - -
1.5276 1552 0.3662 - - -
1.5285 1553 0.6872 - - -
1.5295 1554 0.6356 - - -
1.5305 1555 0.5265 - - -
1.5315 1556 0.596 - - -
1.5325 1557 0.1691 - - -
1.5335 1558 1.0575 - - -
1.5344 1559 1.1115 - - -
1.5354 1560 0.7607 - - -
1.5364 1561 0.5262 - - -
1.5374 1562 0.7648 - - -
1.5384 1563 0.5331 - - -
1.5394 1564 0.4568 - - -
1.5404 1565 0.4017 - - -
1.5413 1566 0.891 - - -
1.5423 1567 0.3974 - - -
1.5433 1568 0.8785 - - -
1.5443 1569 0.2573 - - -
1.5453 1570 0.6737 - - -
1.5463 1571 0.3374 - - -
1.5472 1572 0.2978 - - -
1.5482 1573 0.6462 - - -
1.5492 1574 0.1323 - - -
1.5502 1575 0.2862 - - -
1.5512 1576 0.4854 - - -
1.5522 1577 0.4033 - - -
1.5531 1578 0.7233 - - -
1.5541 1579 0.5184 - - -
1.5551 1580 0.9382 - - -
1.5561 1581 0.6416 - - -
1.5571 1582 0.6853 - - -
1.5581 1583 0.604 - - -
1.5591 1584 0.498 - - -
1.5600 1585 0.3023 - - -
1.5610 1586 0.2496 - - -
1.5620 1587 0.8291 - - -
1.5630 1588 0.4926 - - -
1.5640 1589 0.6343 - - -
1.5650 1590 0.569 - - -
1.5659 1591 0.21 - - -
1.5669 1592 0.3462 - - -
1.5679 1593 0.6391 - - -
1.5689 1594 0.2757 - - -
1.5699 1595 0.4453 - - -
1.5709 1596 0.3759 - - -
1.5719 1597 0.5071 - - -
1.5728 1598 0.1958 - - -
1.5738 1599 0.3417 - - -
1.5748 1600 0.4496 - - -
1.5758 1601 0.5615 - - -
1.5768 1602 0.4346 - - -
1.5778 1603 0.3385 - - -
1.5787 1604 0.6684 - - -
1.5797 1605 0.3634 - - -
1.5807 1606 0.6762 - - -
1.5817 1607 0.3477 - - -
1.5827 1608 0.5679 - - -
1.5837 1609 0.7458 - - -
1.5846 1610 0.7871 - - -
1.5856 1611 0.3641 - - -
1.5866 1612 0.6641 - - -
1.5876 1613 1.1934 - - -
1.5886 1614 0.3024 - - -
1.5896 1615 0.3846 - - -
1.5906 1616 0.6288 - - -
1.5915 1617 0.4733 - - -
1.5925 1618 0.2775 - - -
1.5935 1619 0.6975 - - -
1.5945 1620 0.317 - - -
1.5955 1621 0.5374 - - -
1.5965 1622 0.8305 - - -
1.5974 1623 0.56 - - -
1.5984 1624 0.5452 - - -
1.5994 1625 0.602 - - -
1.6004 1626 0.7525 - - -
1.6014 1627 0.3503 - - -
1.6024 1628 0.3675 - - -
1.6033 1629 0.7333 - - -
1.6043 1630 0.4482 - - -
1.6053 1631 0.315 - - -
1.6063 1632 0.2173 - - -
1.6073 1633 1.0407 - - -
1.6083 1634 0.5099 - - -
1.6093 1635 0.8723 - - -
1.6102 1636 0.2828 - - -
1.6112 1637 0.3407 - - -
1.6122 1638 0.2632 - - -
1.6132 1639 0.2201 - - -
1.6142 1640 0.3298 - - -
1.6152 1641 0.2171 - - -
1.6161 1642 0.418 - - -
1.6171 1643 0.8353 - - -
1.6181 1644 0.4935 - - -
1.6191 1645 0.5569 - - -
1.6201 1646 0.2752 - - -
1.6211 1647 0.3603 - - -
1.6220 1648 0.4289 - - -
1.6230 1649 0.6225 - - -
1.6240 1650 0.3835 - - -
1.6250 1651 0.1686 - - -
1.6260 1652 0.5061 - - -
1.6270 1653 0.2085 - - -
1.6280 1654 0.3367 - - -
1.6289 1655 0.5719 - - -
1.6299 1656 1.073 - - -
1.6309 1657 0.1355 - - -
1.6319 1658 0.4165 - - -
1.6329 1659 0.831 - - -
1.6339 1660 0.4592 - - -
1.6348 1661 0.4908 - - -
1.6358 1662 0.7708 - - -
1.6368 1663 0.5747 - - -
1.6378 1664 0.2344 - - -
1.6388 1665 0.4998 - - -
1.6398 1666 0.1832 - - -
1.6407 1667 0.4653 - - -
1.6417 1668 0.2594 - - -
1.6427 1669 0.4378 - - -
1.6437 1670 0.3691 - - -
1.6447 1671 0.2519 - - -
1.6457 1672 0.5589 - - -
1.6467 1673 0.2041 - - -
1.6476 1674 0.3355 - - -
1.6486 1675 0.6531 - - -
1.6496 1676 0.2044 - - -
1.6506 1677 0.4636 - - -
1.6516 1678 0.4883 - - -
1.6526 1679 1.1288 - - -
1.6535 1680 0.5687 - - -
1.6545 1681 0.6162 - - -
1.6555 1682 0.2443 - - -
1.6565 1683 0.9008 0.6929 0.5262 0.7029
1.6575 1684 0.344 - - -
1.6585 1685 0.2586 - - -
1.6594 1686 0.6468 - - -
1.6604 1687 0.8279 - - -
1.6614 1688 0.3112 - - -
1.6624 1689 0.6875 - - -
1.6634 1690 0.231 - - -
1.6644 1691 0.235 - - -
1.6654 1692 0.347 - - -
1.6663 1693 0.5593 - - -
1.6673 1694 0.4908 - - -
1.6683 1695 0.3652 - - -
1.6693 1696 0.3771 - - -
1.6703 1697 0.3547 - - -
1.6713 1698 0.3422 - - -
1.6722 1699 0.5911 - - -
1.6732 1700 0.5544 - - -
1.6742 1701 0.295 - - -
1.6752 1702 0.4211 - - -
1.6762 1703 0.3966 - - -
1.6772 1704 0.7783 - - -
1.6781 1705 0.7513 - - -
1.6791 1706 0.7145 - - -
1.6801 1707 0.8779 - - -
1.6811 1708 0.2505 - - -
1.6821 1709 0.7892 - - -
1.6831 1710 0.4353 - - -
1.6841 1711 0.5166 - - -
1.6850 1712 0.946 - - -
1.6860 1713 0.3011 - - -
1.6870 1714 0.7152 - - -
1.6880 1715 0.1964 - - -
1.6890 1716 0.4533 - - -
1.6900 1717 0.2566 - - -
1.6909 1718 0.5991 - - -
1.6919 1719 0.8419 - - -
1.6929 1720 0.3322 - - -
1.6939 1721 0.9387 - - -
1.6949 1722 0.6228 - - -
1.6959 1723 0.4324 - - -
1.6969 1724 0.1489 - - -
1.6978 1725 0.6311 - - -
1.6988 1726 0.8488 - - -
1.6998 1727 0.2451 - - -
1.7008 1728 0.6744 - - -
1.7018 1729 0.3383 - - -
1.7028 1730 0.3949 - - -
1.7037 1731 0.4743 - - -
1.7047 1732 0.3058 - - -
1.7057 1733 0.5883 - - -
1.7067 1734 0.8702 - - -
1.7077 1735 0.3621 - - -
1.7087 1736 0.7945 - - -
1.7096 1737 0.1308 - - -
1.7106 1738 0.2788 - - -
1.7116 1739 0.7254 - - -
1.7126 1740 0.2491 - - -
1.7136 1741 0.4208 - - -
1.7146 1742 0.4208 - - -
1.7156 1743 0.4741 - - -
1.7165 1744 0.4124 - - -
1.7175 1745 0.5402 - - -
1.7185 1746 0.4638 - - -
1.7195 1747 0.2356 - - -
1.7205 1748 0.4948 - - -
1.7215 1749 0.3384 - - -
1.7224 1750 0.2449 - - -
1.7234 1751 0.6048 - - -
1.7244 1752 0.3199 - - -
1.7254 1753 0.4052 - - -
1.7264 1754 0.2473 - - -
1.7274 1755 0.6658 - - -
1.7283 1756 0.4531 - - -
1.7293 1757 0.6038 - - -
1.7303 1758 0.6887 - - -
1.7313 1759 0.4222 - - -
1.7323 1760 0.068 - - -
1.7333 1761 0.4014 - - -
1.7343 1762 0.2613 - - -
1.7352 1763 0.6287 - - -
1.7362 1764 0.462 - - -
1.7372 1765 0.3476 - - -
1.7382 1766 0.2576 - - -
1.7392 1767 0.8844 - - -
1.7402 1768 0.6081 - - -
1.7411 1769 0.1292 - - -
1.7421 1770 0.6448 - - -
1.7431 1771 0.2812 - - -
1.7441 1772 0.2312 - - -
1.7451 1773 1.1008 - - -
1.7461 1774 0.7085 - - -
1.7470 1775 0.4462 - - -
1.7480 1776 0.528 - - -
1.7490 1777 0.7114 - - -
1.7500 1778 0.2463 - - -
1.7510 1779 0.4123 - - -
1.7520 1780 0.383 - - -
1.7530 1781 1.4599 - - -
1.7539 1782 0.2778 - - -
1.7549 1783 0.2371 - - -
1.7559 1784 0.2167 - - -
1.7569 1785 0.3909 - - -
1.7579 1786 0.5294 - - -
1.7589 1787 0.5687 - - -
1.7598 1788 0.1999 - - -
1.7608 1789 0.5501 - - -
1.7618 1790 0.71 - - -
1.7628 1791 0.3877 - - -
1.7638 1792 0.4649 - - -
1.7648 1793 0.719 - - -
1.7657 1794 0.2317 - - -
1.7667 1795 0.7701 - - -
1.7677 1796 0.1972 - - -
1.7687 1797 0.0844 - - -
1.7697 1798 0.2223 - - -
1.7707 1799 0.6739 - - -
1.7717 1800 0.4762 - - -
1.7726 1801 0.4405 - - -
1.7736 1802 0.7204 - - -
1.7746 1803 0.5104 - - -
1.7756 1804 0.2602 - - -
1.7766 1805 0.3529 - - -
1.7776 1806 0.3906 - - -
1.7785 1807 0.4859 - - -
1.7795 1808 0.5454 - - -
1.7805 1809 0.4257 - - -
1.7815 1810 0.4074 - - -
1.7825 1811 0.6397 - - -
1.7835 1812 0.5058 - - -
1.7844 1813 0.6833 - - -
1.7854 1814 0.6613 - - -
1.7864 1815 0.6993 - - -
1.7874 1816 0.4406 - - -
1.7884 1817 0.2914 - - -
1.7894 1818 0.4076 - - -
1.7904 1819 0.8532 - - -
1.7913 1820 0.5586 - - -
1.7923 1821 0.629 - - -
1.7933 1822 0.5267 - - -
1.7943 1823 0.4557 - - -
1.7953 1824 0.2006 - - -
1.7963 1825 0.3406 - - -
1.7972 1826 0.1683 - - -
1.7982 1827 0.2141 - - -
1.7992 1828 0.0945 - - -
1.8002 1829 0.4701 - - -
1.8012 1830 0.4771 - - -
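
In the log above, the training loss is recorded at every optimization step, while the three evaluation columns are only populated periodically (steps 1224, 1377, 1530, and 1683 in this excerpt, i.e. every 153 steps). As a minimal, hedged sketch, a cadence like this would correspond to Sentence Transformers training arguments along the following lines; the output path and exact values are assumptions for illustration, not the settings confirmed for this checkpoint:

# Hedged sketch (not taken from this card): arguments that would reproduce
# the logging cadence seen in the table above.
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="checkpoints/step-1830",  # hypothetical output path
    logging_steps=1,         # training loss is recorded at every step in the log
    eval_strategy="steps",
    eval_steps=153,          # evaluation columns appear every 153 steps here
    save_steps=153,
)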

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.2.1
  • Transformers: 4.44.2
  • PyTorch: 2.5.0+cu121
  • Accelerate: 0.34.2
  • Datasets: 3.0.2
  • Tokenizers: 0.19.1
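
To check that a local environment matches the versions listed above, a small helper like the following can be used; this snippet is not part of the original card and only prints the installed versions for comparison:

import platform

import accelerate
import datasets
import sentence_transformers
import tokenizers
import torch
import transformers

# Print installed versions so they can be compared against the list above.
print("Python:", platform.python_version())
print("Sentence Transformers:", sentence_transformers.__version__)
print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Accelerate:", accelerate.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)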

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

GISTEmbedLoss

@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
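
For reference, GISTEmbedLoss (cited above) ships with Sentence Transformers. A minimal, hedged sketch of how it is typically constructed is shown below; the model and guide names are placeholders, not the ones used for this checkpoint:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import GISTEmbedLoss

# Placeholder models for illustration only: the model being fine-tuned and
# the stronger guide model that selects in-batch negatives.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
guide = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
loss = GISTEmbedLoss(model=model, guide=guide)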