---
base_model: microsoft/deberta-v3-small
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
- cosine_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
- dot_accuracy
- dot_accuracy_threshold
- dot_f1
- dot_f1_threshold
- dot_precision
- dot_recall
- dot_ap
- manhattan_accuracy
- manhattan_accuracy_threshold
- manhattan_f1
- manhattan_f1_threshold
- manhattan_precision
- manhattan_recall
- manhattan_ap
- euclidean_accuracy
- euclidean_accuracy_threshold
- euclidean_f1
- euclidean_f1_threshold
- euclidean_precision
- euclidean_recall
- euclidean_ap
- max_accuracy
- max_accuracy_threshold
- max_f1
- max_f1_threshold
- max_precision
- max_recall
- max_ap
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:32500
- loss:GISTEmbedLoss
widget:
- source_sentence: Fish hatch into larvae that are different from the adult form of
species.
sentences:
- Fish hatch into larvae that are different from the adult form of?
- amphibians hatch from eggs
- A solenoid or coil wrapped around iron or certain other metals can form a(n) electromagnet.
- source_sentence: About 200 countries and territories have reported coronavirus cases
in 2020 .
sentences:
- All-Time Olympic Games Medal Tally Analysis Home > Events > Olympics > Summer
> Medal Tally > All-Time All-Time Olympic Games Medal Tally (Summer Olympics)
Which country is the most successful at he Olympic Games? Here are the top ranked
countries in terms of total medals won when all of the summer Games are considered
(including the 2016 Rio Games). There are two tables presented, the first just
lists the top countries based on the total medals won, the second table factors
in how many Olympic Games the country appeared, averaging the total number of
medals per Olympiad. A victory in a team sport is counted as one medal. The USA
Has Won the Most Medals The US have clearly won the most gold medals and the most
medals overall, more than doubling the next ranked country (these figures include
medals won in Rio 2016). Second placed USSR had fewer appearances at the Olympics,
and actually won more medals on average (see the 2nd table). The top 10 includes
one country no longer in existence (the Soviet Union), so their medal totals will
obviously not increase, however China is expected to continue a rapid rise up
the ranks. With the addition of the 2016 data, China has moved up from 11th (in
2008) to 9th (2012) to 7th (2016). The country which has attended the most games
without a medal is Monaco (20 Olympic Games), the country which has won the most
medals without winning a gold medal is Malaysia (0 gold, 7 silver, 4 bronze).
rank
- An example of a reproductive behavior is salmon returning to their birthplace
to lay their eggs
- more than 664,000 cases of COVID-19 have been reported in over 190 countries and
territories , resulting in approximately 30,800 deaths .
- source_sentence: The wave on a guitar string is transverse. the sound wave rattles
a sheet of paper in a direction that shows the sound wave is what?
sentences:
- A Honda motorcycle parked in a grass driveway
- In Panama tipping is a question of rewarding good service rather than an obligation.
Restaurant bills don't include gratuities; adding 10% is customary. Bellhops and
maids expect tips only in more expensive hotels, and $1–$2 per bag is the norm.
You should also give a tip of up to $10 per day to tour guides.
- Figure 16.33 The wave on a guitar string is transverse. The sound wave rattles
a sheet of paper in a direction that shows the sound wave is longitudinal.
- source_sentence: The thermal production of a stove is generically used for
sentences:
- In total , 28 US victims were killed , while Viet Cong losses were killed 345
and a further 192 estimated killed .
- a stove generates heat for cooking usually
- A teenager has been charged over an incident in which a four-year-old girl was
hurt when she was hit in the face by a brick thrown through a van window.
- source_sentence: can sweet potatoes cause itching?
sentences:
- 'People with a true potato allergy may react immediately after touching, peeling,
or eating potatoes. Symptoms may vary from person to person, but typical symptoms
of a potato allergy include: rhinitis, including itchy or stinging eyes, a runny
or stuffy nose, and sneezing.'
- riding a bike does not cause pollution
- "Dilation occurs when cell walls relax.. An aneurysm is a dilation, or bubble,\
\ that occurs in the wall of an artery. \n an artery can be relaxed by dilation"
model-index:
- name: SentenceTransformer based on microsoft/deberta-v3-small
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test
type: sts-test
metrics:
- type: pearson_cosine
value: 0.6368197640293093
name: Pearson Cosine
- type: spearman_cosine
value: 0.6345637125214598
name: Spearman Cosine
- type: pearson_manhattan
value: 0.6467215914161899
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.6336825601846632
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.6470519111681319
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.6345603114255941
name: Spearman Euclidean
- type: pearson_dot
value: 0.6367905887877633
name: Pearson Dot
- type: spearman_dot
value: 0.6346041676617576
name: Spearman Dot
- type: pearson_max
value: 0.6470519111681319
name: Pearson Max
- type: spearman_max
value: 0.6346041676617576
name: Spearman Max
- task:
type: binary-classification
name: Binary Classification
dataset:
name: allNLI dev
type: allNLI-dev
metrics:
- type: cosine_accuracy
value: 0.70703125
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.908595085144043
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.5522388059701493
name: Cosine F1
- type: cosine_f1_threshold
value: 0.8629225492477417
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.4847161572052402
name: Cosine Precision
- type: cosine_recall
value: 0.6416184971098265
name: Cosine Recall
- type: cosine_ap
value: 0.5430466488954966
name: Cosine Ap
- type: dot_accuracy
value: 0.708984375
name: Dot Accuracy
- type: dot_accuracy_threshold
value: 698.3015747070312
name: Dot Accuracy Threshold
- type: dot_f1
value: 0.5522388059701493
name: Dot F1
- type: dot_f1_threshold
value: 663.3272705078125
name: Dot F1 Threshold
- type: dot_precision
value: 0.4847161572052402
name: Dot Precision
- type: dot_recall
value: 0.6416184971098265
name: Dot Recall
- type: dot_ap
value: 0.5430436315328802
name: Dot Ap
- type: manhattan_accuracy
value: 0.705078125
name: Manhattan Accuracy
- type: manhattan_accuracy_threshold
value: 263.4867858886719
name: Manhattan Accuracy Threshold
- type: manhattan_f1
value: 0.5520581113801454
name: Manhattan F1
- type: manhattan_f1_threshold
value: 326.1763000488281
name: Manhattan F1 Threshold
- type: manhattan_precision
value: 0.475
name: Manhattan Precision
- type: manhattan_recall
value: 0.6589595375722543
name: Manhattan Recall
- type: manhattan_ap
value: 0.5413936285926788
name: Manhattan Ap
- type: euclidean_accuracy
value: 0.70703125
name: Euclidean Accuracy
- type: euclidean_accuracy_threshold
value: 11.85396957397461
name: Euclidean Accuracy Threshold
- type: euclidean_f1
value: 0.5522388059701493
name: Euclidean F1
- type: euclidean_f1_threshold
value: 14.51696491241455
name: Euclidean F1 Threshold
- type: euclidean_precision
value: 0.4847161572052402
name: Euclidean Precision
- type: euclidean_recall
value: 0.6416184971098265
name: Euclidean Recall
- type: euclidean_ap
value: 0.5430008642299873
name: Euclidean Ap
- type: max_accuracy
value: 0.708984375
name: Max Accuracy
- type: max_accuracy_threshold
value: 698.3015747070312
name: Max Accuracy Threshold
- type: max_f1
value: 0.5522388059701493
name: Max F1
- type: max_f1_threshold
value: 663.3272705078125
name: Max F1 Threshold
- type: max_precision
value: 0.4847161572052402
name: Max Precision
- type: max_recall
value: 0.6589595375722543
name: Max Recall
- type: max_ap
value: 0.5430466488954966
name: Max Ap
- task:
type: binary-classification
name: Binary Classification
dataset:
name: Qnli dev
type: Qnli-dev
metrics:
- type: cosine_accuracy
value: 0.68359375
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.8038332462310791
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.6764227642276421
name: Cosine F1
- type: cosine_f1_threshold
value: 0.7276865839958191
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.5488126649076517
name: Cosine Precision
- type: cosine_recall
value: 0.8813559322033898
name: Cosine Recall
- type: cosine_ap
value: 0.6915503966365267
name: Cosine Ap
- type: dot_accuracy
value: 0.68359375
name: Dot Accuracy
- type: dot_accuracy_threshold
value: 617.9757690429688
name: Dot Accuracy Threshold
- type: dot_f1
value: 0.6775244299674267
name: Dot F1
- type: dot_f1_threshold
value: 559.7400512695312
name: Dot F1 Threshold
- type: dot_precision
value: 0.5502645502645502
name: Dot Precision
- type: dot_recall
value: 0.8813559322033898
name: Dot Recall
- type: dot_ap
value: 0.6914604071082934
name: Dot Ap
- type: manhattan_accuracy
value: 0.681640625
name: Manhattan Accuracy
- type: manhattan_accuracy_threshold
value: 384.64373779296875
name: Manhattan Accuracy Threshold
- type: manhattan_f1
value: 0.67430441898527
name: Manhattan F1
- type: manhattan_f1_threshold
value: 451.5675048828125
name: Manhattan F1 Threshold
- type: manhattan_precision
value: 0.5493333333333333
name: Manhattan Precision
- type: manhattan_recall
value: 0.8728813559322034
name: Manhattan Recall
- type: manhattan_ap
value: 0.6911560630995964
name: Manhattan Ap
- type: euclidean_accuracy
value: 0.68359375
name: Euclidean Accuracy
- type: euclidean_accuracy_threshold
value: 17.36817741394043
name: Euclidean Accuracy Threshold
- type: euclidean_f1
value: 0.6764227642276421
name: Euclidean F1
- type: euclidean_f1_threshold
value: 20.461692810058594
name: Euclidean F1 Threshold
- type: euclidean_precision
value: 0.5488126649076517
name: Euclidean Precision
- type: euclidean_recall
value: 0.8813559322033898
name: Euclidean Recall
- type: euclidean_ap
value: 0.6915804106776542
name: Euclidean Ap
- type: max_accuracy
value: 0.68359375
name: Max Accuracy
- type: max_accuracy_threshold
value: 617.9757690429688
name: Max Accuracy Threshold
- type: max_f1
value: 0.6775244299674267
name: Max F1
- type: max_f1_threshold
value: 559.7400512695312
name: Max F1 Threshold
- type: max_precision
value: 0.5502645502645502
name: Max Precision
- type: max_recall
value: 0.8813559322033898
name: Max Recall
- type: max_ap
value: 0.6915804106776542
name: Max Ap
---
# SentenceTransformer based on microsoft/deberta-v3-small
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model
(1): AdvancedWeightedPooling(
(alpha_dropout_layer): Dropout(p=0.01, inplace=False)
(gate_dropout_layer): Dropout(p=0.05, inplace=False)
(linear_cls_pj): Linear(in_features=768, out_features=768, bias=True)
(linear_cls_Qpj): Linear(in_features=768, out_features=768, bias=True)
(linear_mean_pj): Linear(in_features=768, out_features=768, bias=True)
(linear_attnOut): Linear(in_features=768, out_features=768, bias=True)
(mha): MultiheadAttention(
(out_proj): NonDynamicallyQuantizableLinear(in_features=768, out_features=768, bias=True)
)
(layernorm_output): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(layernorm_weightedPooing): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(layernorm_pjCls): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(layernorm_pjMean): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(layernorm_attnOut): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
)
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa3-s-CustomPoolin-toytest3-step1-checkpoints-tmp")
# Run inference
sentences = [
'can sweet potatoes cause itching?',
'People with a true potato allergy may react immediately after touching, peeling, or eating potatoes. Symptoms may vary from person to person, but typical symptoms of a potato allergy include: rhinitis, including itchy or stinging eyes, a runny or stuffy nose, and sneezing.',
'riding a bike does not cause pollution',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts-test`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.6368 |
| **spearman_cosine** | **0.6346** |
| pearson_manhattan | 0.6467 |
| spearman_manhattan | 0.6337 |
| pearson_euclidean | 0.6471 |
| spearman_euclidean | 0.6346 |
| pearson_dot | 0.6368 |
| spearman_dot | 0.6346 |
| pearson_max | 0.6471 |
| spearman_max | 0.6346 |
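The Pearson and Spearman columns above correlate the model's cosine similarities with human-annotated similarity scores. As a minimal pure-Python sketch of what those two correlations measure (toy numbers, not the actual STS data or the sentence-transformers implementation):

```python
# Pearson correlates raw values; Spearman correlates their ranks, so it
# only rewards getting the *ordering* of pairs right. Toy data below.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(x):
    # assign average ranks, handling ties
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

cos_sims = [0.9, 0.2, 0.75, 0.4]   # model cosine similarities (toy)
gold     = [5.0, 1.0, 4.0, 2.0]    # annotator similarity scores (toy)
print(round(spearman(cos_sims, gold), 4))  # perfect rank agreement -> 1.0
```

Spearman cosine is the primary metric here because downstream retrieval cares about the ranking of candidates, not the absolute similarity values.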
#### Binary Classification
* Dataset: `allNLI-dev`
* Evaluated with [BinaryClassificationEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)
| Metric | Value |
|:-----------------------------|:----------|
| cosine_accuracy | 0.707 |
| cosine_accuracy_threshold | 0.9086 |
| cosine_f1 | 0.5522 |
| cosine_f1_threshold | 0.8629 |
| cosine_precision | 0.4847 |
| cosine_recall | 0.6416 |
| cosine_ap | 0.543 |
| dot_accuracy | 0.709 |
| dot_accuracy_threshold | 698.3016 |
| dot_f1 | 0.5522 |
| dot_f1_threshold | 663.3273 |
| dot_precision | 0.4847 |
| dot_recall | 0.6416 |
| dot_ap | 0.543 |
| manhattan_accuracy | 0.7051 |
| manhattan_accuracy_threshold | 263.4868 |
| manhattan_f1 | 0.5521 |
| manhattan_f1_threshold | 326.1763 |
| manhattan_precision | 0.475 |
| manhattan_recall | 0.659 |
| manhattan_ap | 0.5414 |
| euclidean_accuracy | 0.707 |
| euclidean_accuracy_threshold | 11.854 |
| euclidean_f1 | 0.5522 |
| euclidean_f1_threshold | 14.517 |
| euclidean_precision | 0.4847 |
| euclidean_recall | 0.6416 |
| euclidean_ap | 0.543 |
| max_accuracy | 0.709 |
| max_accuracy_threshold | 698.3016 |
| max_f1 | 0.5522 |
| max_f1_threshold | 663.3273 |
| max_precision | 0.4847 |
| max_recall | 0.659 |
| **max_ap** | **0.543** |
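The `*_accuracy_threshold` and `*_f1_threshold` rows above are the decision thresholds that maximize each metric on the dev set. A minimal sketch of that threshold sweep, with toy scores rather than the real allNLI-dev similarities:

```python
# Sweep every candidate threshold over the similarity scores and keep
# the one that maximizes the chosen metric, as a binary-classification
# evaluator would. Toy numbers only.

def accuracy(preds, labels):
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def f1(preds, labels):
    tp = sum(1 for p, l in zip(preds, labels) if p and l)
    fp = sum(1 for p, l in zip(preds, labels) if p and not l)
    fn = sum(1 for p, l in zip(preds, labels) if not p and l)
    if tp == 0:
        return 0.0
    prec, rec = tp / (tp + fp), tp / (tp + fn)
    return 2 * prec * rec / (prec + rec)

def best_threshold(scores, labels, metric):
    best = (-1.0, 0.0)
    for t in sorted(set(scores)):
        preds = [1 if s >= t else 0 for s in scores]
        best = max(best, (metric(preds, labels), t))
    return best  # (best metric value, threshold that achieves it)

scores = [0.95, 0.91, 0.60, 0.85, 0.30]  # cosine similarities (toy)
labels = [1, 1, 0, 1, 0]                 # 1 = positive pair (toy)
acc, thr = best_threshold(scores, labels, accuracy)
print(acc, thr)  # 1.0 at threshold 0.85
```

This is why the accuracy and F1 thresholds differ per metric and per similarity function: each is tuned independently on the dev set.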
#### Binary Classification
* Dataset: `Qnli-dev`
* Evaluated with [BinaryClassificationEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)
| Metric | Value |
|:-----------------------------|:-----------|
| cosine_accuracy | 0.6836 |
| cosine_accuracy_threshold | 0.8038 |
| cosine_f1 | 0.6764 |
| cosine_f1_threshold | 0.7277 |
| cosine_precision | 0.5488 |
| cosine_recall | 0.8814 |
| cosine_ap | 0.6916 |
| dot_accuracy | 0.6836 |
| dot_accuracy_threshold | 617.9758 |
| dot_f1 | 0.6775 |
| dot_f1_threshold | 559.7401 |
| dot_precision | 0.5503 |
| dot_recall | 0.8814 |
| dot_ap | 0.6915 |
| manhattan_accuracy | 0.6816 |
| manhattan_accuracy_threshold | 384.6437 |
| manhattan_f1 | 0.6743 |
| manhattan_f1_threshold | 451.5675 |
| manhattan_precision | 0.5493 |
| manhattan_recall | 0.8729 |
| manhattan_ap | 0.6912 |
| euclidean_accuracy | 0.6836 |
| euclidean_accuracy_threshold | 17.3682 |
| euclidean_f1 | 0.6764 |
| euclidean_f1_threshold | 20.4617 |
| euclidean_precision | 0.5488 |
| euclidean_recall | 0.8814 |
| euclidean_ap | 0.6916 |
| max_accuracy | 0.6836 |
| max_accuracy_threshold | 617.9758 |
| max_f1 | 0.6775 |
| max_f1_threshold | 559.7401 |
| max_precision | 0.5503 |
| max_recall | 0.8814 |
| **max_ap** | **0.6916** |
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 32,500 training samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 29.6 tokens</li><li>max: 369 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 58.01 tokens</li><li>max: 437 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>The song ‘Fashion for His Love’ by Lady Gaga is a tribute to which late fashion designer?</code> | <code>Fashion Of His Love by Lady Gaga Songfacts Fashion Of His Love by Lady Gaga Songfacts Songfacts Gaga explained in a tweet that this track from her Born This Way Special Edition album is about the late Alexander McQueen. The fashion designer committed suicide by hanging on February 11, 2010 and Gaga was deeply affected by the tragic death of McQueen, who was a close personal friend. That same month, she performed at the 2010 Brit Awards wearing one of his couture creations and she also paid tribute to her late friend by setting the date on the prison security cameras in her Telephone video as the same day that McQueen's body was discovered in his London home.</code> |
| <code>e. in solids the atoms are closely locked in position and can only vibrate, in liquids the atoms and molecules are more loosely connected and can collide with and move past one another, while in gases the atoms or molecules are free to move independently, colliding frequently.</code> | <code>Within a substance, atoms that collide frequently and move independently of one another are most likely in a gas</code> |
| <code>Helen Lederer is an English comedian .</code> | <code>Helen Lederer ( born 24 September 1954 ) is an English : //www.scotsman.com/news/now-or-never-1-1396369 comedian , writer and actress who emerged as part of the alternative comedy boom at the beginning of the 1980s .</code> |
* Loss: [GISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
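The core idea of GISTEmbedLoss is that the guide model above filters in-batch negatives: a candidate is dropped from the contrastive loss when the guide rates it as more similar to the anchor than the anchor's own positive, since it is likely a false negative. A minimal sketch of that masking with toy similarity numbers (not the actual sentence-transformers implementation):

```python
# Toy illustration of guide-based negative masking in a GIST-style loss.
import math

def gist_mask(guide_sims, pos_index):
    # guide_sims: guide-model similarities from one anchor to every
    # in-batch candidate; pos_index: index of the true positive.
    pos_sim = guide_sims[pos_index]
    return [i == pos_index or s <= pos_sim for i, s in enumerate(guide_sims)]

def masked_softmax_loss(model_sims, mask, pos_index, temperature=0.025):
    # Standard in-batch softmax (InfoNCE-style) loss, with masked
    # candidates excluded from the denominator.
    exps = [math.exp(s / temperature) if keep else 0.0
            for s, keep in zip(model_sims, mask)]
    return -math.log(exps[pos_index] / sum(exps))

guide_sims = [0.95, 0.40, 0.97]   # candidate 2 looks like a false negative
model_sims = [0.80, 0.10, 0.75]
mask = gist_mask(guide_sims, pos_index=0)
print(mask)  # [True, True, False] -> candidate 2 dropped from the loss
```

The low `temperature` of 0.025 sharpens the softmax, so the surviving hardest negatives dominate the gradient.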
### Evaluation Dataset
#### Unnamed Dataset
* Size: 1,664 evaluation samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 29.01 tokens</li><li>max: 367 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 56.14 tokens</li><li>max: 389 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:--------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>What planet did the voyager 1 spacecraft visit in 1980?</code> | <code>The Voyager 1 spacecraft visited Saturn in 1980. Voyager 2 followed in 1981. These probes sent back detailed pictures of Saturn, its rings, and some of its moons ( Figure below ). From the Voyager data, we learned what Saturn’s rings are made of. They are particles of water and ice with a little bit of dust. There are several gaps in the rings. These gaps were cleared out by moons within the rings. Gravity attracts dust and gas to the moon from the ring. This leaves a gap in the rings. Other gaps in the rings are caused by the competing forces of Saturn and its moons outside the rings.</code> |
| <code>Diffusion Diffusion is a process where atoms or molecules move from areas of high concentration to areas of low concentration.</code> | <code>Diffusion is the process in which a substance naturally moves from an area of higher to lower concentration.</code> |
| <code>Who had an 80s No 1 with Don't You Want Me?</code> | <code>The Human League - Don't You Want Me - YouTube The Human League - Don't You Want Me Want to watch this again later? Sign in to add this video to a playlist. Need to report the video? Sign in to report inappropriate content. Rating is available when the video has been rented. This feature is not available right now. Please try again later. Uploaded on Feb 27, 2009 Music video by The Human League performing Don't You Want Me (2003 Digital Remaster). Category</code> |
* Loss: [GISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 256
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 3.3333333333333337e-06}
- `warmup_ratio`: 0.33
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa3-s-CustomPoolin-toytest3-step1-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates
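The non-default scheduler above warms the learning rate up over the first 33% of training, then decays it along a half cosine toward the floor `min_lr` instead of zero. A sketch of that schedule as a plain function (an approximation for illustration, not the exact `transformers` implementation):

```python
# Approximate cosine_with_min_lr schedule with linear warmup, using the
# values from the hyperparameters above (peak 5e-05, warmup_ratio 0.33,
# min_lr ~3.33e-06, num_cycles 0.5).
import math

def lr_at(step, total_steps, peak_lr=5e-05, warmup_ratio=0.33,
          min_lr=3.3333333333333337e-06, num_cycles=0.5):
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # linear warmup from 0 to peak_lr
        return peak_lr * step / max(1, warmup_steps)
    # cosine decay from peak_lr down to min_lr
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    cosine = 0.5 * (1.0 + math.cos(math.pi * num_cycles * 2.0 * progress))
    return min_lr + (peak_lr - min_lr) * cosine

print(lr_at(0, 1000))     # 0.0 at the start of warmup
print(lr_at(330, 1000))   # peak lr once warmup ends
print(lr_at(1000, 1000))  # decayed to min_lr at the end
```

The long warmup (a third of training) is unusually large; it trades early progress for stability, which can help when a freshly initialized pooling head sits on top of a pretrained encoder.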
#### All Hyperparameters
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 256
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 3.3333333333333337e-06}
- `warmup_ratio`: 0.33
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa3-s-CustomPoolin-toytest3-step1-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
### Training Logs
| Epoch | Step | Training Loss | Validation Loss | sts-test_spearman_cosine | allNLI-dev_max_ap | Qnli-dev_max_ap |
|:------:|:----:|:-------------:|:---------------:|:------------------------:|:-----------------:|:---------------:|
| 0.0010 | 1 | 10.4072 | - | - | - | - |
| 0.0020 | 2 | 11.0865 | - | - | - | - |
| 0.0030 | 3 | 9.5114 | - | - | - | - |
| 0.0039 | 4 | 9.9584 | - | - | - | - |
| 0.0049 | 5 | 10.068 | - | - | - | - |
| 0.0059 | 6 | 11.0224 | - | - | - | - |
| 0.0069 | 7 | 9.7703 | - | - | - | - |
| 0.0079 | 8 | 10.5005 | - | - | - | - |
| 0.0089 | 9 | 10.1987 | - | - | - | - |
| 0.0098 | 10 | 10.0277 | - | - | - | - |
| 0.0108 | 11 | 10.6965 | - | - | - | - |
| 0.0118 | 12 | 10.0609 | - | - | - | - |
| 0.0128 | 13 | 11.6214 | - | - | - | - |
| 0.0138 | 14 | 9.4053 | - | - | - | - |
| 0.0148 | 15 | 10.4014 | - | - | - | - |
| 0.0157 | 16 | 10.4119 | - | - | - | - |
| 0.0167 | 17 | 9.4658 | - | - | - | - |
| 0.0177 | 18 | 9.2169 | - | - | - | - |
| 0.0187 | 19 | 11.2337 | - | - | - | - |
| 0.0197 | 20 | 11.0572 | - | - | - | - |
| 0.0207 | 21 | 11.0452 | - | - | - | - |
| 0.0217 | 22 | 10.31 | - | - | - | - |
| 0.0226 | 23 | 9.1395 | - | - | - | - |
| 0.0236 | 24 | 8.4201 | - | - | - | - |
| 0.0246 | 25 | 8.6036 | - | - | - | - |
| 0.0256 | 26 | 11.7579 | - | - | - | - |
| 0.0266 | 27 | 10.1307 | - | - | - | - |
| 0.0276 | 28 | 9.2915 | - | - | - | - |
| 0.0285 | 29 | 9.0208 | - | - | - | - |
| 0.0295 | 30 | 8.6867 | - | - | - | - |
| 0.0305 | 31 | 8.0925 | - | - | - | - |
| 0.0315 | 32 | 8.6617 | - | - | - | - |
| 0.0325 | 33 | 8.3374 | - | - | - | - |
| 0.0335 | 34 | 7.8566 | - | - | - | - |
| 0.0344 | 35 | 9.0698 | - | - | - | - |
| 0.0354 | 36 | 7.7727 | - | - | - | - |
| 0.0364 | 37 | 7.6128 | - | - | - | - |
| 0.0374 | 38 | 7.8762 | - | - | - | - |
| 0.0384 | 39 | 7.5191 | - | - | - | - |
| 0.0394 | 40 | 7.5638 | - | - | - | - |
| 0.0404 | 41 | 7.1878 | - | - | - | - |
| 0.0413 | 42 | 6.8878 | - | - | - | - |
| 0.0423 | 43 | 7.5775 | - | - | - | - |
| 0.0433 | 44 | 7.1076 | - | - | - | - |
| 0.0443 | 45 | 6.5589 | - | - | - | - |
| 0.0453 | 46 | 7.4456 | - | - | - | - |
| 0.0463 | 47 | 6.8233 | - | - | - | - |
| 0.0472 | 48 | 6.7633 | - | - | - | - |
| 0.0482 | 49 | 6.6024 | - | - | - | - |
| 0.0492 | 50 | 6.2778 | - | - | - | - |
| 0.0502 | 51 | 6.1026 | - | - | - | - |
| 0.0512 | 52 | 6.632 | - | - | - | - |
| 0.0522 | 53 | 6.6962 | - | - | - | - |
| 0.0531 | 54 | 5.8514 | - | - | - | - |
| 0.0541 | 55 | 5.9951 | - | - | - | - |
| 0.0551 | 56 | 5.4554 | - | - | - | - |
| 0.0561 | 57 | 6.0147 | - | - | - | - |
| 0.0571 | 58 | 5.215 | - | - | - | - |
| 0.0581 | 59 | 6.4525 | - | - | - | - |
| 0.0591 | 60 | 5.4048 | - | - | - | - |
| 0.0600 | 61 | 5.0424 | - | - | - | - |
| 0.0610 | 62 | 6.2646 | - | - | - | - |
| 0.0620 | 63 | 5.0847 | - | - | - | - |
| 0.0630 | 64 | 5.4415 | - | - | - | - |
| 0.0640 | 65 | 5.2469 | - | - | - | - |
| 0.0650 | 66 | 5.1378 | - | - | - | - |
| 0.0659 | 67 | 5.1636 | - | - | - | - |
| 0.0669 | 68 | 5.5596 | - | - | - | - |
| 0.0679 | 69 | 4.9508 | - | - | - | - |
| 0.0689 | 70 | 5.2355 | - | - | - | - |
| 0.0699 | 71 | 4.7359 | - | - | - | - |
| 0.0709 | 72 | 4.8947 | - | - | - | - |
| 0.0719 | 73 | 4.6269 | - | - | - | - |
| 0.0728 | 74 | 4.6072 | - | - | - | - |
| 0.0738 | 75 | 4.9125 | - | - | - | - |
| 0.0748 | 76 | 4.5856 | - | - | - | - |
| 0.0758 | 77 | 4.7879 | - | - | - | - |
| 0.0768 | 78 | 4.5348 | - | - | - | - |
| 0.0778 | 79 | 4.3554 | - | - | - | - |
| 0.0787 | 80 | 4.2984 | - | - | - | - |
| 0.0797 | 81 | 4.5505 | - | - | - | - |
| 0.0807 | 82 | 4.5325 | - | - | - | - |
| 0.0817 | 83 | 4.2725 | - | - | - | - |
| 0.0827 | 84 | 4.3054 | - | - | - | - |
| 0.0837 | 85 | 4.5536 | - | - | - | - |
| 0.0846 | 86 | 4.0265 | - | - | - | - |
| 0.0856 | 87 | 4.7453 | - | - | - | - |
| 0.0866 | 88 | 4.071 | - | - | - | - |
| 0.0876 | 89 | 4.1582 | - | - | - | - |
| 0.0886 | 90 | 4.1131 | - | - | - | - |
| 0.0896 | 91 | 3.6582 | - | - | - | - |
| 0.0906 | 92 | 4.143 | - | - | - | - |
| 0.0915 | 93 | 4.2273 | - | - | - | - |
| 0.0925 | 94 | 3.9321 | - | - | - | - |
| 0.0935 | 95 | 4.2061 | - | - | - | - |
| 0.0945 | 96 | 4.1042 | - | - | - | - |
| 0.0955 | 97 | 3.9513 | - | - | - | - |
| 0.0965 | 98 | 3.8627 | - | - | - | - |
| 0.0974 | 99 | 4.3613 | - | - | - | - |
| 0.0984 | 100 | 3.8513 | - | - | - | - |
| 0.0994 | 101 | 3.5866 | - | - | - | - |
| 0.1004 | 102 | 3.5239 | - | - | - | - |
| 0.1014 | 103 | 3.5921 | - | - | - | - |
| 0.1024 | 104 | 3.5962 | - | - | - | - |
| 0.1033 | 105 | 4.0001 | - | - | - | - |
| 0.1043 | 106 | 4.1374 | - | - | - | - |
| 0.1053 | 107 | 3.9049 | - | - | - | - |
| 0.1063 | 108 | 3.2511 | - | - | - | - |
| 0.1073 | 109 | 3.2479 | - | - | - | - |
| 0.1083 | 110 | 3.6414 | - | - | - | - |
| 0.1093 | 111 | 3.6429 | - | - | - | - |
| 0.1102 | 112 | 3.423 | - | - | - | - |
| 0.1112 | 113 | 3.4967 | - | - | - | - |
| 0.1122 | 114 | 3.7649 | - | - | - | - |
| 0.1132 | 115 | 3.2845 | - | - | - | - |
| 0.1142 | 116 | 3.356 | - | - | - | - |
| 0.1152 | 117 | 3.2086 | - | - | - | - |
| 0.1161 | 118 | 3.5561 | - | - | - | - |
| 0.1171 | 119 | 3.7353 | - | - | - | - |
| 0.1181 | 120 | 3.403 | - | - | - | - |
| 0.1191 | 121 | 3.1009 | - | - | - | - |
| 0.1201 | 122 | 3.2139 | - | - | - | - |
| 0.1211 | 123 | 3.3339 | - | - | - | - |
| 0.1220 | 124 | 2.9464 | - | - | - | - |
| 0.1230 | 125 | 3.3366 | - | - | - | - |
| 0.1240 | 126 | 3.0618 | - | - | - | - |
| 0.1250 | 127 | 3.0092 | - | - | - | - |
| 0.1260 | 128 | 2.7152 | - | - | - | - |
| 0.1270 | 129 | 2.9423 | - | - | - | - |
| 0.1280 | 130 | 2.6569 | - | - | - | - |
| 0.1289 | 131 | 2.8469 | - | - | - | - |
| 0.1299 | 132 | 2.9089 | - | - | - | - |
| 0.1309 | 133 | 2.5809 | - | - | - | - |
| 0.1319 | 134 | 2.6987 | - | - | - | - |
| 0.1329 | 135 | 3.2518 | - | - | - | - |
| 0.1339 | 136 | 2.9145 | - | - | - | - |
| 0.1348 | 137 | 2.4809 | - | - | - | - |
| 0.1358 | 138 | 2.8264 | - | - | - | - |
| 0.1368 | 139 | 2.5724 | - | - | - | - |
| 0.1378 | 140 | 2.6949 | - | - | - | - |
| 0.1388 | 141 | 2.6925 | - | - | - | - |
| 0.1398 | 142 | 2.9311 | - | - | - | - |
| 0.1407 | 143 | 2.5667 | - | - | - | - |
| 0.1417 | 144 | 3.2471 | - | - | - | - |
| 0.1427 | 145 | 2.2441 | - | - | - | - |
| 0.1437 | 146 | 2.75 | - | - | - | - |
| 0.1447 | 147 | 2.9669 | - | - | - | - |
| 0.1457 | 148 | 2.736 | - | - | - | - |
| 0.1467 | 149 | 3.104 | - | - | - | - |
| 0.1476 | 150 | 2.2175 | - | - | - | - |
| 0.1486 | 151 | 2.7415 | - | - | - | - |
| 0.1496 | 152 | 1.8707 | - | - | - | - |
| 0.1506 | 153 | 2.5961 | 2.2653 | 0.3116 | 0.4265 | 0.6462 |
| 0.1516 | 154 | 3.1149 | - | - | - | - |
| 0.1526 | 155 | 2.2976 | - | - | - | - |
| 0.1535 | 156 | 2.4436 | - | - | - | - |
| 0.1545 | 157 | 2.8826 | - | - | - | - |
| 0.1555 | 158 | 2.3664 | - | - | - | - |
| 0.1565 | 159 | 2.2485 | - | - | - | - |
| 0.1575 | 160 | 2.5167 | - | - | - | - |
| 0.1585 | 161 | 1.7183 | - | - | - | - |
| 0.1594 | 162 | 2.1003 | - | - | - | - |
| 0.1604 | 163 | 2.5785 | - | - | - | - |
| 0.1614 | 164 | 2.8789 | - | - | - | - |
| 0.1624 | 165 | 2.3425 | - | - | - | - |
| 0.1634 | 166 | 2.0966 | - | - | - | - |
| 0.1644 | 167 | 2.1126 | - | - | - | - |
| 0.1654 | 168 | 2.1824 | - | - | - | - |
| 0.1663 | 169 | 2.2009 | - | - | - | - |
| 0.1673 | 170 | 2.3796 | - | - | - | - |
| 0.1683 | 171 | 2.3096 | - | - | - | - |
| 0.1693 | 172 | 2.7897 | - | - | - | - |
| 0.1703 | 173 | 2.2097 | - | - | - | - |
| 0.1713 | 174 | 1.7508 | - | - | - | - |
| 0.1722 | 175 | 2.353 | - | - | - | - |
| 0.1732 | 176 | 2.4276 | - | - | - | - |
| 0.1742 | 177 | 2.1016 | - | - | - | - |
| 0.1752 | 178 | 1.8461 | - | - | - | - |
| 0.1762 | 179 | 1.8104 | - | - | - | - |
| 0.1772 | 180 | 2.6023 | - | - | - | - |
| 0.1781 | 181 | 2.5261 | - | - | - | - |
| 0.1791 | 182 | 2.1053 | - | - | - | - |
| 0.1801 | 183 | 1.9712 | - | - | - | - |
| 0.1811 | 184 | 2.4693 | - | - | - | - |
| 0.1821 | 185 | 2.1119 | - | - | - | - |
| 0.1831 | 186 | 2.4797 | - | - | - | - |
| 0.1841 | 187 | 2.1587 | - | - | - | - |
| 0.1850 | 188 | 1.9578 | - | - | - | - |
| 0.1860 | 189 | 2.1368 | - | - | - | - |
| 0.1870 | 190 | 2.4212 | - | - | - | - |
| 0.1880 | 191 | 1.9591 | - | - | - | - |
| 0.1890 | 192 | 1.5816 | - | - | - | - |
| 0.1900 | 193 | 1.4029 | - | - | - | - |
| 0.1909 | 194 | 1.9385 | - | - | - | - |
| 0.1919 | 195 | 1.5596 | - | - | - | - |
| 0.1929 | 196 | 1.6663 | - | - | - | - |
| 0.1939 | 197 | 2.0026 | - | - | - | - |
| 0.1949 | 198 | 2.0046 | - | - | - | - |
| 0.1959 | 199 | 1.5016 | - | - | - | - |
| 0.1969 | 200 | 2.184 | - | - | - | - |
| 0.1978 | 201 | 2.3442 | - | - | - | - |
| 0.1988 | 202 | 2.6981 | - | - | - | - |
| 0.1998 | 203 | 2.5481 | - | - | - | - |
| 0.2008 | 204 | 2.9798 | - | - | - | - |
| 0.2018 | 205 | 2.287 | - | - | - | - |
| 0.2028 | 206 | 1.9393 | - | - | - | - |
| 0.2037 | 207 | 2.892 | - | - | - | - |
| 0.2047 | 208 | 2.26 | - | - | - | - |
| 0.2057 | 209 | 2.5911 | - | - | - | - |
| 0.2067 | 210 | 2.1239 | - | - | - | - |
| 0.2077 | 211 | 2.0683 | - | - | - | - |
| 0.2087 | 212 | 1.768 | - | - | - | - |
| 0.2096 | 213 | 2.5468 | - | - | - | - |
| 0.2106 | 214 | 1.8956 | - | - | - | - |
| 0.2116 | 215 | 2.044 | - | - | - | - |
| 0.2126 | 216 | 1.5721 | - | - | - | - |
| 0.2136 | 217 | 1.6278 | - | - | - | - |
| 0.2146 | 218 | 1.7754 | - | - | - | - |
| 0.2156 | 219 | 1.8594 | - | - | - | - |
| 0.2165 | 220 | 1.8309 | - | - | - | - |
| 0.2175 | 221 | 2.0619 | - | - | - | - |
| 0.2185 | 222 | 2.3335 | - | - | - | - |
| 0.2195 | 223 | 2.023 | - | - | - | - |
| 0.2205 | 224 | 2.1975 | - | - | - | - |
| 0.2215 | 225 | 1.9228 | - | - | - | - |
| 0.2224 | 226 | 2.3565 | - | - | - | - |
| 0.2234 | 227 | 1.896 | - | - | - | - |
| 0.2244 | 228 | 2.0912 | - | - | - | - |
| 0.2254 | 229 | 2.7703 | - | - | - | - |
| 0.2264 | 230 | 1.6988 | - | - | - | - |
| 0.2274 | 231 | 2.0406 | - | - | - | - |
| 0.2283 | 232 | 1.9288 | - | - | - | - |
| 0.2293 | 233 | 2.0457 | - | - | - | - |
| 0.2303 | 234 | 1.7061 | - | - | - | - |
| 0.2313 | 235 | 1.6244 | - | - | - | - |
| 0.2323 | 236 | 2.0241 | - | - | - | - |
| 0.2333 | 237 | 1.567 | - | - | - | - |
| 0.2343 | 238 | 1.8084 | - | - | - | - |
| 0.2352 | 239 | 2.4363 | - | - | - | - |
| 0.2362 | 240 | 1.7532 | - | - | - | - |
| 0.2372 | 241 | 2.0797 | - | - | - | - |
| 0.2382 | 242 | 1.9562 | - | - | - | - |
| 0.2392 | 243 | 1.6751 | - | - | - | - |
| 0.2402 | 244 | 2.0265 | - | - | - | - |
| 0.2411 | 245 | 1.6065 | - | - | - | - |
| 0.2421 | 246 | 1.7439 | - | - | - | - |
| 0.2431 | 247 | 2.0237 | - | - | - | - |
| 0.2441 | 248 | 1.6128 | - | - | - | - |
| 0.2451 | 249 | 1.6581 | - | - | - | - |
| 0.2461 | 250 | 2.1538 | - | - | - | - |
| 0.2470 | 251 | 2.049 | - | - | - | - |
| 0.2480 | 252 | 1.2573 | - | - | - | - |
| 0.2490 | 253 | 1.5619 | - | - | - | - |
| 0.2500 | 254 | 1.2611 | - | - | - | - |
| 0.2510 | 255 | 1.3443 | - | - | - | - |
| 0.2520 | 256 | 1.3436 | - | - | - | - |
| 0.2530 | 257 | 2.8117 | - | - | - | - |
| 0.2539 | 258 | 1.7563 | - | - | - | - |
| 0.2549 | 259 | 1.3148 | - | - | - | - |
| 0.2559 | 260 | 2.0278 | - | - | - | - |
| 0.2569 | 261 | 1.2403 | - | - | - | - |
| 0.2579 | 262 | 1.588 | - | - | - | - |
| 0.2589 | 263 | 2.0071 | - | - | - | - |
| 0.2598 | 264 | 1.5312 | - | - | - | - |
| 0.2608 | 265 | 1.8641 | - | - | - | - |
| 0.2618 | 266 | 1.2933 | - | - | - | - |
| 0.2628 | 267 | 1.6262 | - | - | - | - |
| 0.2638 | 268 | 1.721 | - | - | - | - |
| 0.2648 | 269 | 1.4713 | - | - | - | - |
| 0.2657 | 270 | 1.4625 | - | - | - | - |
| 0.2667 | 271 | 1.7254 | - | - | - | - |
| 0.2677 | 272 | 1.5108 | - | - | - | - |
| 0.2687 | 273 | 2.1126 | - | - | - | - |
| 0.2697 | 274 | 1.3967 | - | - | - | - |
| 0.2707 | 275 | 1.7067 | - | - | - | - |
| 0.2717 | 276 | 1.4847 | - | - | - | - |
| 0.2726 | 277 | 1.6515 | - | - | - | - |
| 0.2736 | 278 | 0.9367 | - | - | - | - |
| 0.2746 | 279 | 2.0267 | - | - | - | - |
| 0.2756 | 280 | 1.5023 | - | - | - | - |
| 0.2766 | 281 | 1.1248 | - | - | - | - |
| 0.2776 | 282 | 1.6224 | - | - | - | - |
| 0.2785 | 283 | 1.7969 | - | - | - | - |
| 0.2795 | 284 | 2.2498 | - | - | - | - |
| 0.2805 | 285 | 1.7477 | - | - | - | - |
| 0.2815 | 286 | 1.6261 | - | - | - | - |
| 0.2825 | 287 | 2.0911 | - | - | - | - |
| 0.2835 | 288 | 1.9519 | - | - | - | - |
| 0.2844 | 289 | 1.3132 | - | - | - | - |
| 0.2854 | 290 | 2.3292 | - | - | - | - |
| 0.2864 | 291 | 1.3781 | - | - | - | - |
| 0.2874 | 292 | 1.5753 | - | - | - | - |
| 0.2884 | 293 | 1.4158 | - | - | - | - |
| 0.2894 | 294 | 2.1661 | - | - | - | - |
| 0.2904 | 295 | 1.4928 | - | - | - | - |
| 0.2913 | 296 | 2.2825 | - | - | - | - |
| 0.2923 | 297 | 1.7261 | - | - | - | - |
| 0.2933 | 298 | 1.8635 | - | - | - | - |
| 0.2943 | 299 | 0.974 | - | - | - | - |
| 0.2953 | 300 | 1.53 | - | - | - | - |
| 0.2963 | 301 | 1.5985 | - | - | - | - |
| 0.2972 | 302 | 1.2169 | - | - | - | - |
| 0.2982 | 303 | 1.771 | - | - | - | - |
| 0.2992 | 304 | 1.4506 | - | - | - | - |
| 0.3002 | 305 | 1.9496 | - | - | - | - |
| 0.3012 | 306 | 1.2436 | 1.5213 | 0.4673 | 0.4808 | 0.6993 |
| 0.3022 | 307 | 2.2057 | - | - | - | - |
| 0.3031 | 308 | 1.6786 | - | - | - | - |
| 0.3041 | 309 | 1.748 | - | - | - | - |
| 0.3051 | 310 | 1.5541 | - | - | - | - |
| 0.3061 | 311 | 2.2968 | - | - | - | - |
| 0.3071 | 312 | 1.585 | - | - | - | - |
| 0.3081 | 313 | 1.8371 | - | - | - | - |
| 0.3091 | 314 | 1.1129 | - | - | - | - |
| 0.3100 | 315 | 1.5495 | - | - | - | - |
| 0.3110 | 316 | 1.4327 | - | - | - | - |
| 0.3120 | 317 | 1.4801 | - | - | - | - |
| 0.3130 | 318 | 1.7096 | - | - | - | - |
| 0.3140 | 319 | 1.6717 | - | - | - | - |
| 0.3150 | 320 | 1.7151 | - | - | - | - |
| 0.3159 | 321 | 1.7081 | - | - | - | - |
| 0.3169 | 322 | 1.431 | - | - | - | - |
| 0.3179 | 323 | 1.5734 | - | - | - | - |
| 0.3189 | 324 | 1.7307 | - | - | - | - |
| 0.3199 | 325 | 1.0644 | - | - | - | - |
| 0.3209 | 326 | 1.0651 | - | - | - | - |
| 0.3219 | 327 | 1.4805 | - | - | - | - |
| 0.3228 | 328 | 0.839 | - | - | - | - |
| 0.3238 | 329 | 1.1801 | - | - | - | - |
| 0.3248 | 330 | 1.36 | - | - | - | - |
| 0.3258 | 331 | 1.3371 | - | - | - | - |
| 0.3268 | 332 | 1.1707 | - | - | - | - |
| 0.3278 | 333 | 1.2572 | - | - | - | - |
| 0.3287 | 334 | 1.3537 | - | - | - | - |
| 0.3297 | 335 | 1.7096 | - | - | - | - |
| 0.3307 | 336 | 1.5137 | - | - | - | - |
| 0.3317 | 337 | 1.1989 | - | - | - | - |
| 0.3327 | 338 | 1.3559 | - | - | - | - |
| 0.3337 | 339 | 1.3643 | - | - | - | - |
| 0.3346 | 340 | 1.2283 | - | - | - | - |
| 0.3356 | 341 | 1.5829 | - | - | - | - |
| 0.3366 | 342 | 1.1866 | - | - | - | - |
| 0.3376 | 343 | 1.531 | - | - | - | - |
| 0.3386 | 344 | 1.5581 | - | - | - | - |
| 0.3396 | 345 | 1.5587 | - | - | - | - |
| 0.3406 | 346 | 1.1403 | - | - | - | - |
| 0.3415 | 347 | 1.9728 | - | - | - | - |
| 0.3425 | 348 | 1.0818 | - | - | - | - |
| 0.3435 | 349 | 1.2993 | - | - | - | - |
| 0.3445 | 350 | 1.7779 | - | - | - | - |
| 0.3455 | 351 | 1.319 | - | - | - | - |
| 0.3465 | 352 | 1.9236 | - | - | - | - |
| 0.3474 | 353 | 1.3085 | - | - | - | - |
| 0.3484 | 354 | 2.2049 | - | - | - | - |
| 0.3494 | 355 | 1.3697 | - | - | - | - |
| 0.3504 | 356 | 1.5367 | - | - | - | - |
| 0.3514 | 357 | 1.2516 | - | - | - | - |
| 0.3524 | 358 | 1.6497 | - | - | - | - |
| 0.3533 | 359 | 1.2457 | - | - | - | - |
| 0.3543 | 360 | 1.2733 | - | - | - | - |
| 0.3553 | 361 | 1.4768 | - | - | - | - |
| 0.3563 | 362 | 1.1363 | - | - | - | - |
| 0.3573 | 363 | 1.5731 | - | - | - | - |
| 0.3583 | 364 | 1.0821 | - | - | - | - |
| 0.3593 | 365 | 1.1563 | - | - | - | - |
| 0.3602 | 366 | 1.8843 | - | - | - | - |
| 0.3612 | 367 | 1.2239 | - | - | - | - |
| 0.3622 | 368 | 1.4411 | - | - | - | - |
| 0.3632 | 369 | 2.1003 | - | - | - | - |
| 0.3642 | 370 | 1.6558 | - | - | - | - |
| 0.3652 | 371 | 1.6502 | - | - | - | - |
| 0.3661 | 372 | 1.7204 | - | - | - | - |
| 0.3671 | 373 | 1.7422 | - | - | - | - |
| 0.3681 | 374 | 1.3859 | - | - | - | - |
| 0.3691 | 375 | 0.8876 | - | - | - | - |
| 0.3701 | 376 | 1.2399 | - | - | - | - |
| 0.3711 | 377 | 1.1039 | - | - | - | - |
| 0.3720 | 378 | 1.733 | - | - | - | - |
| 0.3730 | 379 | 1.6897 | - | - | - | - |
| 0.3740 | 380 | 2.0532 | - | - | - | - |
| 0.3750 | 381 | 1.0156 | - | - | - | - |
| 0.3760 | 382 | 0.8888 | - | - | - | - |
| 0.3770 | 383 | 1.322 | - | - | - | - |
| 0.3780 | 384 | 1.6828 | - | - | - | - |
| 0.3789 | 385 | 1.1567 | - | - | - | - |
| 0.3799 | 386 | 1.6117 | - | - | - | - |
| 0.3809 | 387 | 1.1776 | - | - | - | - |
| 0.3819 | 388 | 1.641 | - | - | - | - |
| 0.3829 | 389 | 1.3454 | - | - | - | - |
| 0.3839 | 390 | 1.4292 | - | - | - | - |
| 0.3848 | 391 | 1.2256 | - | - | - | - |
| 0.3858 | 392 | 1.08 | - | - | - | - |
| 0.3868 | 393 | 0.7436 | - | - | - | - |
| 0.3878 | 394 | 1.4112 | - | - | - | - |
| 0.3888 | 395 | 0.8917 | - | - | - | - |
| 0.3898 | 396 | 0.9955 | - | - | - | - |
| 0.3907 | 397 | 1.2867 | - | - | - | - |
| 0.3917 | 398 | 1.0683 | - | - | - | - |
| 0.3927 | 399 | 0.9355 | - | - | - | - |
| 0.3937 | 400 | 1.1153 | - | - | - | - |
| 0.3947 | 401 | 1.1724 | - | - | - | - |
| 0.3957 | 402 | 1.4069 | - | - | - | - |
| 0.3967 | 403 | 1.2546 | - | - | - | - |
| 0.3976 | 404 | 2.2862 | - | - | - | - |
| 0.3986 | 405 | 1.2316 | - | - | - | - |
| 0.3996 | 406 | 1.7876 | - | - | - | - |
| 0.4006 | 407 | 0.6936 | - | - | - | - |
| 0.4016 | 408 | 1.3852 | - | - | - | - |
| 0.4026 | 409 | 1.9046 | - | - | - | - |
| 0.4035 | 410 | 1.4972 | - | - | - | - |
| 0.4045 | 411 | 0.5531 | - | - | - | - |
| 0.4055 | 412 | 1.3685 | - | - | - | - |
| 0.4065 | 413 | 1.1367 | - | - | - | - |
| 0.4075 | 414 | 1.1304 | - | - | - | - |
| 0.4085 | 415 | 1.5953 | - | - | - | - |
| 0.4094 | 416 | 2.0308 | - | - | - | - |
| 0.4104 | 417 | 1.7275 | - | - | - | - |
| 0.4114 | 418 | 0.9921 | - | - | - | - |
| 0.4124 | 419 | 1.3418 | - | - | - | - |
| 0.4134 | 420 | 1.108 | - | - | - | - |
| 0.4144 | 421 | 1.4359 | - | - | - | - |
| 0.4154 | 422 | 1.4537 | - | - | - | - |
| 0.4163 | 423 | 0.8416 | - | - | - | - |
| 0.4173 | 424 | 0.8904 | - | - | - | - |
| 0.4183 | 425 | 0.7937 | - | - | - | - |
| 0.4193 | 426 | 0.9105 | - | - | - | - |
| 0.4203 | 427 | 1.1661 | - | - | - | - |
| 0.4213 | 428 | 0.7751 | - | - | - | - |
| 0.4222 | 429 | 0.9039 | - | - | - | - |
| 0.4232 | 430 | 1.2651 | - | - | - | - |
| 0.4242 | 431 | 1.44 | - | - | - | - |
| 0.4252 | 432 | 0.9795 | - | - | - | - |
| 0.4262 | 433 | 2.1892 | - | - | - | - |
| 0.4272 | 434 | 1.214 | - | - | - | - |
| 0.4281 | 435 | 1.185 | - | - | - | - |
| 0.4291 | 436 | 1.2501 | - | - | - | - |
| 0.4301 | 437 | 1.6432 | - | - | - | - |
| 0.4311 | 438 | 1.0203 | - | - | - | - |
| 0.4321 | 439 | 1.5179 | - | - | - | - |
| 0.4331 | 440 | 1.1445 | - | - | - | - |
| 0.4341 | 441 | 1.3099 | - | - | - | - |
| 0.4350 | 442 | 0.8856 | - | - | - | - |
| 0.4360 | 443 | 0.5869 | - | - | - | - |
| 0.4370 | 444 | 1.6335 | - | - | - | - |
| 0.4380 | 445 | 1.4134 | - | - | - | - |
| 0.4390 | 446 | 1.0244 | - | - | - | - |
| 0.4400 | 447 | 1.103 | - | - | - | - |
| 0.4409 | 448 | 0.9848 | - | - | - | - |
| 0.4419 | 449 | 1.5089 | - | - | - | - |
| 0.4429 | 450 | 1.0422 | - | - | - | - |
| 0.4439 | 451 | 1.0462 | - | - | - | - |
| 0.4449 | 452 | 1.2857 | - | - | - | - |
| 0.4459 | 453 | 1.4132 | - | - | - | - |
| 0.4469 | 454 | 1.3061 | - | - | - | - |
| 0.4478 | 455 | 1.3977 | - | - | - | - |
| 0.4488 | 456 | 1.3557 | - | - | - | - |
| 0.4498 | 457 | 1.3595 | - | - | - | - |
| 0.4508 | 458 | 0.8647 | - | - | - | - |
| 0.4518 | 459 | 1.3905 | 1.2969 | 0.5433 | 0.4937 | 0.7094 |
| 0.4528 | 460 | 0.9467 | - | - | - | - |
| 0.4537 | 461 | 1.9372 | - | - | - | - |
| 0.4547 | 462 | 0.871 | - | - | - | - |
| 0.4557 | 463 | 1.2282 | - | - | - | - |
| 0.4567 | 464 | 1.3845 | - | - | - | - |
| 0.4577 | 465 | 1.2571 | - | - | - | - |
| 0.4587 | 466 | 1.2288 | - | - | - | - |
| 0.4596 | 467 | 1.1165 | - | - | - | - |
| 0.4606 | 468 | 1.8463 | - | - | - | - |
| 0.4616 | 469 | 0.9158 | - | - | - | - |
| 0.4626 | 470 | 0.8711 | - | - | - | - |
| 0.4636 | 471 | 1.4741 | - | - | - | - |
| 0.4646 | 472 | 0.914 | - | - | - | - |
| 0.4656 | 473 | 0.9435 | - | - | - | - |
| 0.4665 | 474 | 1.0876 | - | - | - | - |
| 0.4675 | 475 | 1.2365 | - | - | - | - |
| 0.4685 | 476 | 1.1237 | - | - | - | - |
| 0.4695 | 477 | 1.0097 | - | - | - | - |
| 0.4705 | 478 | 1.1548 | - | - | - | - |
| 0.4715 | 479 | 1.3203 | - | - | - | - |
| 0.4724 | 480 | 1.2533 | - | - | - | - |
| 0.4734 | 481 | 1.093 | - | - | - | - |
| 0.4744 | 482 | 1.2591 | - | - | - | - |
| 0.4754 | 483 | 0.6764 | - | - | - | - |
| 0.4764 | 484 | 0.8922 | - | - | - | - |
| 0.4774 | 485 | 0.8524 | - | - | - | - |
| 0.4783 | 486 | 1.2777 | - | - | - | - |
| 0.4793 | 487 | 1.1682 | - | - | - | - |
| 0.4803 | 488 | 0.8617 | - | - | - | - |
| 0.4813 | 489 | 1.0303 | - | - | - | - |
| 0.4823 | 490 | 0.9843 | - | - | - | - |
| 0.4833 | 491 | 1.2951 | - | - | - | - |
| 0.4843 | 492 | 1.7889 | - | - | - | - |
| 0.4852 | 493 | 1.118 | - | - | - | - |
| 0.4862 | 494 | 0.6772 | - | - | - | - |
| 0.4872 | 495 | 1.5058 | - | - | - | - |
| 0.4882 | 496 | 1.0068 | - | - | - | - |
| 0.4892 | 497 | 0.9024 | - | - | - | - |
| 0.4902 | 498 | 1.4816 | - | - | - | - |
| 0.4911 | 499 | 0.894 | - | - | - | - |
| 0.4921 | 500 | 1.1582 | - | - | - | - |
| 0.4931 | 501 | 1.4804 | - | - | - | - |
| 0.4941 | 502 | 1.2636 | - | - | - | - |
| 0.4951 | 503 | 1.0094 | - | - | - | - |
| 0.4961 | 504 | 0.7594 | - | - | - | - |
| 0.4970 | 505 | 1.2898 | - | - | - | - |
| 0.4980 | 506 | 1.3565 | - | - | - | - |
| 0.4990 | 507 | 1.0325 | - | - | - | - |
| 0.5000 | 508 | 1.0519 | - | - | - | - |
| 0.5010 | 509 | 0.9802 | - | - | - | - |
| 0.5020 | 510 | 1.1117 | - | - | - | - |
| 0.5030 | 511 | 1.3585 | - | - | - | - |
| 0.5039 | 512 | 1.0381 | - | - | - | - |
| 0.5049 | 513 | 1.0171 | - | - | - | - |
| 0.5059 | 514 | 0.5678 | - | - | - | - |
| 0.5069 | 515 | 0.9347 | - | - | - | - |
| 0.5079 | 516 | 0.6305 | - | - | - | - |
| 0.5089 | 517 | 0.7072 | - | - | - | - |
| 0.5098 | 518 | 0.9746 | - | - | - | - |
| 0.5108 | 519 | 1.1782 | - | - | - | - |
| 0.5118 | 520 | 1.1354 | - | - | - | - |
| 0.5128 | 521 | 1.5752 | - | - | - | - |
| 0.5138 | 522 | 0.5952 | - | - | - | - |
| 0.5148 | 523 | 1.1171 | - | - | - | - |
| 0.5157 | 524 | 0.8234 | - | - | - | - |
| 0.5167 | 525 | 1.6701 | - | - | - | - |
| 0.5177 | 526 | 1.2111 | - | - | - | - |
| 0.5187 | 527 | 0.8299 | - | - | - | - |
| 0.5197 | 528 | 1.5734 | - | - | - | - |
| 0.5207 | 529 | 0.9172 | - | - | - | - |
| 0.5217 | 530 | 0.8025 | - | - | - | - |
| 0.5226 | 531 | 1.1499 | - | - | - | - |
| 0.5236 | 532 | 1.0328 | - | - | - | - |
| 0.5246 | 533 | 1.1305 | - | - | - | - |
| 0.5256 | 534 | 0.6715 | - | - | - | - |
| 0.5266 | 535 | 1.1361 | - | - | - | - |
| 0.5276 | 536 | 0.9132 | - | - | - | - |
| 0.5285 | 537 | 1.2195 | - | - | - | - |
| 0.5295 | 538 | 0.3731 | - | - | - | - |
| 0.5305 | 539 | 1.0005 | - | - | - | - |
| 0.5315 | 540 | 0.5519 | - | - | - | - |
| 0.5325 | 541 | 0.7529 | - | - | - | - |
| 0.5335 | 542 | 1.7004 | - | - | - | - |
| 0.5344 | 543 | 1.4667 | - | - | - | - |
| 0.5354 | 544 | 0.8349 | - | - | - | - |
| 0.5364 | 545 | 1.5575 | - | - | - | - |
| 0.5374 | 546 | 1.1703 | - | - | - | - |
| 0.5384 | 547 | 1.01 | - | - | - | - |
| 0.5394 | 548 | 1.1114 | - | - | - | - |
| 0.5404 | 549 | 0.516 | - | - | - | - |
| 0.5413 | 550 | 1.0422 | - | - | - | - |
| 0.5423 | 551 | 1.078 | - | - | - | - |
| 0.5433 | 552 | 1.0573 | - | - | - | - |
| 0.5443 | 553 | 0.9754 | - | - | - | - |
| 0.5453 | 554 | 0.9227 | - | - | - | - |
| 0.5463 | 555 | 1.5012 | - | - | - | - |
| 0.5472 | 556 | 1.0697 | - | - | - | - |
| 0.5482 | 557 | 1.4437 | - | - | - | - |
| 0.5492 | 558 | 1.0697 | - | - | - | - |
| 0.5502 | 559 | 0.8346 | - | - | - | - |
| 0.5512 | 560 | 0.6421 | - | - | - | - |
| 0.5522 | 561 | 0.6687 | - | - | - | - |
| 0.5531 | 562 | 0.982 | - | - | - | - |
| 0.5541 | 563 | 0.9299 | - | - | - | - |
| 0.5551 | 564 | 1.5852 | - | - | - | - |
| 0.5561 | 565 | 1.2132 | - | - | - | - |
| 0.5571 | 566 | 0.8426 | - | - | - | - |
| 0.5581 | 567 | 1.0496 | - | - | - | - |
| 0.5591 | 568 | 1.0436 | - | - | - | - |
| 0.5600 | 569 | 0.806 | - | - | - | - |
| 0.5610 | 570 | 0.6396 | - | - | - | - |
| 0.5620 | 571 | 1.6315 | - | - | - | - |
| 0.5630 | 572 | 1.3286 | - | - | - | - |
| 0.5640 | 573 | 0.7682 | - | - | - | - |
| 0.5650 | 574 | 0.7861 | - | - | - | - |
| 0.5659 | 575 | 1.0368 | - | - | - | - |
| 0.5669 | 576 | 1.1497 | - | - | - | - |
| 0.5679 | 577 | 0.9691 | - | - | - | - |
| 0.5689 | 578 | 0.7447 | - | - | - | - |
| 0.5699 | 579 | 1.3933 | - | - | - | - |
| 0.5709 | 580 | 1.0668 | - | - | - | - |
| 0.5719 | 581 | 0.6065 | - | - | - | - |
| 0.5728 | 582 | 0.9566 | - | - | - | - |
| 0.5738 | 583 | 0.7957 | - | - | - | - |
| 0.5748 | 584 | 1.0232 | - | - | - | - |
| 0.5758 | 585 | 1.4559 | - | - | - | - |
| 0.5768 | 586 | 0.8003 | - | - | - | - |
| 0.5778 | 587 | 0.9504 | - | - | - | - |
| 0.5787 | 588 | 1.5257 | - | - | - | - |
| 0.5797 | 589 | 0.5798 | - | - | - | - |
| 0.5807 | 590 | 0.8169 | - | - | - | - |
| 0.5817 | 591 | 1.1131 | - | - | - | - |
| 0.5827 | 592 | 1.2498 | - | - | - | - |
| 0.5837 | 593 | 0.8541 | - | - | - | - |
| 0.5846 | 594 | 1.0848 | - | - | - | - |
| 0.5856 | 595 | 0.8909 | - | - | - | - |
| 0.5866 | 596 | 0.7572 | - | - | - | - |
| 0.5876 | 597 | 1.3636 | - | - | - | - |
| 0.5886 | 598 | 0.8493 | - | - | - | - |
| 0.5896 | 599 | 0.9594 | - | - | - | - |
| 0.5906 | 600 | 1.1143 | - | - | - | - |
| 0.5915 | 601 | 0.7093 | - | - | - | - |
| 0.5925 | 602 | 1.0542 | - | - | - | - |
| 0.5935 | 603 | 1.0621 | - | - | - | - |
| 0.5945 | 604 | 0.6916 | - | - | - | - |
| 0.5955 | 605 | 1.0125 | - | - | - | - |
| 0.5965 | 606 | 0.8425 | - | - | - | - |
| 0.5974 | 607 | 1.2868 | - | - | - | - |
| 0.5984 | 608 | 1.3505 | - | - | - | - |
| 0.5994 | 609 | 1.2699 | - | - | - | - |
| 0.6004 | 610 | 1.1798 | - | - | - | - |
| 0.6014 | 611 | 1.3607 | - | - | - | - |
| 0.6024 | 612 | 1.0807 | 1.2167 | 0.5879 | 0.5143 | 0.7076 |
| 0.6033 | 613 | 1.4339 | - | - | - | - |
| 0.6043 | 614 | 1.1194 | - | - | - | - |
| 0.6053 | 615 | 1.0682 | - | - | - | - |
| 0.6063 | 616 | 1.0429 | - | - | - | - |
| 0.6073 | 617 | 1.2554 | - | - | - | - |
| 0.6083 | 618 | 1.2466 | - | - | - | - |
| 0.6093 | 619 | 1.1207 | - | - | - | - |
| 0.6102 | 620 | 0.9822 | - | - | - | - |
| 0.6112 | 621 | 1.7369 | - | - | - | - |
| 0.6122 | 622 | 1.3305 | - | - | - | - |
| 0.6132 | 623 | 0.9064 | - | - | - | - |
| 0.6142 | 624 | 0.7123 | - | - | - | - |
| 0.6152 | 625 | 0.7461 | - | - | - | - |
| 0.6161 | 626 | 0.8082 | - | - | - | - |
| 0.6171 | 627 | 1.0113 | - | - | - | - |
| 0.6181 | 628 | 0.9483 | - | - | - | - |
| 0.6191 | 629 | 0.9269 | - | - | - | - |
| 0.6201 | 630 | 1.3134 | - | - | - | - |
| 0.6211 | 631 | 0.7253 | - | - | - | - |
| 0.6220 | 632 | 0.809 | - | - | - | - |
| 0.6230 | 633 | 1.2514 | - | - | - | - |
| 0.6240 | 634 | 0.6718 | - | - | - | - |
| 0.625 | 635 | 0.6658 | - | - | - | - |
| 0.6260 | 636 | 1.3988 | - | - | - | - |
| 0.6270 | 637 | 0.7358 | - | - | - | - |
| 0.6280 | 638 | 0.7797 | - | - | - | - |
| 0.6289 | 639 | 1.048 | - | - | - | - |
| 0.6299 | 640 | 0.9559 | - | - | - | - |
| 0.6309 | 641 | 0.4561 | - | - | - | - |
| 0.6319 | 642 | 1.1078 | - | - | - | - |
| 0.6329 | 643 | 0.9724 | - | - | - | - |
| 0.6339 | 644 | 1.0702 | - | - | - | - |
| 0.6348 | 645 | 1.0911 | - | - | - | - |
| 0.6358 | 646 | 1.1584 | - | - | - | - |
| 0.6368 | 647 | 0.9063 | - | - | - | - |
| 0.6378 | 648 | 0.5036 | - | - | - | - |
| 0.6388 | 649 | 0.8331 | - | - | - | - |
| 0.6398 | 650 | 1.0772 | - | - | - | - |
| 0.6407 | 651 | 0.7466 | - | - | - | - |
| 0.6417 | 652 | 1.1614 | - | - | - | - |
| 0.6427 | 653 | 0.6319 | - | - | - | - |
| 0.6437 | 654 | 0.7519 | - | - | - | - |
| 0.6447 | 655 | 1.1067 | - | - | - | - |
| 0.6457 | 656 | 1.2561 | - | - | - | - |
| 0.6467 | 657 | 0.6509 | - | - | - | - |
| 0.6476 | 658 | 1.0201 | - | - | - | - |
| 0.6486 | 659 | 1.6782 | - | - | - | - |
| 0.6496 | 660 | 1.3718 | - | - | - | - |
| 0.6506 | 661 | 0.6883 | - | - | - | - |
| 0.6516 | 662 | 1.0951 | - | - | - | - |
| 0.6526 | 663 | 1.2543 | - | - | - | - |
| 0.6535 | 664 | 1.2208 | - | - | - | - |
| 0.6545 | 665 | 0.6009 | - | - | - | - |
| 0.6555 | 666 | 1.1146 | - | - | - | - |
| 0.6565 | 667 | 1.0411 | - | - | - | - |
| 0.6575 | 668 | 0.6938 | - | - | - | - |
| 0.6585 | 669 | 1.0415 | - | - | - | - |
| 0.6594 | 670 | 0.4991 | - | - | - | - |
| 0.6604 | 671 | 1.4716 | - | - | - | - |
| 0.6614 | 672 | 0.745 | - | - | - | - |
| 0.6624 | 673 | 1.5687 | - | - | - | - |
| 0.6634 | 674 | 0.7606 | - | - | - | - |
| 0.6644 | 675 | 0.2446 | - | - | - | - |
| 0.6654 | 676 | 0.4829 | - | - | - | - |
| 0.6663 | 677 | 1.0112 | - | - | - | - |
| 0.6673 | 678 | 1.3718 | - | - | - | - |
| 0.6683 | 679 | 1.3441 | - | - | - | - |
| 0.6693 | 680 | 0.5089 | - | - | - | - |
| 0.6703 | 681 | 0.9052 | - | - | - | - |
| 0.6713 | 682 | 0.7006 | - | - | - | - |
| 0.6722 | 683 | 1.2755 | - | - | - | - |
| 0.6732 | 684 | 0.8308 | - | - | - | - |
| 0.6742 | 685 | 0.797 | - | - | - | - |
| 0.6752 | 686 | 0.5807 | - | - | - | - |
| 0.6762 | 687 | 0.9666 | - | - | - | - |
| 0.6772 | 688 | 1.0587 | - | - | - | - |
| 0.6781 | 689 | 1.1675 | - | - | - | - |
| 0.6791 | 690 | 0.725 | - | - | - | - |
| 0.6801 | 691 | 0.9958 | - | - | - | - |
| 0.6811 | 692 | 1.13 | - | - | - | - |
| 0.6821 | 693 | 1.6021 | - | - | - | - |
| 0.6831 | 694 | 0.8968 | - | - | - | - |
| 0.6841 | 695 | 0.9741 | - | - | - | - |
| 0.6850 | 696 | 1.1929 | - | - | - | - |
| 0.6860 | 697 | 0.6117 | - | - | - | - |
| 0.6870 | 698 | 0.9741 | - | - | - | - |
| 0.6880 | 699 | 0.9963 | - | - | - | - |
| 0.6890 | 700 | 0.6098 | - | - | - | - |
| 0.6900 | 701 | 0.9233 | - | - | - | - |
| 0.6909 | 702 | 1.4652 | - | - | - | - |
| 0.6919 | 703 | 1.3325 | - | - | - | - |
| 0.6929 | 704 | 1.1559 | - | - | - | - |
| 0.6939 | 705 | 1.021 | - | - | - | - |
| 0.6949 | 706 | 1.1437 | - | - | - | - |
| 0.6959 | 707 | 1.5533 | - | - | - | - |
| 0.6969 | 708 | 0.4733 | - | - | - | - |
| 0.6978 | 709 | 1.4539 | - | - | - | - |
| 0.6988 | 710 | 1.132 | - | - | - | - |
| 0.6998 | 711 | 1.315 | - | - | - | - |
| 0.7008 | 712 | 0.6671 | - | - | - | - |
| 0.7018 | 713 | 1.0689 | - | - | - | - |
| 0.7028 | 714 | 1.2344 | - | - | - | - |
| 0.7037 | 715 | 0.9918 | - | - | - | - |
| 0.7047 | 716 | 0.6537 | - | - | - | - |
| 0.7057 | 717 | 1.4362 | - | - | - | - |
| 0.7067 | 718 | 1.2486 | - | - | - | - |
| 0.7077 | 719 | 0.6777 | - | - | - | - |
| 0.7087 | 720 | 0.965 | - | - | - | - |
| 0.7096 | 721 | 1.1881 | - | - | - | - |
| 0.7106 | 722 | 1.2064 | - | - | - | - |
| 0.7116 | 723 | 0.5049 | - | - | - | - |
| 0.7126 | 724 | 0.7258 | - | - | - | - |
| 0.7136 | 725 | 0.458 | - | - | - | - |
| 0.7146 | 726 | 1.0756 | - | - | - | - |
| 0.7156 | 727 | 0.8171 | - | - | - | - |
| 0.7165 | 728 | 0.786 | - | - | - | - |
| 0.7175 | 729 | 1.3556 | - | - | - | - |
| 0.7185 | 730 | 1.181 | - | - | - | - |
| 0.7195 | 731 | 1.0563 | - | - | - | - |
| 0.7205 | 732 | 0.5951 | - | - | - | - |
| 0.7215 | 733 | 0.8533 | - | - | - | - |
| 0.7224 | 734 | 0.6561 | - | - | - | - |
| 0.7234 | 735 | 1.1081 | - | - | - | - |
| 0.7244 | 736 | 0.6016 | - | - | - | - |
| 0.7254 | 737 | 0.6155 | - | - | - | - |
| 0.7264 | 738 | 0.2202 | - | - | - | - |
| 0.7274 | 739 | 1.1072 | - | - | - | - |
| 0.7283 | 740 | 1.0147 | - | - | - | - |
| 0.7293 | 741 | 0.2117 | - | - | - | - |
| 0.7303 | 742 | 1.3508 | - | - | - | - |
| 0.7313 | 743 | 0.7085 | - | - | - | - |
| 0.7323 | 744 | 0.7357 | - | - | - | - |
| 0.7333 | 745 | 1.0121 | - | - | - | - |
| 0.7343 | 746 | 1.2527 | - | - | - | - |
| 0.7352 | 747 | 1.5227 | - | - | - | - |
| 0.7362 | 748 | 1.2253 | - | - | - | - |
| 0.7372 | 749 | 0.8419 | - | - | - | - |
| 0.7382 | 750 | 0.5649 | - | - | - | - |
| 0.7392 | 751 | 1.3501 | - | - | - | - |
| 0.7402 | 752 | 1.042 | - | - | - | - |
| 0.7411 | 753 | 1.1964 | - | - | - | - |
| 0.7421 | 754 | 1.1352 | - | - | - | - |
| 0.7431 | 755 | 0.8928 | - | - | - | - |
| 0.7441 | 756 | 0.7438 | - | - | - | - |
| 0.7451 | 757 | 1.4773 | - | - | - | - |
| 0.7461 | 758 | 1.196 | - | - | - | - |
| 0.7470 | 759 | 1.1562 | - | - | - | - |
| 0.7480 | 760 | 0.8362 | - | - | - | - |
| 0.7490 | 761 | 0.904 | - | - | - | - |
| 0.7500 | 762 | 0.855 | - | - | - | - |
| 0.7510 | 763 | 0.748 | - | - | - | - |
| 0.7520 | 764 | 0.6261 | - | - | - | - |
| 0.7530 | 765 | 1.1903 | 1.1807 | 0.5774 | 0.5204 | 0.7123 |
| 0.7539 | 766 | 0.8415 | - | - | - | - |
| 0.7549 | 767 | 0.712 | - | - | - | - |
| 0.7559 | 768 | 1.4149 | - | - | - | - |
| 0.7569 | 769 | 0.844 | - | - | - | - |
| 0.7579 | 770 | 0.9184 | - | - | - | - |
| 0.7589 | 771 | 0.9229 | - | - | - | - |
| 0.7598 | 772 | 1.3872 | - | - | - | - |
| 0.7608 | 773 | 0.7914 | - | - | - | - |
| 0.7618 | 774 | 0.8064 | - | - | - | - |
| 0.7628 | 775 | 1.0489 | - | - | - | - |
| 0.7638 | 776 | 1.0517 | - | - | - | - |
| 0.7648 | 777 | 0.9025 | - | - | - | - |
| 0.7657 | 778 | 0.7241 | - | - | - | - |
| 0.7667 | 779 | 1.0115 | - | - | - | - |
| 0.7677 | 780 | 1.1583 | - | - | - | - |
| 0.7687 | 781 | 1.0957 | - | - | - | - |
| 0.7697 | 782 | 0.8654 | - | - | - | - |
| 0.7707 | 783 | 1.1943 | - | - | - | - |
| 0.7717 | 784 | 0.9565 | - | - | - | - |
| 0.7726 | 785 | 1.0079 | - | - | - | - |
| 0.7736 | 786 | 1.3196 | - | - | - | - |
| 0.7746 | 787 | 0.8066 | - | - | - | - |
| 0.7756 | 788 | 1.1875 | - | - | - | - |
| 0.7766 | 789 | 0.9068 | - | - | - | - |
| 0.7776 | 790 | 0.9388 | - | - | - | - |
| 0.7785 | 791 | 1.5462 | - | - | - | - |
| 0.7795 | 792 | 0.9369 | - | - | - | - |
| 0.7805 | 793 | 1.6793 | - | - | - | - |
| 0.7815 | 794 | 1.0793 | - | - | - | - |
| 0.7825 | 795 | 0.7758 | - | - | - | - |
| 0.7835 | 796 | 0.6 | - | - | - | - |
| 0.7844 | 797 | 0.7136 | - | - | - | - |
| 0.7854 | 798 | 0.813 | - | - | - | - |
| 0.7864 | 799 | 0.8777 | - | - | - | - |
| 0.7874 | 800 | 1.119 | - | - | - | - |
| 0.7884 | 801 | 0.5711 | - | - | - | - |
| 0.7894 | 802 | 0.6798 | - | - | - | - |
| 0.7904 | 803 | 0.8154 | - | - | - | - |
| 0.7913 | 804 | 0.3272 | - | - | - | - |
| 0.7923 | 805 | 0.9906 | - | - | - | - |
| 0.7933 | 806 | 1.0634 | - | - | - | - |
| 0.7943 | 807 | 0.9913 | - | - | - | - |
| 0.7953 | 808 | 1.0392 | - | - | - | - |
| 0.7963 | 809 | 0.7832 | - | - | - | - |
| 0.7972 | 810 | 0.4475 | - | - | - | - |
| 0.7982 | 811 | 0.708 | - | - | - | - |
| 0.7992 | 812 | 0.8815 | - | - | - | - |
| 0.8002 | 813 | 1.3039 | - | - | - | - |
| 0.8012 | 814 | 1.3863 | - | - | - | - |
| 0.8022 | 815 | 1.0562 | - | - | - | - |
| 0.8031 | 816 | 0.7251 | - | - | - | - |
| 0.8041 | 817 | 0.6901 | - | - | - | - |
| 0.8051 | 818 | 0.7074 | - | - | - | - |
| 0.8061 | 819 | 0.5985 | - | - | - | - |
| 0.8071 | 820 | 0.674 | - | - | - | - |
| 0.8081 | 821 | 0.6977 | - | - | - | - |
| 0.8091 | 822 | 0.6939 | - | - | - | - |
| 0.8100 | 823 | 0.7825 | - | - | - | - |
| 0.8110 | 824 | 0.9403 | - | - | - | - |
| 0.8120 | 825 | 0.5739 | - | - | - | - |
| 0.8130 | 826 | 1.2775 | - | - | - | - |
| 0.8140 | 827 | 0.7558 | - | - | - | - |
| 0.8150 | 828 | 0.9289 | - | - | - | - |
| 0.8159 | 829 | 0.7306 | - | - | - | - |
| 0.8169 | 830 | 0.8876 | - | - | - | - |
| 0.8179 | 831 | 0.9344 | - | - | - | - |
| 0.8189 | 832 | 0.8379 | - | - | - | - |
| 0.8199 | 833 | 0.3775 | - | - | - | - |
| 0.8209 | 834 | 0.4071 | - | - | - | - |
| 0.8219 | 835 | 0.5419 | - | - | - | - |
| 0.8228 | 836 | 0.7428 | - | - | - | - |
| 0.8238 | 837 | 0.905 | - | - | - | - |
| 0.8248 | 838 | 0.605 | - | - | - | - |
| 0.8258 | 839 | 1.6087 | - | - | - | - |
| 0.8268 | 840 | 0.5758 | - | - | - | - |
| 0.8278 | 841 | 0.9991 | - | - | - | - |
| 0.8287 | 842 | 1.3015 | - | - | - | - |
| 0.8297 | 843 | 0.8529 | - | - | - | - |
| 0.8307 | 844 | 0.8257 | - | - | - | - |
| 0.8317 | 845 | 0.8513 | - | - | - | - |
| 0.8327 | 846 | 0.9995 | - | - | - | - |
| 0.8337 | 847 | 1.0182 | - | - | - | - |
| 0.8346 | 848 | 0.6523 | - | - | - | - |
| 0.8356 | 849 | 0.8436 | - | - | - | - |
| 0.8366 | 850 | 1.4555 | - | - | - | - |
| 0.8376 | 851 | 0.6176 | - | - | - | - |
| 0.8386 | 852 | 1.1224 | - | - | - | - |
| 0.8396 | 853 | 0.5743 | - | - | - | - |
| 0.8406 | 854 | 0.6488 | - | - | - | - |
| 0.8415 | 855 | 0.6553 | - | - | - | - |
| 0.8425 | 856 | 1.0901 | - | - | - | - |
| 0.8435 | 857 | 1.2568 | - | - | - | - |
| 0.8445 | 858 | 0.7643 | - | - | - | - |
| 0.8455 | 859 | 0.3966 | - | - | - | - |
| 0.8465 | 860 | 0.6586 | - | - | - | - |
| 0.8474 | 861 | 0.8597 | - | - | - | - |
| 0.8484 | 862 | 1.237 | - | - | - | - |
| 0.8494 | 863 | 0.9306 | - | - | - | - |
| 0.8504 | 864 | 0.7643 | - | - | - | - |
| 0.8514 | 865 | 0.7402 | - | - | - | - |
| 0.8524 | 866 | 0.9191 | - | - | - | - |
| 0.8533 | 867 | 0.9644 | - | - | - | - |
| 0.8543 | 868 | 0.7933 | - | - | - | - |
| 0.8553 | 869 | 1.5964 | - | - | - | - |
| 0.8563 | 870 | 0.8953 | - | - | - | - |
| 0.8573 | 871 | 1.0073 | - | - | - | - |
| 0.8583 | 872 | 0.517 | - | - | - | - |
| 0.8593 | 873 | 0.8879 | - | - | - | - |
| 0.8602 | 874 | 1.5371 | - | - | - | - |
| 0.8612 | 875 | 0.9743 | - | - | - | - |
| 0.8622 | 876 | 1.0717 | - | - | - | - |
| 0.8632 | 877 | 0.6625 | - | - | - | - |
| 0.8642 | 878 | 0.8521 | - | - | - | - |
| 0.8652 | 879 | 0.7955 | - | - | - | - |
| 0.8661 | 880 | 0.9416 | - | - | - | - |
| 0.8671 | 881 | 0.8257 | - | - | - | - |
| 0.8681 | 882 | 1.3879 | - | - | - | - |
| 0.8691 | 883 | 0.9457 | - | - | - | - |
| 0.8701 | 884 | 0.891 | - | - | - | - |
| 0.8711 | 885 | 0.9427 | - | - | - | - |
| 0.8720 | 886 | 0.8526 | - | - | - | - |
| 0.8730 | 887 | 1.2298 | - | - | - | - |
| 0.8740 | 888 | 0.6241 | - | - | - | - |
| 0.8750 | 889 | 0.7055 | - | - | - | - |
| 0.8760 | 890 | 0.9713 | - | - | - | - |
| 0.8770 | 891 | 1.0591 | - | - | - | - |
| 0.8780 | 892 | 1.0597 | - | - | - | - |
| 0.8789 | 893 | 1.1631 | - | - | - | - |
| 0.8799 | 894 | 0.6098 | - | - | - | - |
| 0.8809 | 895 | 1.1498 | - | - | - | - |
| 0.8819 | 896 | 0.5379 | - | - | - | - |
| 0.8829 | 897 | 0.7921 | - | - | - | - |
| 0.8839 | 898 | 0.9092 | - | - | - | - |
| 0.8848 | 899 | 1.0348 | - | - | - | - |
| 0.8858 | 900 | 0.9087 | - | - | - | - |
| 0.8868 | 901 | 1.5328 | - | - | - | - |
| 0.8878 | 902 | 0.8664 | - | - | - | - |
| 0.8888 | 903 | 0.6873 | - | - | - | - |
| 0.8898 | 904 | 1.1763 | - | - | - | - |
| 0.8907 | 905 | 1.2853 | - | - | - | - |
| 0.8917 | 906 | 0.8163 | - | - | - | - |
| 0.8927 | 907 | 0.7383 | - | - | - | - |
| 0.8937 | 908 | 0.7833 | - | - | - | - |
| 0.8947 | 909 | 1.078 | - | - | - | - |
| 0.8957 | 910 | 0.6647 | - | - | - | - |
| 0.8967 | 911 | 1.0016 | - | - | - | - |
| 0.8976 | 912 | 0.8432 | - | - | - | - |
| 0.8986 | 913 | 0.9927 | - | - | - | - |
| 0.8996 | 914 | 0.4985 | - | - | - | - |
| 0.9006 | 915 | 0.1726 | - | - | - | - |
| 0.9016 | 916 | 0.9437 | - | - | - | - |
| 0.9026 | 917 | 0.2565 | - | - | - | - |
| 0.9035 | 918 | 0.6238 | 1.1640 | 0.6064 | 0.5481 | 0.6999 |
| 0.9045 | 919 | 0.9311 | - | - | - | - |
| 0.9055 | 920 | 1.6868 | - | - | - | - |
| 0.9065 | 921 | 0.8606 | - | - | - | - |
| 0.9075 | 922 | 0.4508 | - | - | - | - |
| 0.9085 | 923 | 0.6556 | - | - | - | - |
| 0.9094 | 924 | 0.5244 | - | - | - | - |
| 0.9104 | 925 | 0.6512 | - | - | - | - |
| 0.9114 | 926 | 0.6594 | - | - | - | - |
| 0.9124 | 927 | 1.0091 | - | - | - | - |
| 0.9134 | 928 | 0.6465 | - | - | - | - |
| 0.9144 | 929 | 0.798 | - | - | - | - |
| 0.9154 | 930 | 1.0069 | - | - | - | - |
| 0.9163 | 931 | 0.7407 | - | - | - | - |
| 0.9173 | 932 | 0.7979 | - | - | - | - |
| 0.9183 | 933 | 1.688 | - | - | - | - |
| 0.9193 | 934 | 0.8505 | - | - | - | - |
| 0.9203 | 935 | 0.8101 | - | - | - | - |
| 0.9213 | 936 | 0.8542 | - | - | - | - |
| 0.9222 | 937 | 0.956 | - | - | - | - |
| 0.9232 | 938 | 0.7072 | - | - | - | - |
| 0.9242 | 939 | 0.6316 | - | - | - | - |
| 0.9252 | 940 | 1.3868 | - | - | - | - |
| 0.9262 | 941 | 0.3854 | - | - | - | - |
| 0.9272 | 942 | 1.3662 | - | - | - | - |
| 0.9281 | 943 | 1.0133 | - | - | - | - |
| 0.9291 | 944 | 0.8543 | - | - | - | - |
| 0.9301 | 945 | 0.9697 | - | - | - | - |
| 0.9311 | 946 | 0.7126 | - | - | - | - |
| 0.9321 | 947 | 1.0513 | - | - | - | - |
| 0.9331 | 948 | 0.6503 | - | - | - | - |
| 0.9341 | 949 | 0.894 | - | - | - | - |
| 0.9350 | 950 | 0.4608 | - | - | - | - |
| 0.9360 | 951 | 0.8966 | - | - | - | - |
| 0.9370 | 952 | 0.7412 | - | - | - | - |
| 0.9380 | 953 | 0.9692 | - | - | - | - |
| 0.9390 | 954 | 0.5728 | - | - | - | - |
| 0.9400 | 955 | 0.6812 | - | - | - | - |
| 0.9409 | 956 | 0.5401 | - | - | - | - |
| 0.9419 | 957 | 0.9315 | - | - | - | - |
| 0.9429 | 958 | 0.6438 | - | - | - | - |
| 0.9439 | 959 | 1.0856 | - | - | - | - |
| 0.9449 | 960 | 0.7523 | - | - | - | - |
| 0.9459 | 961 | 0.5826 | - | - | - | - |
| 0.9469 | 962 | 0.9469 | - | - | - | - |
| 0.9478 | 963 | 0.647 | - | - | - | - |
| 0.9488 | 964 | 1.8012 | - | - | - | - |
| 0.9498 | 965 | 0.6264 | - | - | - | - |
| 0.9508 | 966 | 1.2779 | - | - | - | - |
| 0.9518 | 967 | 0.8312 | - | - | - | - |
| 0.9528 | 968 | 0.6442 | - | - | - | - |
| 0.9537 | 969 | 0.5953 | - | - | - | - |
| 0.9547 | 970 | 0.4099 | - | - | - | - |
| 0.9557 | 971 | 0.9008 | - | - | - | - |
| 0.9567 | 972 | 1.4286 | - | - | - | - |
| 0.9577 | 973 | 0.9222 | - | - | - | - |
| 0.9587 | 974 | 0.8414 | - | - | - | - |
| 0.9596 | 975 | 0.7063 | - | - | - | - |
| 0.9606 | 976 | 0.6207 | - | - | - | - |
| 0.9616 | 977 | 0.8273 | - | - | - | - |
| 0.9626 | 978 | 0.8464 | - | - | - | - |
| 0.9636 | 979 | 0.8247 | - | - | - | - |
| 0.9646 | 980 | 0.7133 | - | - | - | - |
| 0.9656 | 981 | 0.7903 | - | - | - | - |
| 0.9665 | 982 | 1.0951 | - | - | - | - |
| 0.9675 | 983 | 1.099 | - | - | - | - |
| 0.9685 | 984 | 0.5865 | - | - | - | - |
| 0.9695 | 985 | 0.8717 | - | - | - | - |
| 0.9705 | 986 | 0.9441 | - | - | - | - |
| 0.9715 | 987 | 0.9627 | - | - | - | - |
| 0.9724 | 988 | 1.039 | - | - | - | - |
| 0.9734 | 989 | 1.2139 | - | - | - | - |
| 0.9744 | 990 | 0.6284 | - | - | - | - |
| 0.9754 | 991 | 0.914 | - | - | - | - |
| 0.9764 | 992 | 0.4021 | - | - | - | - |
| 0.9774 | 993 | 0.5822 | - | - | - | - |
| 0.9783 | 994 | 0.5666 | - | - | - | - |
| 0.9793 | 995 | 1.3606 | - | - | - | - |
| 0.9803 | 996 | 0.7811 | - | - | - | - |
| 0.9813 | 997 | 0.3886 | - | - | - | - |
| 0.9823 | 998 | 1.5192 | - | - | - | - |
| 0.9833 | 999 | 0.3559 | - | - | - | - |
| 0.9843 | 1000 | 1.1638 | - | - | - | - |
| 0.9852 | 1001 | 0.5369 | - | - | - | - |
| 0.9862 | 1002 | 0.6804 | - | - | - | - |
| 0.9872 | 1003 | 0.5193 | - | - | - | - |
| 0.9882 | 1004 | 1.1559 | - | - | - | - |
| 0.9892 | 1005 | 0.7566 | - | - | - | - |
| 0.9902 | 1006 | 0.6815 | - | - | - | - |
| 0.9911 | 1007 | 1.0367 | - | - | - | - |
| 0.9921 | 1008 | 0.5514 | - | - | - | - |
| 0.9931 | 1009 | 0.7299 | - | - | - | - |
| 0.9941 | 1010 | 1.1474 | - | - | - | - |
| 0.9951 | 1011 | 0.5969 | - | - | - | - |
| 0.9961 | 1012 | 0.575 | - | - | - | - |
| 0.9970 | 1013 | 0.6357 | - | - | - | - |
| 0.9980 | 1014 | 0.8671 | - | - | - | - |
| 0.9990 | 1015 | 0.7021 | - | - | - | - |
| 1.0000 | 1016 | 0.6608 | - | - | - | - |
| 1.0010 | 1017 | 1.0985 | - | - | - | - |
| 1.0020 | 1018 | 1.7193 | - | - | - | - |
| 1.0030 | 1019 | 0.3447 | - | - | - | - |
| 1.0039 | 1020 | 0.6574 | - | - | - | - |
| 1.0049 | 1021 | 0.7307 | - | - | - | - |
| 1.0059 | 1022 | 0.605 | - | - | - | - |
| 1.0069 | 1023 | 0.9642 | - | - | - | - |
| 1.0079 | 1024 | 0.7899 | - | - | - | - |
| 1.0089 | 1025 | 0.6818 | - | - | - | - |
| 1.0098 | 1026 | 0.9047 | - | - | - | - |
| 1.0108 | 1027 | 1.0251 | - | - | - | - |
| 1.0118 | 1028 | 0.635 | - | - | - | - |
| 1.0128 | 1029 | 0.9896 | - | - | - | - |
| 1.0138 | 1030 | 0.8988 | - | - | - | - |
| 1.0148 | 1031 | 0.6257 | - | - | - | - |
| 1.0157 | 1032 | 0.4369 | - | - | - | - |
| 1.0167 | 1033 | 0.7827 | - | - | - | - |
| 1.0177 | 1034 | 0.9601 | - | - | - | - |
| 1.0187 | 1035 | 0.9565 | - | - | - | - |
| 1.0197 | 1036 | 0.6667 | - | - | - | - |
| 1.0207 | 1037 | 0.4217 | - | - | - | - |
| 1.0217 | 1038 | 0.7592 | - | - | - | - |
| 1.0226 | 1039 | 0.8667 | - | - | - | - |
| 1.0236 | 1040 | 0.7705 | - | - | - | - |
| 1.0246 | 1041 | 0.9951 | - | - | - | - |
| 1.0256 | 1042 | 1.1144 | - | - | - | - |
| 1.0266 | 1043 | 1.0319 | - | - | - | - |
| 1.0276 | 1044 | 1.1595 | - | - | - | - |
| 1.0285 | 1045 | 0.6343 | - | - | - | - |
| 1.0295 | 1046 | 1.2074 | - | - | - | - |
| 1.0305 | 1047 | 0.8404 | - | - | - | - |
| 1.0315 | 1048 | 1.5037 | - | - | - | - |
| 1.0325 | 1049 | 0.4995 | - | - | - | - |
| 1.0335 | 1050 | 0.568 | - | - | - | - |
| 1.0344 | 1051 | 0.7489 | - | - | - | - |
| 1.0354 | 1052 | 0.7327 | - | - | - | - |
| 1.0364 | 1053 | 1.3957 | - | - | - | - |
| 1.0374 | 1054 | 1.0428 | - | - | - | - |
| 1.0384 | 1055 | 0.7656 | - | - | - | - |
| 1.0394 | 1056 | 1.1611 | - | - | - | - |
| 1.0404 | 1057 | 0.4786 | - | - | - | - |
| 1.0413 | 1058 | 0.5765 | - | - | - | - |
| 1.0423 | 1059 | 0.9421 | - | - | - | - |
| 1.0433 | 1060 | 0.7738 | - | - | - | - |
| 1.0443 | 1061 | 0.7882 | - | - | - | - |
| 1.0453 | 1062 | 0.9898 | - | - | - | - |
| 1.0463 | 1063 | 0.7618 | - | - | - | - |
| 1.0472 | 1064 | 0.5399 | - | - | - | - |
| 1.0482 | 1065 | 0.8189 | - | - | - | - |
| 1.0492 | 1066 | 0.4776 | - | - | - | - |
| 1.0502 | 1067 | 0.4333 | - | - | - | - |
| 1.0512 | 1068 | 0.4207 | - | - | - | - |
| 1.0522 | 1069 | 1.0206 | - | - | - | - |
| 1.0531 | 1070 | 0.4865 | - | - | - | - |
| 1.0541 | 1071 | 0.897 | 1.0710 | 0.6346 | 0.5430 | 0.6916 |
| 1.0551 | 1072 | 0.8402 | - | - | - | - |
| 1.0561 | 1073 | 0.7688 | - | - | - | - |
| 1.0571 | 1074 | 0.2184 | - | - | - | - |
| 1.0581 | 1075 | 0.863 | - | - | - | - |
| 1.0591 | 1076 | 0.63 | - | - | - | - |
| 1.0600 | 1077 | 0.6715 | - | - | - | - |
| 1.0610 | 1078 | 0.5824 | - | - | - | - |
| 1.0620 | 1079 | 0.4253 | - | - | - | - |
| 1.0630 | 1080 | 0.7626 | - | - | - | - |
| 1.0640 | 1081 | 0.6314 | - | - | - | - |
| 1.0650 | 1082 | 0.6581 | - | - | - | - |
| 1.0659 | 1083 | 0.4651 | - | - | - | - |
| 1.0669 | 1084 | 1.3387 | - | - | - | - |
| 1.0679 | 1085 | 0.8808 | - | - | - | - |
| 1.0689 | 1086 | 0.7236 | - | - | - | - |
| 1.0699 | 1087 | 0.7806 | - | - | - | - |
| 1.0709 | 1088 | 1.3413 | - | - | - | - |
| 1.0719 | 1089 | 0.4676 | - | - | - | - |
| 1.0728 | 1090 | 0.3322 | - | - | - | - |
| 1.0738 | 1091 | 0.3032 | - | - | - | - |
| 1.0748 | 1092 | 0.7566 | - | - | - | - |
| 1.0758 | 1093 | 1.2515 | - | - | - | - |
| 1.0768 | 1094 | 1.1035 | - | - | - | - |
| 1.0778 | 1095 | 0.5504 | - | - | - | - |
| 1.0787 | 1096 | 1.2568 | - | - | - | - |
| 1.0797 | 1097 | 1.0059 | - | - | - | - |
| 1.0807 | 1098 | 0.9695 | - | - | - | - |
| 1.0817 | 1099 | 0.5669 | - | - | - | - |
| 1.0827 | 1100 | 0.6268 | - | - | - | - |
| 1.0837 | 1101 | 1.013 | - | - | - | - |
| 1.0846 | 1102 | 1.5633 | - | - | - | - |
| 1.0856 | 1103 | 1.3625 | - | - | - | - |
| 1.0866 | 1104 | 0.7289 | - | - | - | - |
| 1.0876 | 1105 | 1.0045 | - | - | - | - |
| 1.0886 | 1106 | 1.2376 | - | - | - | - |
| 1.0896 | 1107 | 0.4695 | - | - | - | - |
| 1.0906 | 1108 | 1.1059 | - | - | - | - |
| 1.0915 | 1109 | 0.6343 | - | - | - | - |
| 1.0925 | 1110 | 0.7101 | - | - | - | - |
| 1.0935 | 1111 | 0.6253 | - | - | - | - |
| 1.0945 | 1112 | 1.1293 | - | - | - | - |
| 1.0955 | 1113 | 0.5038 | - | - | - | - |
| 1.0965 | 1114 | 0.8907 | - | - | - | - |
| 1.0974 | 1115 | 0.553 | - | - | - | - |
| 1.0984 | 1116 | 0.8102 | - | - | - | - |
| 1.0994 | 1117 | 0.904 | - | - | - | - |
| 1.1004 | 1118 | 0.5524 | - | - | - | - |
| 1.1014 | 1119 | 1.1347 | - | - | - | - |
| 1.1024 | 1120 | 0.4371 | - | - | - | - |
| 1.1033 | 1121 | 0.875 | - | - | - | - |
| 1.1043 | 1122 | 1.3085 | - | - | - | - |
| 1.1053 | 1123 | 0.7923 | - | - | - | - |
| 1.1063 | 1124 | 0.5889 | - | - | - | - |
| 1.1073 | 1125 | 0.5114 | - | - | - | - |
| 1.1083 | 1126 | 0.6616 | - | - | - | - |
| 1.1093 | 1127 | 0.9752 | - | - | - | - |
| 1.1102 | 1128 | 0.6389 | - | - | - | - |
| 1.1112 | 1129 | 0.9866 | - | - | - | - |
| 1.1122 | 1130 | 0.365 | - | - | - | - |
| 1.1132 | 1131 | 0.6243 | - | - | - | - |
| 1.1142 | 1132 | 0.5302 | - | - | - | - |
| 1.1152 | 1133 | 0.5457 | - | - | - | - |
| 1.1161 | 1134 | 0.7722 | - | - | - | - |
| 1.1171 | 1135 | 1.2737 | - | - | - | - |
| 1.1181 | 1136 | 0.7274 | - | - | - | - |
| 1.1191 | 1137 | 0.9102 | - | - | - | - |
| 1.1201 | 1138 | 0.673 | - | - | - | - |
| 1.1211 | 1139 | 0.5895 | - | - | - | - |
| 1.1220 | 1140 | 0.6718 | - | - | - | - |
| 1.1230 | 1141 | 0.9109 | - | - | - | - |
| 1.1240 | 1142 | 0.4086 | - | - | - | - |
| 1.1250 | 1143 | 0.8096 | - | - | - | - |
| 1.1260 | 1144 | 0.6197 | - | - | - | - |
| 1.1270 | 1145 | 0.6548 | - | - | - | - |
| 1.1280 | 1146 | 0.7632 | - | - | - | - |
| 1.1289 | 1147 | 0.3288 | - | - | - | - |
| 1.1299 | 1148 | 0.6114 | - | - | - | - |
| 1.1309 | 1149 | 0.4121 | - | - | - | - |
| 1.1319 | 1150 | 0.6772 | - | - | - | - |
| 1.1329 | 1151 | 0.9555 | - | - | - | - |
| 1.1339 | 1152 | 0.5712 | - | - | - | - |
| 1.1348 | 1153 | 0.8391 | - | - | - | - |
| 1.1358 | 1154 | 0.6745 | - | - | - | - |
| 1.1368 | 1155 | 0.5267 | - | - | - | - |
| 1.1378 | 1156 | 1.0252 | - | - | - | - |
| 1.1388 | 1157 | 0.4004 | - | - | - | - |
| 1.1398 | 1158 | 0.925 | - | - | - | - |
| 1.1407 | 1159 | 0.6741 | - | - | - | - |
| 1.1417 | 1160 | 0.5167 | - | - | - | - |
| 1.1427 | 1161 | 0.6953 | - | - | - | - |
| 1.1437 | 1162 | 0.5611 | - | - | - | - |
| 1.1447 | 1163 | 1.0161 | - | - | - | - |
| 1.1457 | 1164 | 1.3154 | - | - | - | - |
| 1.1467 | 1165 | 0.6765 | - | - | - | - |
| 1.1476 | 1166 | 0.8017 | - | - | - | - |
| 1.1486 | 1167 | 0.8971 | - | - | - | - |
| 1.1496 | 1168 | 0.4928 | - | - | - | - |
| 1.1506 | 1169 | 0.6463 | - | - | - | - |
| 1.1516 | 1170 | 1.1188 | - | - | - | - |
| 1.1526 | 1171 | 0.7682 | - | - | - | - |
| 1.1535 | 1172 | 0.4076 | - | - | - | - |
| 1.1545 | 1173 | 0.6429 | - | - | - | - |
| 1.1555 | 1174 | 1.1348 | - | - | - | - |
| 1.1565 | 1175 | 0.4246 | - | - | - | - |
| 1.1575 | 1176 | 0.8091 | - | - | - | - |
| 1.1585 | 1177 | 0.3452 | - | - | - | - |
| 1.1594 | 1178 | 0.7898 | - | - | - | - |
| 1.1604 | 1179 | 0.5909 | - | - | - | - |
| 1.1614 | 1180 | 1.0561 | - | - | - | - |
| 1.1624 | 1181 | 1.0296 | - | - | - | - |
| 1.1634 | 1182 | 0.5792 | - | - | - | - |
| 1.1644 | 1183 | 0.5314 | - | - | - | - |
| 1.1654 | 1184 | 0.8981 | - | - | - | - |
| 1.1663 | 1185 | 0.8561 | - | - | - | - |
| 1.1673 | 1186 | 0.6095 | - | - | - | - |
| 1.1683 | 1187 | 0.9399 | - | - | - | - |
| 1.1693 | 1188 | 1.1345 | - | - | - | - |
| 1.1703 | 1189 | 0.4627 | - | - | - | - |
| 1.1713 | 1190 | 0.6207 | - | - | - | - |
| 1.1722 | 1191 | 0.6967 | - | - | - | - |
| 1.1732 | 1192 | 0.498 | - | - | - | - |
| 1.1742 | 1193 | 0.7233 | - | - | - | - |
| 1.1752 | 1194 | 0.443 | - | - | - | - |
| 1.1762 | 1195 | 0.6022 | - | - | - | - |
| 1.1772 | 1196 | 0.5702 | - | - | - | - |
| 1.1781 | 1197 | 0.8733 | - | - | - | - |
| 1.1791 | 1198 | 0.432 | - | - | - | - |
| 1.1801 | 1199 | 0.6508 | - | - | - | - |
| 1.1811 | 1200 | 0.8595 | - | - | - | - |
| 1.1821 | 1201 | 0.6948 | - | - | - | - |
| 1.1831 | 1202 | 0.6306 | - | - | - | - |
| 1.1841 | 1203 | 0.9615 | - | - | - | - |
| 1.1850 | 1204 | 0.5652 | - | - | - | - |
| 1.1860 | 1205 | 0.4482 | - | - | - | - |
| 1.1870 | 1206 | 0.8112 | - | - | - | - |
| 1.1880 | 1207 | 0.6432 | - | - | - | - |
| 1.1890 | 1208 | 0.6797 | - | - | - | - |
| 1.1900 | 1209 | 0.4737 | - | - | - | - |
| 1.1909 | 1210 | 0.5752 | - | - | - | - |
| 1.1919 | 1211 | 0.4858 | - | - | - | - |
| 1.1929 | 1212 | 0.4213 | - | - | - | - |
| 1.1939 | 1213 | 0.3251 | - | - | - | - |
| 1.1949 | 1214 | 0.8442 | - | - | - | - |
| 1.1959 | 1215 | 0.4813 | - | - | - | - |
| 1.1969 | 1216 | 0.4635 | - | - | - | - |
| 1.1978 | 1217 | 0.4121 | - | - | - | - |
| 1.1988 | 1218 | 0.8145 | - | - | - | - |
| 1.1998 | 1219 | 1.7243 | - | - | - | - |
| 1.2008 | 1220 | 1.0789 | - | - | - | - |
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.5.0+cu121
- Accelerate: 0.34.2
- Datasets: 3.0.2
- Tokenizers: 0.19.1
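The versions listed above can be pinned in one command to reproduce the training environment (a convenience sketch; the `+cu121` PyTorch build may additionally require the matching CUDA wheel index for your platform):

```shell
# Pin the exact library versions from the Framework Versions list.
# For the CUDA 12.1 PyTorch build, you may need:
#   --extra-index-url https://download.pytorch.org/whl/cu121
pip install \
    sentence-transformers==3.2.1 \
    transformers==4.44.2 \
    torch==2.5.0 \
    accelerate==0.34.2 \
    datasets==3.0.2 \
    tokenizers==0.19.1
```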
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```