
SentenceTransformer

This is a sentence-transformers model. It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Maximum Sequence Length: 1024 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: 41.5M parameters (F32 tensors)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
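
The sequence length and output dimensionality above can be checked programmatically once the model is loaded. A minimal sketch, assuming only the model ID used in the Usage section below:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("pankajrajdeo/2812371_bioformer_16L")
print(model.get_max_seq_length())                # 1024
print(model.get_sentence_embedding_dimension())  # 384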

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("pankajrajdeo/2812371_bioformer_16L")
# Run inference
sentences = [
    'Albendazol',
    'SKF-92058',
    'C0130494',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
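
Since model.similarity returns a torch tensor, the scores can be ranked directly. An illustrative follow-up for this three-sentence example (the index arithmetic is specific to ranking against the first sentence):

import torch

# Compare the remaining sentences against the first one and pick the best match
query_scores = similarities[0, 1:]
best = int(torch.argmax(query_scores)) + 1  # +1 skips the query itself
print(f"Most similar to {sentences[0]!r}: {sentences[best]!r}")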

Training Details

Training Dataset

Unnamed Dataset

  • Size: 187,491,593 training samples
  • Columns: anchor, positive, negative_id, positive_id, and negative
  • Approximate statistics based on the first 1000 samples:
    | | anchor | positive | negative_id | positive_id | negative |
    |:--|:--|:--|:--|:--|:--|
    | type | string | string | string | string | string |
    | details | min: 3 tokens, mean: 13.27 tokens, max: 247 tokens | min: 3 tokens, mean: 12.25 tokens, max: 157 tokens | min: 5 tokens, mean: 6.27 tokens, max: 7 tokens | min: 5 tokens, mean: 6.49 tokens, max: 7 tokens | min: 3 tokens, mean: 13.53 tokens, max: 118 tokens |
  • Samples:
    | anchor | positive | negative_id | positive_id | negative |
    |:--|:--|:--|:--|:--|
    | Zaburzenie metabolizmu minerałów | Distúrbio não especificado do metabolismo de minerais | C2887914 | C0154260 | Acute alcoholic hepatic failure |
    | testy funkčnosti placenty | Metoder som brukes til å vurdere morkakefunksjon. | C2350391 | C0032049 | Hjärtmuskelscintigrafi |
    | Tsefapiriin:Susc:Pt:Is:OrdQn | cefapirina:susceptibilidad:punto en el tiempo:cepa clínica:ordinal o cuantitativo: | C0942365 | C0801894 | 2 proyecciones:hallazgo:punto en el tiempo:tobillo.izquierdo:Narrativo:radiografía |
  • Loss: main.CustomTripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
        "triplet_margin": 5
    }
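
The implementation of main.CustomTripletLoss is not included in this card, but its listed parameters mirror the library's built-in triplet loss. A sketch of the equivalent built-in configuration, assuming the same distance metric and margin:

from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("pankajrajdeo/2812371_bioformer_16L")
# Built-in counterpart to the custom loss, using the parameters listed above
loss = losses.TripletLoss(
    model=model,
    distance_metric=losses.TripletDistanceMetric.EUCLIDEAN,
    triplet_margin=5,
)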
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 50
  • learning_rate: 2e-05
  • num_train_epochs: 5
  • warmup_ratio: 0.1
  • fp16: True
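
These non-default values map directly onto the library's training arguments. A minimal sketch, with output_dir as a placeholder:

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder; not taken from this card
    per_device_train_batch_size=50,
    learning_rate=2e-5,
    num_train_epochs=5,
    warmup_ratio=0.1,
    fp16=True,
)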

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 50
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss
0.5000 1875000 0.1053
0.5003 1876000 0.0899
0.5006 1877000 0.0978
0.5008 1878000 0.0928
0.5011 1879000 0.0887
0.5014 1880000 0.0921
0.5016 1881000 0.0908
0.5019 1882000 0.0925
0.5022 1883000 0.0886
0.5024 1884000 0.0924
0.5027 1885000 0.0932
0.5030 1886000 0.0938
0.5032 1887000 0.0976
0.5035 1888000 0.087
0.5038 1889000 0.0882
0.5040 1890000 0.0955
0.5043 1891000 0.0927
0.5046 1892000 0.0922
0.5048 1893000 0.086
0.5051 1894000 0.0899
0.5054 1895000 0.0941
0.5056 1896000 0.0924
0.5059 1897000 0.0941
0.5062 1898000 0.0904
0.5064 1899000 0.09
0.5067 1900000 0.0928
0.5070 1901000 0.088
0.5072 1902000 0.0924
0.5075 1903000 0.0927
0.5078 1904000 0.0912
0.5080 1905000 0.0971
0.5083 1906000 0.0973
0.5086 1907000 0.0932
0.5088 1908000 0.092
0.5091 1909000 0.0894
0.5094 1910000 0.0866
0.5096 1911000 0.0951
0.5099 1912000 0.0924
0.5102 1913000 0.0913
0.5104 1914000 0.0921
0.5107 1915000 0.0915
0.5110 1916000 0.0897
0.5112 1917000 0.0932
0.5115 1918000 0.0871
0.5118 1919000 0.0872
0.5120 1920000 0.0962
0.5123 1921000 0.0902
0.5126 1922000 0.0939
0.5128 1923000 0.0873
0.5131 1924000 0.0841
0.5134 1925000 0.0863
0.5136 1926000 0.0941
0.5139 1927000 0.0905
0.5142 1928000 0.0876
0.5144 1929000 0.0866
0.5147 1930000 0.0921
0.5150 1931000 0.0973
0.5152 1932000 0.0937
0.5155 1933000 0.0899
0.5158 1934000 0.0965
0.5160 1935000 0.0942
0.5163 1936000 0.0927
0.5166 1937000 0.0897
0.5168 1938000 0.094
0.5171 1939000 0.0874
0.5174 1940000 0.0954
0.5176 1941000 0.0904
0.5179 1942000 0.0913
0.5182 1943000 0.0891
0.5184 1944000 0.0941
0.5187 1945000 0.0908
0.5190 1946000 0.0903
0.5192 1947000 0.0957
0.5195 1948000 0.0875
0.5198 1949000 0.0895
0.5200 1950000 0.0883
0.5203 1951000 0.0942
0.5206 1952000 0.091
0.5208 1953000 0.0874
0.5211 1954000 0.0921
0.5214 1955000 0.0967
0.5216 1956000 0.0962
0.5219 1957000 0.0942
0.5222 1958000 0.0818
0.5224 1959000 0.0861
0.5227 1960000 0.0849
0.5230 1961000 0.0894
0.5232 1962000 0.101
0.5235 1963000 0.0832
0.5238 1964000 0.0901
0.5240 1965000 0.0949
0.5243 1966000 0.0942
0.5246 1967000 0.0897
0.5248 1968000 0.0894
0.5251 1969000 0.0846
0.5254 1970000 0.087
0.5256 1971000 0.086
0.5259 1972000 0.086
0.5262 1973000 0.0913
0.5264 1974000 0.0916
0.5267 1975000 0.0867
0.5270 1976000 0.085
0.5272 1977000 0.0863
0.5275 1978000 0.0927
0.5278 1979000 0.0866
0.5280 1980000 0.0865
0.5283 1981000 0.0898
0.5286 1982000 0.0917
0.5288 1983000 0.0864
0.5291 1984000 0.0937
0.5294 1985000 0.0916
0.5296 1986000 0.0913
0.5299 1987000 0.0927
0.5302 1988000 0.0947
0.5304 1989000 0.0939
0.5307 1990000 0.0864
0.5310 1991000 0.0816
0.5312 1992000 0.0931
0.5315 1993000 0.0906
0.5318 1994000 0.0907
0.5320 1995000 0.0895
0.5323 1996000 0.0913
0.5326 1997000 0.0915
0.5328 1998000 0.0909
0.5331 1999000 0.0917
0.5334 2000000 0.0828
0.5336 2001000 0.0865
0.5339 2002000 0.0864
0.5342 2003000 0.0887
0.5344 2004000 0.0871
0.5347 2005000 0.0903
0.5350 2006000 0.092
0.5352 2007000 0.083
0.5355 2008000 0.0934
0.5358 2009000 0.0885
0.5360 2010000 0.0841
0.5363 2011000 0.0919
0.5366 2012000 0.0909
0.5368 2013000 0.0899
0.5371 2014000 0.0905
0.5374 2015000 0.0917
0.5376 2016000 0.0936
0.5379 2017000 0.0926
0.5382 2018000 0.0884
0.5384 2019000 0.0909
0.5387 2020000 0.0858
0.5390 2021000 0.0927
0.5392 2022000 0.0908
0.5395 2023000 0.0936
0.5398 2024000 0.0896
0.5400 2025000 0.0948
0.5403 2026000 0.091
0.5406 2027000 0.0917
0.5408 2028000 0.0866
0.5411 2029000 0.0925
0.5414 2030000 0.0846
0.5416 2031000 0.0878
0.5419 2032000 0.0792
0.5422 2033000 0.0872
0.5424 2034000 0.088
0.5427 2035000 0.0972
0.5430 2036000 0.081
0.5432 2037000 0.0901
0.5435 2038000 0.092
0.5438 2039000 0.0902
0.5440 2040000 0.091
0.5443 2041000 0.0876
0.5446 2042000 0.0799
0.5448 2043000 0.0921
0.5451 2044000 0.0823
0.5454 2045000 0.0846
0.5456 2046000 0.0863
0.5459 2047000 0.0893
0.5462 2048000 0.0829
0.5464 2049000 0.0913
0.5467 2050000 0.0956
0.5470 2051000 0.0879
0.5472 2052000 0.0849
0.5475 2053000 0.0931
0.5478 2054000 0.0822
0.5480 2055000 0.086
0.5483 2056000 0.0866
0.5486 2057000 0.0943
0.5488 2058000 0.0868
0.5491 2059000 0.0918
0.5494 2060000 0.0856
0.5496 2061000 0.0841
0.5499 2062000 0.0838
0.5502 2063000 0.0906
0.5504 2064000 0.0892
0.5507 2065000 0.092
0.5510 2066000 0.0917
0.5512 2067000 0.0929
0.5515 2068000 0.0847
0.5518 2069000 0.0862
0.5520 2070000 0.0879
0.5523 2071000 0.0867
0.5526 2072000 0.0868
0.5528 2073000 0.0911
0.5531 2074000 0.0869
0.5534 2075000 0.0858
0.5536 2076000 0.0882
0.5539 2077000 0.086
0.5542 2078000 0.0868
0.5544 2079000 0.0879
0.5547 2080000 0.0847
0.5550 2081000 0.0907
0.5552 2082000 0.0897
0.5555 2083000 0.0894
0.5558 2084000 0.0939
0.5560 2085000 0.0878
0.5563 2086000 0.0885
0.5566 2087000 0.0905
0.5568 2088000 0.092
0.5571 2089000 0.0845
0.5574 2090000 0.0854
0.5576 2091000 0.0896
0.5579 2092000 0.0858
0.5582 2093000 0.0881
0.5584 2094000 0.0891
0.5587 2095000 0.0872
0.5590 2096000 0.09
0.5592 2097000 0.0835
0.5595 2098000 0.0911
0.5598 2099000 0.0909
0.5600 2100000 0.087
0.5603 2101000 0.099
0.5606 2102000 0.0855
0.5608 2103000 0.0883
0.5611 2104000 0.0919
0.5614 2105000 0.0906
0.5616 2106000 0.0925
0.5619 2107000 0.0874
0.5622 2108000 0.0901
0.5624 2109000 0.0839
0.5627 2110000 0.0882
0.5630 2111000 0.0851
0.5632 2112000 0.0902
0.5635 2113000 0.0874
0.5638 2114000 0.0875
0.5640 2115000 0.0866
0.5643 2116000 0.0909
0.5646 2117000 0.0905
0.5648 2118000 0.0915
0.5651 2119000 0.0871
0.5654 2120000 0.0823
0.5656 2121000 0.0923
0.5659 2122000 0.0886
0.5662 2123000 0.0824
0.5664 2124000 0.0871
0.5667 2125000 0.0808
0.5670 2126000 0.0897
0.5672 2127000 0.0862
0.5675 2128000 0.0896
0.5678 2129000 0.09
0.5680 2130000 0.092
0.5683 2131000 0.0875
0.5686 2132000 0.0844
0.5688 2133000 0.0838
0.5691 2134000 0.0871
0.5694 2135000 0.0812
0.5696 2136000 0.0892
0.5699 2137000 0.0819
0.5702 2138000 0.0862
0.5704 2139000 0.0895
0.5707 2140000 0.0881
0.5710 2141000 0.0854
0.5712 2142000 0.0852
0.5715 2143000 0.0825
0.5718 2144000 0.0893
0.5720 2145000 0.0884
0.5723 2146000 0.0841
0.5726 2147000 0.0897
0.5728 2148000 0.0869
0.5731 2149000 0.0831
0.5734 2150000 0.0852
0.5736 2151000 0.0858
0.5739 2152000 0.0878
0.5742 2153000 0.0879
0.5744 2154000 0.08
0.5747 2155000 0.0893
0.5750 2156000 0.0868
0.5752 2157000 0.0835
0.5755 2158000 0.0832
0.5758 2159000 0.0896
0.5760 2160000 0.0856
0.5763 2161000 0.0857
0.5766 2162000 0.093
0.5768 2163000 0.0933
0.5771 2164000 0.0863
0.5774 2165000 0.0857
0.5776 2166000 0.0894
0.5779 2167000 0.0836
0.5782 2168000 0.0893
0.5784 2169000 0.0803
0.5787 2170000 0.081
0.5790 2171000 0.089
0.5792 2172000 0.0829
0.5795 2173000 0.0884
0.5798 2174000 0.0852
0.5800 2175000 0.0798
0.5803 2176000 0.0752
0.5806 2177000 0.0828
0.5808 2178000 0.0848
0.5811 2179000 0.0895
0.5814 2180000 0.0846
0.5816 2181000 0.0841
0.5819 2182000 0.0868
0.5822 2183000 0.0885
0.5824 2184000 0.0874
0.5827 2185000 0.0865
0.5830 2186000 0.0838
0.5832 2187000 0.081
0.5835 2188000 0.0829
0.5838 2189000 0.0801
0.5840 2190000 0.0861
0.5843 2191000 0.08
0.5846 2192000 0.0842
0.5848 2193000 0.0831
0.5851 2194000 0.0842
0.5854 2195000 0.0836
0.5856 2196000 0.0811
0.5859 2197000 0.0851
0.5862 2198000 0.0854
0.5864 2199000 0.0857
0.5867 2200000 0.089
0.5870 2201000 0.0794
0.5872 2202000 0.0908
0.5875 2203000 0.0852
0.5878 2204000 0.0866
0.5880 2205000 0.085
0.5883 2206000 0.0895
0.5886 2207000 0.089
0.5888 2208000 0.087
0.5891 2209000 0.0822
0.5894 2210000 0.09
0.5896 2211000 0.0858
0.5899 2212000 0.0836
0.5902 2213000 0.0837
0.5904 2214000 0.0881
0.5907 2215000 0.0789
0.5910 2216000 0.0796
0.5912 2217000 0.0834
0.5915 2218000 0.0839
0.5918 2219000 0.0787
0.5920 2220000 0.0825
0.5923 2221000 0.0863
0.5926 2222000 0.0862
0.5928 2223000 0.0837
0.5931 2224000 0.0781
0.5934 2225000 0.0867
0.5936 2226000 0.0897
0.5939 2227000 0.0825
0.5942 2228000 0.0798
0.5944 2229000 0.086
0.5947 2230000 0.0807
0.5950 2231000 0.0788
0.5952 2232000 0.0851
0.5955 2233000 0.0844
0.5958 2234000 0.0779
0.5960 2235000 0.0804
0.5963 2236000 0.0799
0.5966 2237000 0.0843
0.5968 2238000 0.0794
0.5971 2239000 0.0848
0.5974 2240000 0.0854
0.5976 2241000 0.0906
0.5979 2242000 0.0855
0.5982 2243000 0.0793
0.5984 2244000 0.0845
0.5987 2245000 0.0854
0.5990 2246000 0.0868
0.5992 2247000 0.0867
0.5995 2248000 0.0869
0.5998 2249000 0.0853
0.6000 2250000 0.0844
0.6003 2251000 0.089
0.6006 2252000 0.0789
0.6008 2253000 0.0808
0.6011 2254000 0.0854
0.6014 2255000 0.0856
0.6016 2256000 0.0874
0.6019 2257000 0.0893
0.6022 2258000 0.0772
0.6024 2259000 0.0804
0.6027 2260000 0.0903
0.6030 2261000 0.0883
0.6032 2262000 0.0841
0.6035 2263000 0.0862
0.6038 2264000 0.0806
0.6040 2265000 0.0839
0.6043 2266000 0.0816
0.6046 2267000 0.0851
0.6048 2268000 0.0786
0.6051 2269000 0.0815
0.6054 2270000 0.0875
0.6056 2271000 0.0813
0.6059 2272000 0.085
0.6062 2273000 0.0818
0.6064 2274000 0.0833
0.6067 2275000 0.0891
0.6070 2276000 0.0869
0.6072 2277000 0.0818
0.6075 2278000 0.0874
0.6078 2279000 0.0787
0.6080 2280000 0.0782
0.6083 2281000 0.0809
0.6086 2282000 0.083
0.6088 2283000 0.082
0.6091 2284000 0.0872
0.6094 2285000 0.0851
0.6096 2286000 0.087
0.6099 2287000 0.0848
0.6102 2288000 0.0821
0.6104 2289000 0.085
0.6107 2290000 0.0838
0.6110 2291000 0.081
0.6112 2292000 0.0809
0.6115 2293000 0.0781
0.6118 2294000 0.0796
0.6120 2295000 0.0828
0.6123 2296000 0.0833
0.6126 2297000 0.0859
0.6128 2298000 0.0824
0.6131 2299000 0.0825
0.6134 2300000 0.0909
0.6136 2301000 0.0856
0.6139 2302000 0.0827
0.6142 2303000 0.0842
0.6144 2304000 0.0798
0.6147 2305000 0.0797
0.6150 2306000 0.0812
0.6152 2307000 0.0812
0.6155 2308000 0.0897
0.6158 2309000 0.0833
0.6160 2310000 0.0835
0.6163 2311000 0.0848
0.6166 2312000 0.0858
0.6168 2313000 0.0738
0.6171 2314000 0.08
0.6174 2315000 0.0784
0.6176 2316000 0.0797
0.6179 2317000 0.0791
0.6182 2318000 0.0873
0.6184 2319000 0.0825
0.6187 2320000 0.0883
0.6190 2321000 0.084
0.6192 2322000 0.0801
0.6195 2323000 0.0856
0.6198 2324000 0.0764
0.6200 2325000 0.088
0.6203 2326000 0.0814
0.6206 2327000 0.0857
0.6208 2328000 0.0873
0.6211 2329000 0.0846
0.6214 2330000 0.0871
0.6216 2331000 0.0798
0.6219 2332000 0.0908
0.6222 2333000 0.0799
0.6224 2334000 0.0801
0.6227 2335000 0.0813
0.6230 2336000 0.0868
0.6232 2337000 0.0794
0.6235 2338000 0.0869
0.6238 2339000 0.0799
0.6240 2340000 0.0793
0.6243 2341000 0.0801
0.6246 2342000 0.0836
0.6248 2343000 0.0836
0.6251 2344000 0.0855
0.6254 2345000 0.0792
0.6256 2346000 0.0805
0.6259 2347000 0.0807
0.6262 2348000 0.0815
0.6264 2349000 0.0864
0.6267 2350000 0.0745
0.6270 2351000 0.0813
0.6272 2352000 0.0882
0.6275 2353000 0.0789
0.6278 2354000 0.0756
0.6280 2355000 0.0863
0.6283 2356000 0.0833
0.6286 2357000 0.0739
0.6288 2358000 0.081
0.6291 2359000 0.0776
0.6294 2360000 0.0805
0.6296 2361000 0.0806
0.6299 2362000 0.0882
0.6302 2363000 0.0823
0.6304 2364000 0.09
0.6307 2365000 0.0763
0.6310 2366000 0.0796
0.6312 2367000 0.0835
0.6315 2368000 0.0803
0.6318 2369000 0.084
0.6320 2370000 0.084
0.6323 2371000 0.076
0.6326 2372000 0.0749
0.6328 2373000 0.0795
0.6331 2374000 0.0813
0.6334 2375000 0.0825
0.6336 2376000 0.0829
0.6339 2377000 0.0818
0.6342 2378000 0.0797
0.6344 2379000 0.0846
0.6347 2380000 0.0832
0.6350 2381000 0.082
0.6352 2382000 0.0842
0.6355 2383000 0.0849
0.6358 2384000 0.08
0.6360 2385000 0.0805
0.6363 2386000 0.0787
0.6366 2387000 0.088
0.6368 2388000 0.0883
0.6371 2389000 0.0807
0.6374 2390000 0.0786
0.6376 2391000 0.0836
0.6379 2392000 0.0795
0.6382 2393000 0.0801
0.6384 2394000 0.085
0.6387 2395000 0.0815
0.6390 2396000 0.0845
0.6392 2397000 0.0798
0.6395 2398000 0.0836
0.6398 2399000 0.0803
0.6400 2400000 0.0817
0.6403 2401000 0.0894
0.6406 2402000 0.0809
0.6408 2403000 0.0761
0.6411 2404000 0.0809
0.6414 2405000 0.0777
0.6416 2406000 0.0794
0.6419 2407000 0.0787
0.6422 2408000 0.081
0.6424 2409000 0.0847
0.6427 2410000 0.0823
0.6430 2411000 0.0751
0.6432 2412000 0.0859
0.6435 2413000 0.0805
0.6438 2414000 0.082
0.6440 2415000 0.0861
0.6443 2416000 0.0842
0.6446 2417000 0.0876
0.6448 2418000 0.074
0.6451 2419000 0.0818
0.6454 2420000 0.0836
0.6456 2421000 0.082
0.6459 2422000 0.0749
0.6462 2423000 0.0865
0.6464 2424000 0.0809
0.6467 2425000 0.0854
0.6470 2426000 0.0829
0.6472 2427000 0.08
0.6475 2428000 0.0873
0.6478 2429000 0.0757
0.6480 2430000 0.0788
0.6483 2431000 0.082
0.6486 2432000 0.0834
0.6488 2433000 0.0795
0.6491 2434000 0.0859
0.6494 2435000 0.0839
0.6496 2436000 0.0874
0.6499 2437000 0.0812
0.6502 2438000 0.0824
0.6504 2439000 0.0794
0.6507 2440000 0.0795
0.6510 2441000 0.0826
0.6512 2442000 0.0813
0.6515 2443000 0.0788
0.6518 2444000 0.0848
0.6520 2445000 0.0826
0.6523 2446000 0.0762
0.6526 2447000 0.0802
0.6528 2448000 0.0871
0.6531 2449000 0.0803
0.6534 2450000 0.0797
0.6536 2451000 0.0842
0.6539 2452000 0.0819
0.6542 2453000 0.0848
0.6544 2454000 0.08
0.6547 2455000 0.0815
0.6550 2456000 0.0806
0.6552 2457000 0.0811
0.6555 2458000 0.0798
0.6558 2459000 0.0789
0.6560 2460000 0.0793
0.6563 2461000 0.0821
0.6566 2462000 0.0835
0.6568 2463000 0.0833
0.6571 2464000 0.0821
0.6574 2465000 0.088
0.6576 2466000 0.0822
0.6579 2467000 0.0749
0.6582 2468000 0.0787
0.6584 2469000 0.0793
0.6587 2470000 0.0793
0.6590 2471000 0.0807
0.6592 2472000 0.0767
0.6595 2473000 0.0823
0.6598 2474000 0.0867
0.6600 2475000 0.0834
0.6603 2476000 0.0821
0.6606 2477000 0.0787
0.6608 2478000 0.077
0.6611 2479000 0.0771
0.6614 2480000 0.0822
0.6616 2481000 0.0824
0.6619 2482000 0.0786
0.6622 2483000 0.0795
0.6624 2484000 0.0718
0.6627 2485000 0.0807
0.6630 2486000 0.0791
0.6632 2487000 0.0801
0.6635 2488000 0.0843
0.6638 2489000 0.0843
0.6640 2490000 0.0771
0.6643 2491000 0.083
0.6646 2492000 0.0824
0.6648 2493000 0.0841
0.6651 2494000 0.0823
0.6654 2495000 0.0795
0.6656 2496000 0.0825
0.6659 2497000 0.0803
0.6662 2498000 0.0843
0.6664 2499000 0.0787
0.6667 2500000 0.0817
0.6670 2501000 0.0816
0.6672 2502000 0.0793
0.6675 2503000 0.0823
0.6678 2504000 0.0764
0.6680 2505000 0.0782
0.6683 2506000 0.0807
0.6686 2507000 0.0824
0.6688 2508000 0.0768
0.6691 2509000 0.0859
0.6694 2510000 0.0791
0.6696 2511000 0.0789
0.6699 2512000 0.0848
0.6702 2513000 0.0749
0.6704 2514000 0.0776
0.6707 2515000 0.0735
0.6710 2516000 0.0778
0.6712 2517000 0.0801
0.6715 2518000 0.0798
0.6718 2519000 0.0784
0.6720 2520000 0.0781
0.6723 2521000 0.0818
0.6726 2522000 0.0762
0.6728 2523000 0.0806
0.6731 2524000 0.0773
0.6734 2525000 0.0772
0.6736 2526000 0.0782
0.6739 2527000 0.0767
0.6742 2528000 0.0828
0.6744 2529000 0.0829
0.6747 2530000 0.0792
0.6750 2531000 0.0797
0.6752 2532000 0.0823
0.6755 2533000 0.0772
0.6758 2534000 0.0765
0.6760 2535000 0.075
0.6763 2536000 0.0786
0.6766 2537000 0.0785
0.6768 2538000 0.0877
0.6771 2539000 0.0747
0.6774 2540000 0.0755
0.6776 2541000 0.082
0.6779 2542000 0.0759
0.6782 2543000 0.0831
0.6784 2544000 0.0811
0.6787 2545000 0.0795
0.6790 2546000 0.0852
0.6792 2547000 0.0832
0.6795 2548000 0.0793
0.6798 2549000 0.0832
0.6800 2550000 0.0799
0.6803 2551000 0.0733
0.6806 2552000 0.0809
0.6808 2553000 0.0772
0.6811 2554000 0.0801
0.6814 2555000 0.0794
0.6816 2556000 0.0792
0.6819 2557000 0.0847
0.6822 2558000 0.0748
0.6824 2559000 0.0813
0.6827 2560000 0.0741
0.6830 2561000 0.0851
0.6832 2562000 0.0763
0.6835 2563000 0.0841
0.6838 2564000 0.0762
0.6840 2565000 0.0752
0.6843 2566000 0.0857
0.6846 2567000 0.0824
0.6848 2568000 0.0762
0.6851 2569000 0.0754
0.6854 2570000 0.0795
0.6856 2571000 0.0829
0.6859 2572000 0.0839
0.6862 2573000 0.0779
0.6864 2574000 0.08
0.6867 2575000 0.0722
0.6870 2576000 0.0796
0.6872 2577000 0.0831
0.6875 2578000 0.0795
0.6878 2579000 0.0827
0.6880 2580000 0.0821
0.6883 2581000 0.074
0.6886 2582000 0.0811
0.6888 2583000 0.0758
0.6891 2584000 0.0742
0.6894 2585000 0.0744
0.6896 2586000 0.081
0.6899 2587000 0.0738
0.6902 2588000 0.0844
0.6904 2589000 0.0773
0.6907 2590000 0.0756
0.6910 2591000 0.0805
0.6912 2592000 0.0812
0.6915 2593000 0.0757
0.6918 2594000 0.0802
0.6920 2595000 0.0813
0.6923 2596000 0.0769
0.6926 2597000 0.0752
0.6928 2598000 0.0843
0.6931 2599000 0.0755
0.6934 2600000 0.0837
0.6936 2601000 0.0823
0.6939 2602000 0.0728
0.6942 2603000 0.0811
0.6944 2604000 0.0802
0.6947 2605000 0.0758
0.6950 2606000 0.0797
0.6952 2607000 0.0841
0.6955 2608000 0.0788
0.6958 2609000 0.0811
0.6960 2610000 0.0788
0.6963 2611000 0.0786
0.6966 2612000 0.0722
0.6968 2613000 0.0853
0.6971 2614000 0.0755
0.6974 2615000 0.0818
0.6976 2616000 0.0792
0.6979 2617000 0.0854
0.6982 2618000 0.0735
0.6984 2619000 0.0786
0.6987 2620000 0.0805
0.6990 2621000 0.0756
0.6992 2622000 0.0792
0.6995 2623000 0.0761
0.6998 2624000 0.0762
0.7000 2625000 0.0778
0.7003 2626000 0.0826
0.7006 2627000 0.0789
0.7008 2628000 0.0786
0.7011 2629000 0.0792
0.7014 2630000 0.0816
0.7016 2631000 0.0751
0.7019 2632000 0.0729
0.7022 2633000 0.0776
0.7024 2634000 0.0823
0.7027 2635000 0.0808
0.7030 2636000 0.079
0.7032 2637000 0.0792
0.7035 2638000 0.0761
0.7038 2639000 0.0795
0.7040 2640000 0.0806
0.7043 2641000 0.0793
0.7046 2642000 0.086
0.7048 2643000 0.0765
0.7051 2644000 0.0745
0.7054 2645000 0.0771
0.7056 2646000 0.0808
0.7059 2647000 0.0805
0.7062 2648000 0.0759
0.7064 2649000 0.0709
0.7067 2650000 0.0787
0.7070 2651000 0.08
0.7072 2652000 0.0826
0.7075 2653000 0.085
0.7078 2654000 0.08
0.7080 2655000 0.0762
0.7083 2656000 0.0769
0.7086 2657000 0.0783
0.7088 2658000 0.0837
0.7091 2659000 0.0803
0.7094 2660000 0.0809
0.7096 2661000 0.0764
0.7099 2662000 0.0791
0.7102 2663000 0.0829
0.7104 2664000 0.0767
0.7107 2665000 0.0799
0.7110 2666000 0.0789
0.7112 2667000 0.0781
0.7115 2668000 0.0813
0.7118 2669000 0.0793
0.7120 2670000 0.0793
0.7123 2671000 0.0815
0.7126 2672000 0.0816
0.7128 2673000 0.0774
0.7131 2674000 0.0785
0.7134 2675000 0.0711
0.7136 2676000 0.0799
0.7139 2677000 0.0758
0.7142 2678000 0.08
0.7144 2679000 0.081
0.7147 2680000 0.0797
0.7150 2681000 0.0798
0.7152 2682000 0.0775
0.7155 2683000 0.0766
0.7158 2684000 0.0803
0.7160 2685000 0.0743
0.7163 2686000 0.0764
0.7166 2687000 0.0773
0.7168 2688000 0.0773
0.7171 2689000 0.0769
0.7174 2690000 0.0753
0.7176 2691000 0.072
0.7179 2692000 0.0779
0.7182 2693000 0.0778
0.7184 2694000 0.0743
0.7187 2695000 0.0764
0.7190 2696000 0.0762
0.7192 2697000 0.0791
0.7195 2698000 0.0804
0.7198 2699000 0.0769
0.7200 2700000 0.0787
0.7203 2701000 0.0804
0.7206 2702000 0.0746
0.7208 2703000 0.0813
0.7211 2704000 0.0783
0.7214 2705000 0.0783
0.7216 2706000 0.0748
0.7219 2707000 0.0813
0.7222 2708000 0.0885
0.7224 2709000 0.0749
0.7227 2710000 0.0812
0.7230 2711000 0.0749
0.7232 2712000 0.0787
0.7235 2713000 0.0823
0.7238 2714000 0.0754
0.7240 2715000 0.0773
0.7243 2716000 0.0774
0.7246 2717000 0.0785
0.7248 2718000 0.0813
0.7251 2719000 0.0855
0.7254 2720000 0.0812
0.7256 2721000 0.0751
0.7259 2722000 0.0778
0.7262 2723000 0.0756
0.7264 2724000 0.0808
0.7267 2725000 0.0768
0.7270 2726000 0.0775
0.7272 2727000 0.0789
0.7275 2728000 0.077
0.7278 2729000 0.0795
0.7280 2730000 0.0805
0.7283 2731000 0.069
0.7286 2732000 0.0807
0.7288 2733000 0.0806
0.7291 2734000 0.0805
0.7294 2735000 0.0746
0.7296 2736000 0.0823
0.7299 2737000 0.0752
0.7302 2738000 0.0761
0.7304 2739000 0.079
0.7307 2740000 0.0772
0.7310 2741000 0.0781
0.7312 2742000 0.0774
0.7315 2743000 0.0805
0.7318 2744000 0.0784
0.7320 2745000 0.0783
0.7323 2746000 0.0761
0.7326 2747000 0.0772
0.7328 2748000 0.0755
0.7331 2749000 0.0733
0.7334 2750000 0.0744
0.7336 2751000 0.0737
0.7339 2752000 0.0747
0.7342 2753000 0.0742
0.7344 2754000 0.0789
0.7347 2755000 0.0788
0.7350 2756000 0.0789
0.7352 2757000 0.0763
0.7355 2758000 0.0751
0.7358 2759000 0.0745
0.7360 2760000 0.0814
0.7363 2761000 0.0792
0.7366 2762000 0.0748
0.7368 2763000 0.0822
0.7371 2764000 0.0754
0.7374 2765000 0.0765
0.7376 2766000 0.074
0.7379 2767000 0.0691
0.7382 2768000 0.0754
0.7384 2769000 0.0703
0.7387 2770000 0.0795
0.7390 2771000 0.0792
0.7392 2772000 0.0741
0.7395 2773000 0.0712
0.7398 2774000 0.0713
0.7400 2775000 0.071
0.7403 2776000 0.079
0.7406 2777000 0.0737
0.7408 2778000 0.0751
0.7411 2779000 0.074
0.7414 2780000 0.0737
0.7416 2781000 0.0814
0.7419 2782000 0.0779
0.7422 2783000 0.0769
0.7424 2784000 0.0798
0.7427 2785000 0.077
0.7430 2786000 0.0713
0.7432 2787000 0.0719
0.7435 2788000 0.0776
0.7438 2789000 0.0818
0.7440 2790000 0.0763
0.7443 2791000 0.0759
0.7446 2792000 0.0753
0.7448 2793000 0.0736
0.7451 2794000 0.0801
0.7454 2795000 0.0722
0.7456 2796000 0.081
0.7459 2797000 0.0714
0.7462 2798000 0.0762
0.7464 2799000 0.0809
0.7467 2800000 0.0816
0.7470 2801000 0.0794
0.7472 2802000 0.078
0.7475 2803000 0.0758
0.7478 2804000 0.0796
0.7480 2805000 0.0763
0.7483 2806000 0.0751
0.7486 2807000 0.0741
0.7488 2808000 0.0777
0.7491 2809000 0.0795
0.7494 2810000 0.0806
0.7496 2811000 0.0768
0.7499 2812000 0.0774

Framework Versions

  • Python: 3.12.2
  • Sentence Transformers: 3.2.1
  • Transformers: 4.46.1
  • PyTorch: 2.5.0
  • Accelerate: 1.0.1
  • Datasets: 3.0.2
  • Tokenizers: 0.20.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CustomTripletLoss

@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}