
SentenceTransformer based on BAAI/bge-m3

This is a sentence-transformers model finetuned from BAAI/bge-m3. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-m3
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: sentence-transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: XLMRobertaModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
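
The three modules above mean an embedding is simply the L2-normalized [CLS] token of the XLM-RoBERTa encoder. The sketch below reproduces that computation with plain transformers; it assumes the repository's weights load directly via AutoModel, which holds for standard Sentence Transformers checkpoints.

import torch
from transformers import AutoTokenizer, AutoModel

# Assumption: the Hub repo stores standard transformers-format weights.
tokenizer = AutoTokenizer.from_pretrained("rjnClarke/bgem3-shakespeare_st_3")
encoder = AutoModel.from_pretrained("rjnClarke/bgem3-shakespeare_st_3")

batch = tokenizer(["To be, or not to be"], padding=True, truncation=True,
                  max_length=8192, return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**batch).last_hidden_state

# (1) Pooling with pooling_mode_cls_token=True: keep only the first token
cls = hidden[:, 0]
# (2) Normalize(): unit-length vectors, so dot product equals cosine similarity
embedding = torch.nn.functional.normalize(cls, p=2, dim=1)
print(embedding.shape)  # torch.Size([1, 1024])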

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("rjnClarke/bgem3-shakespeare_st_3")
# Run inference
sentences = [
    'King Henry V is preparing for an expedition to France to seek revenge on the Dauphin for mocking him, and he urges his lords to quickly gather resources and support for the impending war.',
    "That shall fly with them; for many a thousand widows\n    Shall this his mock mock of their dear husbands;    Mock mothers from their sons, mock castles down;    And some are yet ungotten and unborn    That shall have cause to curse the Dauphin's scorn.    But this lies all within the will of God,    To whom I do appeal; and in whose name,    Tell you the Dauphin, I am coming on,    To venge me as I may and to put forth    My rightful hand in a well-hallow'd cause.    So get you hence in peace; and tell the Dauphin    His jest will savour but of shallow wit,    When thousands weep more than did laugh at it.    Convey them with safe conduct. Fare you well.                                              Exeunt AMBASSADORS  EXETER. This was a merry message.  KING HENRY. We hope to make the sender blush at it.    Therefore, my lords, omit no happy hour      That may give furth'rance to our expedition;    For we have now no thought in us but France,    Save those to God, that run before our business.    Therefore let our proportions for these wars    Be soon collected, and all things thought upon    That may with reasonable swiftness ad    More feathers to our wings; for, God before,    We'll chide this Dauphin at his father's door.    Therefore let every man now task his thought    That this fair action may on foot be brought.         Exeunt\n",
    "And that great minds, of partial indulgence\n    To their benumbed wills, resist the same;    There is a law in each well-order'd nation    To curb those raging appetites that are    Most disobedient and refractory.    If Helen, then, be wife to Sparta's king-    As it is known she is-these moral laws    Of nature and of nations speak aloud    To have her back return'd. Thus to persist    In doing wrong extenuates not wrong,    But makes it much more heavy. Hector's opinion    Is this, in way of truth. Yet, ne'er the less,      My spritely brethren, I propend to you    In resolution to keep Helen still;    For 'tis a cause that hath no mean dependence    Upon our joint and several dignities.  TROILUS. Why, there you touch'd the life of our design.    Were it not glory that we more affected    Than the performance of our heaving spleens,    I would not wish a drop of Troyan blood    Spent more in her defence. But, worthy Hector,    She is a theme of honour and renown,    A spur to valiant and magnanimous deeds,    Whose present courage may beat down our foes,    And fame in time to come canonize us;    For I presume brave Hector would not lose    So rich advantage of a promis'd glory    As smiles upon the forehead of this action    For the wide world's revenue.  HECTOR. I am yours,    You valiant offspring of great Priamus.    I have a roisting challenge sent amongst      The dull and factious nobles of the Greeks    Will strike amazement to their drowsy spirits.    I was advertis'd their great general slept,\n      Whilst emulation in the army crept.\n    This, I presume, will wake him.                            Exeunt\n",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
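
Continuing from the snippet above, a typical semantic-search pattern encodes one query against a set of candidate passages and ranks them by similarity. The query and passages here are illustrative, not taken from the training data.

# Hypothetical query and candidate passages, for illustration only
query = "Why does Henry V invade France?"
passages = [
    "Tell you the Dauphin, I am coming on, To venge me as I may.",
    "Sad tidings bring I to you out of France, Of loss, of slaughter, and discomfiture.",
]

query_embedding = model.encode([query])
passage_embeddings = model.encode(passages)

# model.similarity applies the model's similarity function (cosine here)
scores = model.similarity(query_embedding, passage_embeddings)  # shape [1, 2]
best = int(scores.argmax())
print(f"Best match ({scores[0, best]:.4f}): {passages[best]}")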

Evaluation

Metrics

Information Retrieval

Metric               Value
cosine_accuracy@1    0.3823
cosine_accuracy@3    0.5235
cosine_accuracy@5    0.5825
cosine_accuracy@10   0.6564
cosine_precision@1   0.3823
cosine_precision@3   0.1745
cosine_precision@5   0.1165
cosine_precision@10  0.0656
cosine_recall@1      0.3823
cosine_recall@3      0.5235
cosine_recall@5      0.5825
cosine_recall@10     0.6564
cosine_ndcg@10       0.5142
cosine_mrr@10        0.4694
cosine_map@100       0.4766
dot_accuracy@1       0.3823
dot_accuracy@3       0.5235
dot_accuracy@5       0.5825
dot_accuracy@10      0.6564
dot_precision@1      0.3823
dot_precision@3      0.1745
dot_precision@5      0.1165
dot_precision@10     0.0656
dot_recall@1         0.3823
dot_recall@3         0.5235
dot_recall@5         0.5825
dot_recall@10        0.6564
dot_ndcg@10          0.5142
dot_mrr@10           0.4694
dot_map@100          0.4766
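
The cosine and dot-product rows are identical because the final Normalize() module emits unit-length vectors, for which dot product and cosine similarity coincide. Figures like these are typically produced with InformationRetrievalEvaluator; the sketch below shows the shape of such an evaluation with placeholder data, since the actual query/corpus/relevance mappings are not published with this card.

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("rjnClarke/bgem3-shakespeare_st_3")

# Placeholder data: query id -> text, corpus id -> text,
# and query id -> set of relevant corpus ids.
queries = {"q1": "Who urges Coriolanus to spare Rome?"}
corpus = {
    "d1": "MENENIUS. ... conjure thee to pardon Rome and thy petitionary countrymen ...",
    "d2": "Sad tidings bring I to you out of France ...",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs)
results = evaluator(model)  # dict with accuracy@k, precision@k, ndcg@10, mrr@10, map@100
print(results)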

Training Details

Training Dataset

Unnamed Dataset

  • Size: 10,352 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:
    • sentence_0: string; min: 10 tokens, mean: 26.13 tokens, max: 71 tokens
    • sentence_1: string; min: 19 tokens, mean: 408.21 tokens, max: 610 tokens
  • Samples:

    sentence_0: Who is trying to convince Coriolanus to have mercy on Rome and its citizens?
    sentence_1: Enter CORIOLANUS with AUFIDIUS CORIOLANUS. What's the matter? MENENIUS. Now, you companion, I'll say an errand for you; you shall know now that I am in estimation; you shall perceive that a Jack guardant cannot office me from my son Coriolanus. Guess but by my entertainment with him if thou stand'st not i' th' state of hanging, or of some death more long in spectatorship and crueller in suffering; behold now presently, and swoon for what's to come upon thee. The glorious gods sit in hourly synod about thy particular prosperity, and love thee no worse than thy old father Menenius does! O my son! my son! thou art preparing fire for us; look thee, here's water to quench it. I was hardly moved to come to thee; but being assured none but myself could move thee, I have been blown out of your gates with sighs, and conjure thee to pardon Rome and thy petitionary countrymen. The good gods assuage thy wrath, and turn the dregs of it upon this varlet here; this, who, like a block, hath denied my access to thee. CORIOLANUS. Away! MENENIUS. How! away! CORIOLANUS. Wife, mother, child, I know not. My affairs Are servanted to others. Though I owe My revenge properly, my remission lies In Volscian breasts. That we have been familiar, Ingrate forgetfulness shall poison rather Than pity note how much. Therefore be gone. Mine ears against your suits are stronger than Your gates against my force. Yet, for I lov'd thee, Take this along; I writ it for thy sake [Gives a letter] And would have sent it. Another word, Menenius, I will not hear thee speak. This man, Aufidius,

    sentence_0: The English nobility receive sad tidings of losses in France and the need for action.
    sentence_1: Sad tidings bring I to you out of France, Of loss, of slaughter, and discomfiture: Guienne, Champagne, Rheims, Orleans, Paris, Guysors, Poictiers, are all quite lost. BEDFORD. What say'st thou, man, before dead Henry's corse? Speak softly, or the loss of those great towns Will make him burst his lead and rise from death. GLOUCESTER. Is Paris lost? Is Rouen yielded up? If Henry were recall'd to life again, These news would cause him once more yield the ghost. EXETER. How were they lost? What treachery was us'd? MESSENGER. No treachery, but want of men and money. Amongst the soldiers this is muttered That here you maintain several factions; And whilst a field should be dispatch'd and fought, You are disputing of your generals: One would have ling'ring wars, with little cost; Another would fly swift, but wanteth wings; A third thinks, without expense at all, By guileful fair words peace may be obtain'd. Awake, awake, English nobility! Let not sloth dim your honours, new-begot. Cropp'd are the flower-de-luces in your arms; Of England's coat one half is cut away. EXETER. Were our tears wanting to this funeral, These tidings would call forth their flowing tides. BEDFORD. Me they concern; Regent I am of France. Give me my steeled coat; I'll fight for France. Away with these disgraceful wailing robes! Wounds will I lend the French instead of eyes, To weep their intermissive miseries. Enter a second MESSENGER SECOND MESSENGER. Lords, view these letters full of bad mischance.

    sentence_0: What are the main locations where the characters are headed for battle?
    sentence_1: I may dispose of him. King. With all my heart. Prince. Then brother John of Lancaster, to you This honourable bounty shall belong. Go to the Douglas and deliver him Up to his pleasure, ransomless and free. His valour shown upon our crests today Hath taught us how to cherish such high deeds, Even in the bosom of our adversaries. John. I thank your Grace for this high courtesy, Which I shall give away immediately. King. Then this remains, that we divide our power. You, son John, and my cousin Westmoreland, Towards York shall bend you with your dearest speed To meet Northumberland and the prelate Scroop, Who, as we hear, are busily in arms. Myself and you, son Harry, will towards Wales To fight with Glendower and the Earl of March. Rebellion in this laud shall lose his sway, Meeting the check of such another day; And since this business so fair is done, Let us not leave till all our own be won. Exeunt.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
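
MultipleNegativesRankingLoss treats every other sentence_1 in a batch as an in-batch negative for a given sentence_0, which is also why the no_duplicates batch sampler below matters. A minimal sketch of constructing the loss with the parameters listed above:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.util import cos_sim

model = SentenceTransformer("BAAI/bge-m3")
# scale=20.0 and cos_sim mirror the parameters listed above
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=cos_sim)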
    

Training Hyperparameters

Non-Default Hyperparameters

  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: round_robin

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: round_robin
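
Putting the pieces together, the sketch below is a hedged reconstruction of the training run from the non-default values above; the dataset is a one-pair placeholder, since the actual 10,352-pair dataset is not published.

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("BAAI/bge-m3")

# Placeholder: the real dataset has 10,352 (sentence_0, sentence_1) pairs.
train_dataset = Dataset.from_dict({
    "sentence_0": ["Who urges Coriolanus to spare Rome?"],
    "sentence_1": ["MENENIUS. ... pardon Rome and thy petitionary countrymen ..."],
})

args = SentenceTransformerTrainingArguments(
    output_dir="bgem3-shakespeare_st_3",
    num_train_epochs=3,
    per_device_train_batch_size=8,
    learning_rate=5e-5,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate in-batch negatives
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()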

Training Logs

Epoch    Step   Training Loss   cosine_map@100
0.3864    500   0.5974          -
0.7728   1000   0.5049          -
1.0      1294   -               0.4475
1.1592   1500   0.4202          -
1.5456   2000   0.2689          -
1.9320   2500   0.2452          -
2.0      2588   -               0.4758
2.3184   3000   0.1700          -
2.7048   3500   0.1301          -
3.0      3882   -               0.4766

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.42.4
  • PyTorch: 2.3.1+cu121
  • Accelerate: 0.32.1
  • Datasets: 2.19.1
  • Tokenizers: 0.19.1
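
To reproduce this environment, the listed versions can be pinned directly (an illustrative install line; the +cu121 PyTorch build additionally requires the matching CUDA wheel index):

pip install sentence-transformers==3.0.1 transformers==4.42.4 torch==2.3.1 accelerate==0.32.1 datasets==2.19.1 tokenizers==0.19.1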

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}