
SentenceTransformer based on avsolatorio/GIST-Embedding-v0

This is a sentence-transformers model finetuned from avsolatorio/GIST-Embedding-v0. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: avsolatorio/GIST-Embedding-v0
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Model Size: ~109M parameters (F32 safetensors)
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
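
Because pooling uses the CLS token and the final Normalize() module rescales every vector to unit length, the 768-dimensional embeddings lie on the unit sphere, so the cosine similarity listed above reduces to a plain dot product. A minimal sanity check of this, using the model id from the usage section below (the example product strings are only illustrative):

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("wjunwei/ecommerce_text_embedding")

# CLS pooling followed by Normalize() should produce unit-length 768-d vectors.
emb = model.encode(["wireless bluetooth headphones", "stainless steel water bottle"])
print(emb.shape)                    # (2, 768)
print(np.linalg.norm(emb, axis=1))  # ~[1.0, 1.0]

# For unit vectors, cosine similarity is just the dot product.
print(emb @ emb.T)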

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("wjunwei/ecommerce_text_embedding")
# Run inference
sentences = [
    'magic the gathering modern horizons  bundle gift edition  deluxe bundle with  collector booster  play boosters fullart lands  exclusive accessories give the gift of magicthe perfect present for any magic fan this gift bundle is full of exclusive accessories and the coolest cards from modern horizons  including a collector booster packed with rare cards shiny foils and altart altframe cards moderns never been more marvelousintroducing a heaping helping of exciting cards for modern one of magics most celebrated formats plus the return of competitive favorites theres something for every fan to love in modern horizons  a collector booster full of treasuresboth powerful and flashy the included collector booster is packed with only the coolest cards and is the only place fans may find foiletched cards textured foil cards or serialized cards printed with a unique number boosters for playing  building decksplay boosters are the best packs for playing magic with friends and are fun to open with a possibility of multiple rares and at least  shining foil card in every pack fullart lands  special alternateart cardevery mh bundle gift edition also comes with  traditional foil promo card featuring bundleexclusive alternate art and  of the  included land cards feature stunning fullart  foil  nonfoil exclusive accessorieseach gift bundle also comes with exclusive accessories including a special die to track your life total as you play and a sturdy box to store everything in bundle gift edition contents collector booster  play boosters  traditional foil altart card  land cards  foil  nonfoil  spindown life counter  card storage box and  reference cards',
    'backdrop stand xft photo video studio adjustable backdrop stand for parties wedding photography advertising display note balloons colorful decorations are not included meaningful family gathering cheerful and memorable birthday party romantic and warm wedding professional photography attractive advertising display and other uses just get  ft backdrop stands adjustable stable backdrop stand loading capacity up to lbkg good flexibility with width ftft  height  ft ft heavy duty spring clamp and backdrop elastic string clip to holds curtains canvas muslin projector screen or seamless paper prevent background slippage photography weight sandbag to stabilize the backdrop support system note this bag comes empty its capacity is about lbub the kit includes  pcs crossbar parts   pcs tripod bracket   pcs spring clamp   pcs backdrop elastic string clip   pcs sand bag   pc carry bag',
    'burts bees baby baby girls mittens noscratch mitts  organic cotton set of  ',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
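
The same embeddings also support semantic product search: encode the catalogue once, encode each incoming query, and rank catalogue entries by cosine similarity. A minimal sketch, with made-up query and catalogue strings for illustration:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("wjunwei/ecommerce_text_embedding")

# Illustrative catalogue entries and query; replace with real product descriptions.
catalogue = [
    "magic the gathering modern horizons bundle gift edition",
    "backdrop stand photo video studio adjustable stand for photography",
    "burts bees baby girls mittens noscratch mitts organic cotton",
]
query = "trading card game booster box gift"

corpus_embeddings = model.encode(catalogue)
query_embedding = model.encode([query])

# Cosine similarities between the query and every catalogue entry, shape [1, 3].
scores = model.similarity(query_embedding, corpus_embeddings)
best = scores.argmax().item()
print(catalogue[best], scores[0, best].item())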

Training Details

Training Dataset

Unnamed Dataset

  • Size: 5,864 training samples
  • Columns: sentence_0, sentence_1, and label
  • Approximate statistics based on the first 1000 samples:
    • sentence_0 (string): min 5 tokens, mean 144.3 tokens, max 512 tokens
    • sentence_1 (string): min 5 tokens, mean 143.08 tokens, max 512 tokens
    • label (int): 0: ~87.50%, 1: ~12.50%
  • Samples:
    • Sample 1 (label: 1):
      sentence_0: philips norelco shaver rechargeable electric shaver with popup trimmer s comfortcut blades get a clean shave thats comfortable on your skin rounded blade caps shield selfsharpening blades to gently cut hair just above skin level and help the shaver glide smoothly over your skin experience a convenient clean shave with heads that flex and float in directions the head adjusts to the curves of your face ensuring smooth contact with your skin without a lot of pressure d flex heads follow your faces contours for a clean shave popup trimmer for mustache and sideburns finish your look with the builtin trimmer ideal for maintaining your mustache and trimming your sideburns onetouch open for easy cleaning experience a convenient clean shave with heads that flex and float in directions the head adjusts to the curves of your face ensuring smooth contact with your skin without a lot of pressure minutes of cordless shaving from an hour charge thats about shaves or plug it in for instant continuous power
      sentence_1: identical to sentence_0 (the duplicated description marks a positive pair)
    • Sample 2 (label: 0):
      sentence_0: speedo girls swimsuit one piece thin straps
      sentence_1: lace tulle flower girl long dress for wedding oneck princess dresses long sleeve pageant party gown
    • Sample 3 (label: 0):
      sentence_0: pyrex blue cup rectangular plastic cover pc pack original genuine pyrex made in the usa genuine pyrex replacement lid nonporous surface does not absorb food odors flavors or stains refrigerator microwave and toprack dishwasher safe will not fit anchor hocking products included pyrex pc blue cup rectangle plastic lids lids only containers not included
      sentence_1: kpywzer vintage leather sling bag bags for men women backpack shoulder messenger crossbody outdoor travel hiking camping tactical chest pack daypack brown
  • Loss: ContrastiveTensionLoss
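
The exact training script is not published with this card, but the dataset layout above (sentence_0/sentence_1 pairs with a binary label, roughly one positive for every eight pairs) matches what ContrastiveTensionLoss expects: identical sentences form positives and randomly combined sentences form negatives. A hedged sketch of an equivalent setup with the Sentence Transformers v3 trainer; the dataset contents and output directory are illustrative:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import ContrastiveTensionLoss

model = SentenceTransformer("avsolatorio/GIST-Embedding-v0")

# Toy stand-in for the 5,864-pair training set: label 1 marks an identical
# (positive) pair, label 0 a randomly combined (negative) pair.
train_dataset = Dataset.from_dict({
    "sentence_0": ["philips norelco shaver rechargeable electric shaver",
                   "speedo girls swimsuit one piece thin straps"],
    "sentence_1": ["philips norelco shaver rechargeable electric shaver",
                   "lace tulle flower girl long dress for wedding"],
    "label": [1, 0],
})

loss = ContrastiveTensionLoss(model)

args = SentenceTransformerTrainingArguments(
    output_dir="output/ecommerce_text_embedding",  # illustrative path
    num_train_epochs=5,
    per_device_train_batch_size=8,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()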

Training Hyperparameters

Non-Default Hyperparameters

  • num_train_epochs: 5
  • multi_dataset_batch_sampler: round_robin

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch Step Training Loss
0.6821 500 7.4403
1.3643 1000 4.8536
2.0464 1500 3.8646
2.7285 2000 3.8877
3.4106 2500 3.8789
4.0928 3000 3.8052
4.7749 3500 3.8385

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.3.0+cu121
  • Accelerate: 0.31.0
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

ContrastiveTensionLoss

@inproceedings{carlsson2021semantic,
    title={Semantic Re-tuning with Contrastive Tension},
    author={Fredrik Carlsson and Amaru Cuba Gyllensten and Evangelia Gogoulou and Erik Ylip{\"a}{\"a} Hellqvist and Magnus Sahlgren},
    booktitle={International Conference on Learning Representations},
    year={2021},
    url={https://openreview.net/forum?id=Ov_sMNau-PF}
}