# SentenceTransformer based on mixedbread-ai/mxbai-embed-large-v1
This is a sentence-transformers model finetuned from mixedbread-ai/mxbai-embed-large-v1. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- Model Type: Sentence Transformer
- Base model: mixedbread-ai/mxbai-embed-large-v1
- Maximum Sequence Length: 128 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity (see the sketch below)
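For reference, the cosine similarity used here is just the dot product of the two L2-normalized embedding vectors. A minimal NumPy sketch of what it computes; the random vectors stand in for real 1024-dimensional embeddings from `model.encode(...)`:

```python
import numpy as np

def cos_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: dot product of the L2-normalized vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-ins for two 1024-dimensional embeddings
rng = np.random.default_rng(0)
a, b = rng.standard_normal(1024), rng.standard_normal(1024)
print(cos_sim(a, b))  # in [-1, 1]; higher means more similar
```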
### Model Sources
- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
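The Pooling module is configured for mean pooling (`pooling_mode_mean_tokens: True`): the sentence embedding is the average of the BertModel's token embeddings over non-padding positions, for inputs truncated to 128 tokens. A minimal PyTorch sketch of that operation:

```python
import torch

def mean_pooling(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average the token embeddings of real (non-padding) tokens.

    token_embeddings: (batch, seq_len, 1024), the BertModel output
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)  # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)                   # (batch, 1024)
    counts = mask.sum(dim=1).clamp(min=1e-9)                        # avoid division by zero
    return summed / counts
```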
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Daxtra/sbert-trained-on-whs")

# Run inference
sentences = [
    'Responsible for lease negotiations and rent collections with the aim of maximising yields for rented properties.\nDealing with freedom of information requests.\nDealing with title issues.\nManage a wide range of insolvency assignments such as Fixed & Floating charge Receiverships, Members Voluntary Liquidations, Creditors Voluntary Liquidations and Court Appointed Liquidations.\nDevelopment and roll out of disposal strategies for all properties under management.\nManage the build out of a number of ghost estates on behalf of NAMA. A current project involves remediation works to 84 residential units in Co. Monaghan with a value of EUR 10 Million.\nPreparation of tenders for NAMA & other financial institutions.\nAttending meeting with borrowers and financial institutions.\nCoordinate with estate agent to ensure we are receiving maximum yields for rented properties. I am responsible for the management of in excess of 200 rental properties across various Receiverships under my remit.',
    'Preparation of budgets in order to manage cash flow throughout the various assignments.\nPreparation and submission of tax and CRO returns.\nCommunicate with Solicitors, Estate Agents and Assets Managers on a daily basis to ensure properties are brought to market and sold in a timely manner.\nDrafting monthly/quarterly reports for case managers.\nReviewing tenders received and appointment of professional service firms.\nLiaising with NAMA case managers and our internal tax department in order to determine the most tax efficient manner to dispose of properties.',
    'Realization of bedside rounds and teaching.\nProgram implementation and development which include: administrative and HR management; conception and implementation of information system; Conception, implementation and coordination of PMTCT program.\nMonthly report of activities.\nPlanning and Supervision of mortality and morbidity review (MMR).\nResponsible for communication with the pediatric Saint Damien Hospital and other existing programs in the same hospital.\nNote: This program run by NPFS/Saint Damien and funded by Francesca Rava foundation at\nSupervision of the staff (12 Obstetricians, 7 anesthetists, 16 nurse midwives, 6 auxiliary midwives, 1 administrative assistant , 1 data clerk etc.)\nPerformance of ultrasound\nClinical work according to day time schedule\nPerformance of surgical procedures',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
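The same embeddings support semantic search: encode a query and rank the corpus texts by cosine similarity. A minimal sketch reusing `model`, `sentences`, and `embeddings` from above; the query string is illustrative:

```python
# Hypothetical query to rank the three example texts against
query = "experience managing insolvency and rental property portfolios"
query_embedding = model.encode(query)

scores = model.similarity(query_embedding, embeddings)  # shape [1, 3]
ranking = scores[0].argsort(descending=True)
for idx in ranking:
    i = int(idx)
    print(f"{scores[0, i].item():.4f}  {sentences[i][:60]}...")
```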
## Training Details
### Training Dataset
#### Unnamed Dataset
- Size: 112,464 training samples
- Columns: `sentence_0` and `sentence_1`
- Approximate statistics based on the first 1000 samples:

|         | sentence_0 | sentence_1 |
|:--------|:-----------|:-----------|
| type    | string     | string     |
| details | min: 8 tokens<br>mean: 64.94 tokens<br>max: 128 tokens | min: 6 tokens<br>mean: 64.91 tokens<br>max: 128 tokens |
- Samples:

| sentence_0 | sentence_1 |
|:-----------|:-----------|
| Co-authored standalone Moodle site on sustainability that was marketed by the college and sold to various 3rd parties, making in excess of £ 20,000.<br>Expert-level knowledge in eLearning and Virtual Learning Environments through daily use, including Moodle and Mahara with the requirement to produced tailored courses on the technology to a diverse range of teaching professionals;<br>Delivered in excess of 20 training sessions to over 600 lecturers on a range of new innovative practices in education including innovations within the VLE which supported their continued professional development and enhanced their classroom performance with a participant satisfaction rate regularly in excess of 95%; | Administered Moodle across college, supporting learner management and championing the use of core modules to support the tracking and assessment of in excess of 15000 learners.<br>Coordinated and managed multiple syllabuses and competency tests for advanced software development course, emphasising quality of resources to promote self-guided learning in addition to more traditional approaches, increasing intake by 400% over a 3 year period, achieving pass and completion rate to in excess of 97%;<br>Improved quality of eLearning resources through the delivery of training on the use of screen-recording and simulation software in content development, increasing the use of the Virtual Learning Environment from something that would serve as a repository of worksheets to a more interactive and engaging application, appearing as the top visited pages in weekly reports;<br>Promoted to the unique position of Head Judge in Web Design by the Government-backed National Apprenticeship Service, project managing and collaborating with a team of Expert Judges nationally setting up the timetabling and resourcing of events attended by in excess of 100 student competitors; |
| Advising on best practices.<br>Installing and maintaining Windows and Linux network systems.<br>Installing, uninstalling, troubleshooting specific Software for hospital based equipment. | Hardware and software installation and desktop support.<br>Solving I.T. issues for hospital staff.<br>Backing up Data Systems, setting RAID configurations, wiring, setting up network and proxy servers. Fixing and troubleshooting desktop computers and laptops. |
| Analysis of data from the manufacture of finished goods, distribution of materials in the production of finished products;<br>Preparation of cost of production calculations for finished products;<br>Full maintenance of accounting, tax, and management accounting in accordance with the current legislation of Ukraine.<br>Accounting for cash transactions; | Maintenance of personnel documents (orders, contracts, employment records);<br>Work with primary documents (billing, acts, account invoices, tax invoices, work in Client Bank and Privat 24, carrying out banking operations, preparation of acts of reconciliation, payroll, accounting of goods and materials);<br>Preparation and submission of financial, statistical, and tax reporting. |

- Loss: `MultipleNegativesRankingLoss` with these parameters:

  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
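With this loss, each `(sentence_0, sentence_1)` pair is a positive, and the other `sentence_1` texts in the same batch serve as in-batch negatives; cosine similarities are multiplied by the scale of 20.0 before the softmax cross-entropy over candidates. A minimal sketch of instantiating it (the parameters above are in fact the Sentence Transformers defaults):

```python
from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("mixedbread-ai/mxbai-embed-large-v1")

# scale=20.0 and similarity_fct=util.cos_sim match the parameters listed above
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)
```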
### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 24
- `per_device_eval_batch_size`: 24
- `num_train_epochs`: 1
- `multi_dataset_batch_sampler`: round_robin
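A sketch of how a run with these non-default values could be wired up using the Sentence Transformers v3 trainer. The pair data here is a placeholder (the actual 112,464-pair dataset is not published) and `output_dir` is hypothetical:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

model = SentenceTransformer("mixedbread-ai/mxbai-embed-large-v1")

# Placeholder stand-ins for the (sentence_0, sentence_1) training pairs
pairs = {
    "sentence_0": ["first member of a positive pair"],
    "sentence_1": ["second member of the same pair"],
}
train_dataset = Dataset.from_dict(pairs)
eval_dataset = Dataset.from_dict(pairs)  # eval_strategy="steps" needs an eval set

args = SentenceTransformerTrainingArguments(
    output_dir="sbert-trained-on-whs",  # hypothetical
    eval_strategy="steps",
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    num_train_epochs=1,
    multi_dataset_batch_sampler="round_robin",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=losses.MultipleNegativesRankingLoss(model),
)
trainer.train()
```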
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 24
- `per_device_eval_batch_size`: 24
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin

</details>
### Training Logs

| Epoch  | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.0999 | 468  | -             |
| 0.1067 | 500  | 0.432         |
| 0.1997 | 936  | -             |
| 0.2134 | 1000 | 0.2153        |
| 0.2996 | 1404 | -             |
| 0.3201 | 1500 | 0.1997        |
| 0.3995 | 1872 | -             |
| 0.4268 | 2000 | 0.1635        |
| 0.4994 | 2340 | -             |
| 0.5335 | 2500 | 0.1573        |
| 0.5992 | 2808 | -             |
| 0.6402 | 3000 | 0.1518        |
| 0.6991 | 3276 | -             |
| 0.7469 | 3500 | 0.1359        |
| 0.7990 | 3744 | -             |
| 0.8536 | 4000 | 0.1351        |
| 0.8988 | 4212 | -             |
| 0.9603 | 4500 | 0.1187        |
| 0.9987 | 4680 | -             |
| 1.0    | 4686 | -             |
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.2.0
- Transformers: 4.44.2
- PyTorch: 2.4.1+cu121
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.19.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```