
SetFit with BAAI/bge-small-en-v1.5

This is a SetFit model that can be used for Text Classification. This SetFit model uses BAAI/bge-small-en-v1.5 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
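In code, the two phases look roughly like the sketch below. This is a minimal illustration using toy pairs drawn from the label examples in this card, not the actual training script (which used setfit's Trainer; see Training Details):

from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses
from sklearn.linear_model import LogisticRegression

body = SentenceTransformer("BAAI/bge-small-en-v1.5")

# Phase 1: contrastive fine-tuning. Pull same-label texts together (label 1.0)
# and push different-label texts apart (label 0.0).
pairs = [
    InputExample(texts=["trezor Trezor Crypto security made easy",
                        "cryptonewton Shelby BitGet partner"], label=1.0),
    InputExample(texts=["trezor Trezor Crypto security made easy",
                        "sbf_ftx SBF"], label=0.0),
]
loader = DataLoader(pairs, shuffle=True, batch_size=2)
body.fit(train_objectives=[(loader, losses.CosineSimilarityLoss(body))], epochs=1)

# Phase 2: fit the classification head on embeddings from the fine-tuned body.
texts = ["trezor Trezor Crypto security made easy", "sbf_ftx SBF"]
head = LogisticRegression().fit(body.encode(texts), ["ORGANIZATIONAL", "INDIVIDUAL"])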

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: BAAI/bge-small-en-v1.5
  • Classification head: a LogisticRegression instance
  • Maximum Sequence Length: 512 tokens
  • Number of Classes: 2 (INDIVIDUAL, ORGANIZATIONAL)
  • Model size: 33.4M parameters (F32, safetensors)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Model Labels

Label Examples
ORGANIZATIONAL
  • 'cryptonewton Shelby BitGet partner '
  • 'trezor Trezor Crypto security made easy'
  • 'forbes Forbes Sign up now for Forbes free daily newsletter for unmatched insights and exclusive reporting '
INDIVIDUAL
  • 'anbessa100 ANBESSA No paid service Never DM u'
  • 'sbf_ftx SBF '
  • 'machibigbrother Machi Big Brother '

Evaluation

Metrics

Label  Accuracy
all    0.99
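As a sketch of how this figure can be reproduced with scikit-learn (eval_texts and eval_labels below are assumed placeholders reusing the label examples above; the actual held-out split is not shipped with this card):

from sklearn.metrics import accuracy_score
from setfit import SetFitModel

model = SetFitModel.from_pretrained("kasparas12/is_organizational_model")
# Placeholder evaluation split; substitute your own held-out data.
eval_texts = ["trezor Trezor Crypto security made easy", "sbf_ftx SBF"]
eval_labels = ["ORGANIZATIONAL", "INDIVIDUAL"]
print(accuracy_score(eval_labels, model.predict(eval_texts)))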

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference:

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("kasparas12/is_organizational_model")
# Run inference
preds = model("tradermayne Mayne ")
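If you need class probabilities rather than hard labels, SetFitModel also exposes predict_proba; a brief sketch (the probability columns follow the class order stored in model.labels):

# Per-class probabilities instead of hard predictions
proba = model.predict_proba(["tradermayne Mayne "])
print(model.labels)  # class order of the probability columns
print(proba)         # one row of per-class probabilities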

Training Details

Training Set Metrics

Training set   Min   Median    Max
Word count     3     15.7338   35

Label            Training Sample Count
INDIVIDUAL       423
ORGANIZATIONAL   377
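The word-count statistics can be recomputed for any text column with a few lines of standard-library Python (texts below is a stand-in for the real training texts):

import statistics

texts = ["trezor Trezor Crypto security made easy", "sbf_ftx SBF"]  # stand-in data
counts = [len(t.split()) for t in texts]
print(min(counts), statistics.median(counts), max(counts))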

Training Hyperparameters

  • batch_size: (32, 32)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
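For reference, here is a hedged sketch of how these values map onto TrainingArguments in setfit 1.0.x; the four-row toy dataset merely stands in for the real 800-sample training set:

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Toy stand-in for the actual training data (see Training Set Metrics above).
train_ds = Dataset.from_dict({
    "text": ["trezor Trezor Crypto security made easy",
             "cryptonewton Shelby BitGet partner",
             "sbf_ftx SBF",
             "machibigbrother Machi Big Brother"],
    "label": ["ORGANIZATIONAL", "ORGANIZATIONAL", "INDIVIDUAL", "INDIVIDUAL"],
})

model = SetFitModel.from_pretrained("BAAI/bge-small-en-v1.5")
args = TrainingArguments(
    batch_size=(32, 32),             # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    end_to_end=False,
    use_amp=False,
    seed=42,
)
trainer = Trainer(model=model, args=args, train_dataset=train_ds, metric="accuracy")
trainer.train()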

Training Results

The epoch counter below resets to near zero several times, indicating that the step logs of more than one training run are concatenated; no validation loss was recorded (shown as "-").

Epoch Step Training Loss Validation Loss
0.0016 1 0.2511 -
0.0789 50 0.2505 -
0.1577 100 0.2225 -
0.2366 150 0.2103 -
0.3155 200 0.1383 -
0.3943 250 0.0329 -
0.4732 300 0.0098 -
0.5521 350 0.0034 -
0.6309 400 0.0019 -
0.7098 450 0.0015 -
0.7886 500 0.0014 -
0.8675 550 0.0012 -
0.0001 1 0.2524 -
0.0050 50 0.2115 -
0.0099 100 0.1930 -
0.0001 1 0.2424 -
0.0050 50 0.2038 -
0.0099 100 0.1782 -
0.0001 1 0.2208 -
0.0050 50 0.1931 -
0.0099 100 0.1629 -
0.0149 150 0.2716 -
0.0199 200 0.1800 -
0.0249 250 0.2504 -
0.0298 300 0.1936 -
0.0348 350 0.1764 -
0.0398 400 0.1817 -
0.0447 450 0.0624 -
0.0497 500 0.1183 -
0.0547 550 0.0793 -
0.0596 600 0.0281 -
0.0646 650 0.0876 -
0.0696 700 0.1701 -
0.0746 750 0.0468 -
0.0795 800 0.0525 -
0.0845 850 0.0783 -
0.0895 900 0.0342 -
0.0944 950 0.0158 -
0.0994 1000 0.0286 -
0.1044 1050 0.0016 -
0.1094 1100 0.0014 -
0.1143 1150 0.0298 -
0.1193 1200 0.0180 -
0.1243 1250 0.0299 -
0.1292 1300 0.0019 -
0.1342 1350 0.0253 -
0.1392 1400 0.0009 -
0.1441 1450 0.0009 -
0.1491 1500 0.0011 -
0.1541 1550 0.0006 -
0.1591 1600 0.0006 -
0.1640 1650 0.0008 -
0.1690 1700 0.0005 -
0.1740 1750 0.0007 -
0.1789 1800 0.0006 -
0.1839 1850 0.0006 -
0.1889 1900 0.0006 -
0.1939 1950 0.0012 -
0.1988 2000 0.0004 -
0.2038 2050 0.0006 -
0.2088 2100 0.0005 -
0.2137 2150 0.0005 -
0.2187 2200 0.0005 -
0.2237 2250 0.0004 -
0.2287 2300 0.0005 -
0.2336 2350 0.0004 -
0.2386 2400 0.0004 -
0.2436 2450 0.0003 -
0.2485 2500 0.0004 -
0.2535 2550 0.0004 -
0.2585 2600 0.0004 -
0.2634 2650 0.0004 -
0.2684 2700 0.0004 -
0.2734 2750 0.0004 -
0.2784 2800 0.0056 -
0.2833 2850 0.0004 -
0.2883 2900 0.0003 -
0.2933 2950 0.0003 -
0.2982 3000 0.0004 -
0.3032 3050 0.0003 -
0.3082 3100 0.0003 -
0.3132 3150 0.0003 -
0.3181 3200 0.0003 -
0.3231 3250 0.0004 -
0.3281 3300 0.0003 -
0.3330 3350 0.0003 -
0.3380 3400 0.0003 -
0.3430 3450 0.0003 -
0.3479 3500 0.0003 -
0.3529 3550 0.0003 -
0.3579 3600 0.0003 -
0.3629 3650 0.0003 -
0.3678 3700 0.0003 -
0.3728 3750 0.0004 -
0.3778 3800 0.0004 -
0.3827 3850 0.0003 -
0.3877 3900 0.0003 -
0.3927 3950 0.0003 -
0.3977 4000 0.0003 -
0.4026 4050 0.0003 -
0.4076 4100 0.0003 -
0.4126 4150 0.0003 -
0.4175 4200 0.0003 -
0.4225 4250 0.0003 -
0.4275 4300 0.0003 -
0.4324 4350 0.0003 -
0.4374 4400 0.0002 -
0.4424 4450 0.0003 -
0.4474 4500 0.0003 -
0.4523 4550 0.0003 -
0.4573 4600 0.0003 -
0.4623 4650 0.0003 -
0.4672 4700 0.0002 -
0.4722 4750 0.0002 -
0.4772 4800 0.0003 -
0.4822 4850 0.0002 -
0.4871 4900 0.0002 -
0.4921 4950 0.0002 -
0.4971 5000 0.0003 -
0.5020 5050 0.0003 -
0.5070 5100 0.0002 -
0.5120 5150 0.0003 -
0.5169 5200 0.0002 -
0.5219 5250 0.0002 -
0.5269 5300 0.0002 -
0.5319 5350 0.0002 -
0.5368 5400 0.0003 -
0.5418 5450 0.0002 -
0.5468 5500 0.0002 -
0.5517 5550 0.0002 -
0.5567 5600 0.0002 -
0.5617 5650 0.0002 -
0.5667 5700 0.0002 -
0.5716 5750 0.0002 -
0.5766 5800 0.0002 -
0.5816 5850 0.0002 -
0.5865 5900 0.0002 -
0.5915 5950 0.0002 -
0.5965 6000 0.0002 -
0.6015 6050 0.0002 -
0.6064 6100 0.0002 -
0.6114 6150 0.0002 -
0.6164 6200 0.0002 -
0.6213 6250 0.0002 -
0.6263 6300 0.0002 -
0.6313 6350 0.0002 -
0.6362 6400 0.0002 -
0.6412 6450 0.0002 -
0.6462 6500 0.0002 -
0.6512 6550 0.0002 -
0.6561 6600 0.0002 -
0.6611 6650 0.0002 -
0.6661 6700 0.0002 -
0.6710 6750 0.0002 -
0.6760 6800 0.0002 -
0.6810 6850 0.0002 -
0.6860 6900 0.0002 -
0.6909 6950 0.0002 -
0.6959 7000 0.0002 -
0.7009 7050 0.0002 -
0.7058 7100 0.0002 -
0.7108 7150 0.0002 -
0.7158 7200 0.0002 -
0.7207 7250 0.0002 -
0.7257 7300 0.0002 -
0.7307 7350 0.0002 -
0.7357 7400 0.0002 -
0.7406 7450 0.0002 -
0.7456 7500 0.0002 -
0.7506 7550 0.0002 -
0.7555 7600 0.0002 -
0.7605 7650 0.0002 -
0.7655 7700 0.0248 -
0.7705 7750 0.0002 -
0.7754 7800 0.0002 -
0.7804 7850 0.0002 -
0.7854 7900 0.0002 -
0.7903 7950 0.0002 -
0.7953 8000 0.0002 -
0.8003 8050 0.0002 -
0.8052 8100 0.0002 -
0.8102 8150 0.0002 -
0.8152 8200 0.0002 -
0.8202 8250 0.0002 -
0.8251 8300 0.0002 -
0.8301 8350 0.0002 -
0.8351 8400 0.0002 -
0.8400 8450 0.0001 -
0.8450 8500 0.0002 -
0.8500 8550 0.0002 -
0.8550 8600 0.0001 -
0.8599 8650 0.0002 -
0.8649 8700 0.0002 -
0.8699 8750 0.0002 -
0.8748 8800 0.0002 -
0.8798 8850 0.0002 -
0.8848 8900 0.0002 -
0.8898 8950 0.0003 -
0.8947 9000 0.0002 -
0.8997 9050 0.0001 -
0.9047 9100 0.0002 -
0.9096 9150 0.0002 -
0.9146 9200 0.0002 -
0.9196 9250 0.0002 -
0.9245 9300 0.0002 -
0.9295 9350 0.0002 -
0.9345 9400 0.0002 -
0.9395 9450 0.0002 -
0.9444 9500 0.0002 -
0.9494 9550 0.0001 -
0.9544 9600 0.0001 -
0.9593 9650 0.0002 -
0.9643 9700 0.0002 -
0.9693 9750 0.0002 -
0.9743 9800 0.0001 -
0.9792 9850 0.0002 -
0.9842 9900 0.0002 -
0.9892 9950 0.0002 -
0.9941 10000 0.0002 -
0.9991 10050 0.0002 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 2.3.1
  • Transformers: 4.35.2
  • PyTorch: 2.1.0+cu121
  • Datasets: 2.17.0
  • Tokenizers: 0.15.1
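To approximate this environment, the listed versions can be pinned at install time (the CUDA-specific PyTorch build may require the appropriate extra index URL):

pip install setfit==1.0.3 sentence-transformers==2.3.1 transformers==4.35.2 datasets==2.17.0 tokenizers==0.15.1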

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}