
SetFit with sentence-transformers/all-mpnet-base-v2

This is a SetFit model that can be used for Text Classification. This SetFit model uses sentence-transformers/all-mpnet-base-v2 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves two steps (sketched in code after the list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
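
Conceptually, the result is an embedding body feeding a scikit-learn head. A minimal sketch of this two-step structure, using hypothetical toy data and skipping the contrastive fine-tuning that SetFit performs in step 1:

from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# Step 1 (simplified): SetFit would fine-tune this body with contrastive pairs;
# here we simply load the pretrained encoder.
body = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# Step 2: fit a LogisticRegression head on the sentence embeddings.
train_texts = ["a supportive policy statement", "a confrontational statement"]  # toy data
train_labels = [0, 1]
head = LogisticRegression().fit(body.encode(train_texts), train_labels)

print(head.predict(body.encode(["a new sentence to classify"])))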

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/all-mpnet-base-v2
  • Classification head: a LogisticRegression instance
  • Number of Classes: 2

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label 0
  • 'We in the United States believe if we can promote democracy around the world, there will be more peace.'
  • 'We recognise the transformative power of technology, including digital public infrastructure, to support sustainable development in the Indo-Pacific and deliver economic and social benefits.'
  • 'This program strengthens democracy, transparency, and the rule of law in developing nations, and I ask you to fully fund this important initiative.'
Label 1
  • 'I do not ever want to ever fight a war that is unconstitutional and I am the dangerous person.'
  • "And so, we are at a moment where I really think threats to our democracy, threats to our core freedoms are very much on people's minds."
  • 'My views in opposition to the cancellation of the war debt are a matter of detailed record in many public statements and in a recent message to the Congress.'

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("pacoreyes/StanceFit")
# Run inference
preds = model("We cannot allow the world's leading sponsor of terrorism to possess the planet's most dangerous weapons.")
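
The loaded model also accepts batches, and class probabilities are available from the LogisticRegression head. A hedged sketch, assuming the standard SetFit 1.x predict / predict_proba API:

# Batch inference over several texts at once
preds = model.predict([
    "First sentence to classify.",
    "Second sentence to classify.",
])
# Per-class probabilities from the classification head
probs = model.predict_proba(["A sentence to score."])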

Training Details

Training Set Metrics

Training set   Min   Median    Max
Word count     3     23.4393   46

Label   Training Sample Count
0       486
1       486
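
These word-count statistics can be recomputed from the raw training texts; a minimal sketch, assuming a hypothetical list train_texts:

import numpy as np

# Hypothetical stand-in for the actual training sentences
train_texts = ["first training sentence", "second training sentence here"]

word_counts = [len(text.split()) for text in train_texts]
print(min(word_counts), np.median(word_counts), max(word_counts))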

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (1.003444469523018e-06, 1.003444469523018e-06)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 37
  • eval_max_steps: -1
  • load_best_model_at_end: True
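
These hyperparameters map onto SetFit's TrainingArguments. A sketch of how a comparable run might be configured, assuming SetFit 1.x and hypothetical train_dataset / eval_dataset objects:

from setfit import SetFitModel, Trainer, TrainingArguments
from sentence_transformers.losses import CosineSimilarityLoss

# Start from the same Sentence Transformer body used by this model
model = SetFitModel.from_pretrained("sentence-transformers/all-mpnet-base-v2")

args = TrainingArguments(
    batch_size=(16, 16),               # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    sampling_strategy="oversampling",
    body_learning_rate=(1.003444469523018e-06, 1.003444469523018e-06),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    warmup_proportion=0.1,
    seed=37,
    load_best_model_at_end=True,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # hypothetical datasets.Dataset with text/label columns
    eval_dataset=eval_dataset,
)
trainer.train()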

Training Results

Epoch Step Training Loss Validation Loss
0.0000 1 0.3295 -
0.0017 50 0.3132 -
0.0034 100 0.274 -
0.0051 150 0.2774 -
0.0068 200 0.2578 -
0.0084 250 0.2536 -
0.0101 300 0.3353 -
0.0118 350 0.253 -
0.0135 400 0.2865 -
0.0152 450 0.2894 -
0.0169 500 0.2554 0.2632
0.0186 550 0.2487 -
0.0203 600 0.2713 -
0.0220 650 0.2841 -
0.0237 700 0.2251 -
0.0253 750 0.2534 -
0.0270 800 0.2489 -
0.0287 850 0.2297 -
0.0304 900 0.2288 -
0.0321 950 0.211 -
0.0338 1000 0.188 0.2073
0.0355 1050 0.1488 -
0.0372 1100 0.2103 -
0.0389 1150 0.1607 -
0.0406 1200 0.0793 -
0.0422 1250 0.0968 -
0.0439 1300 0.0987 -
0.0456 1350 0.0786 -
0.0473 1400 0.0267 -
0.0490 1450 0.0432 -
0.0507 1500 0.0262 0.064
0.0524 1550 0.1269 -
0.0541 1600 0.039 -
0.0558 1650 0.0266 -
0.0575 1700 0.0455 -
0.0591 1750 0.0175 -
0.0608 1800 0.0157 -
0.0625 1850 0.0063 -
0.0642 1900 0.0146 -
0.0659 1950 0.0046 -
0.0676 2000 0.0046 0.0464 ← saved checkpoint
0.0693 2050 0.0035 -
0.0710 2100 0.0073 -
0.0727 2150 0.0012 -
0.0744 2200 0.0025 -
0.0760 2250 0.0023 -
0.0777 2300 0.0017 -
0.0794 2350 0.0012 -
0.0811 2400 0.0017 -
0.0828 2450 0.0016 -
0.0845 2500 0.0014 0.0535
0.0862 2550 0.0011 -
0.0879 2600 0.0021 -
0.0896 2650 0.0009 -
0.0913 2700 0.0008 -
0.0929 2750 0.0006 -
0.0946 2800 0.0007 -
0.0963 2850 0.0012 -
0.0980 2900 0.001 -
0.0997 2950 0.0005 -
0.1014 3000 0.0006 0.0575
0.1031 3050 0.0006 -
0.1048 3100 0.0004 -
0.1065 3150 0.0006 -
0.1082 3200 0.0005 -
0.1098 3250 0.0006 -
0.1115 3300 0.0005 -
0.1132 3350 0.0008 -
0.1149 3400 0.0003 -
0.1166 3450 0.0005 -
0.1183 3500 0.0004 0.0642
0.1200 3550 0.0006 -
0.1217 3600 0.0003 -
0.1234 3650 0.0009 -
0.1251 3700 0.0002 -
0.1267 3750 0.0003 -
0.1284 3800 0.0005 -
0.1301 3850 0.0002 -
0.1318 3900 0.0002 -
0.1335 3950 0.0005 -
0.1352 4000 0.0003 0.0697
0.1369 4050 0.0002 -
0.1386 4100 0.0002 -
0.1403 4150 0.0004 -
0.1420 4200 0.0012 -
0.1436 4250 0.0002 -
0.1453 4300 0.0002 -
0.1470 4350 0.0001 -
0.1487 4400 0.0002 -
0.1504 4450 0.0002 -
0.1521 4500 0.0003 0.0718
0.1538 4550 0.0003 -
0.1555 4600 0.0002 -
0.1572 4650 0.0002 -
0.1589 4700 0.0003 -
0.1605 4750 0.0002 -
0.1622 4800 0.0002 -
0.1639 4850 0.0002 -
0.1656 4900 0.0002 -
0.1673 4950 0.0002 -
0.1690 5000 0.0002 0.0684
0.1707 5050 0.0002 -
0.1724 5100 0.0002 -
0.1741 5150 0.0002 -
0.1758 5200 0.0003 -
0.1774 5250 0.0002 -
0.1791 5300 0.0001 -
0.1808 5350 0.0002 -
0.1825 5400 0.0001 -
0.1842 5450 0.0001 -
0.1859 5500 0.0001 0.0731
0.1876 5550 0.0002 -
0.1893 5600 0.0002 -
0.1910 5650 0.0001 -
0.1927 5700 0.0001 -
0.1943 5750 0.0001 -
0.1960 5800 0.0002 -
0.1977 5850 0.0001 -
0.1994 5900 0.0003 -
0.2011 5950 0.0002 -
0.2028 6000 0.0002 0.0724
0.2045 6050 0.0001 -
0.2062 6100 0.0001 -
0.2079 6150 0.0001 -
0.2096 6200 0.0001 -
0.2112 6250 0.0001 -
0.2129 6300 0.0002 -
0.2146 6350 0.0001 -
0.2163 6400 0.0001 -
0.2180 6450 0.0001 -
0.2197 6500 0.0001 0.0784
0.2214 6550 0.0001 -
0.2231 6600 0.0001 -
0.2248 6650 0.0001 -
0.2265 6700 0.0001 -
0.2281 6750 0.0001 -
0.2298 6800 0.0001 -
0.2315 6850 0.0001 -
0.2332 6900 0.0001 -
0.2349 6950 0.0002 -
0.2366 7000 0.0001 0.0672
0.2383 7050 0.0001 -
0.2400 7100 0.0001 -
0.2417 7150 0.0001 -
0.2434 7200 0.0001 -
0.2450 7250 0.0001 -
0.2467 7300 0.0001 -
0.2484 7350 0.0001 -
0.2501 7400 0.0001 -
0.2518 7450 0.0001 -
0.2535 7500 0.0001 0.0627
0.2552 7550 0.0001 -
0.2569 7600 0.0001 -
0.2586 7650 0.0 -
0.2603 7700 0.0001 -
0.2619 7750 0.0 -
0.2636 7800 0.0001 -
0.2653 7850 0.0001 -
0.2670 7900 0.0001 -
0.2687 7950 0.0001 -
0.2704 8000 0.0 0.0754
0.2721 8050 0.0001 -
0.2738 8100 0.0001 -
0.2755 8150 0.0 -
0.2772 8200 0.0 -
0.2788 8250 0.0 -
0.2805 8300 0.0001 -
0.2822 8350 0.0001 -
0.2839 8400 0.0001 -
0.2856 8450 0.0 -
0.2873 8500 0.0 0.0748
0.2890 8550 0.0 -
0.2907 8600 0.0 -
0.2924 8650 0.0 -
0.2941 8700 0.0 -
0.2957 8750 0.0001 -
0.2974 8800 0.0001 -
0.2991 8850 0.0001 -
0.3008 8900 0.0 -
0.3025 8950 0.0001 -
0.3042 9000 0.0001 0.057
0.3059 9050 0.0 -
0.3076 9100 0.0 -
0.3093 9150 0.0002 -
0.3110 9200 0.0 -
0.3126 9250 0.0 -
0.3143 9300 0.0 -
0.3160 9350 0.0001 -
0.3177 9400 0.0002 -
0.3194 9450 0.0 -
0.3211 9500 0.0 0.0781
0.3228 9550 0.0 -
0.3245 9600 0.0 -
0.3262 9650 0.0 -
0.3279 9700 0.0 -
0.3295 9750 0.0 -
0.3312 9800 0.0 -
0.3329 9850 0.0 -
0.3346 9900 0.0001 -
0.3363 9950 0.0 -
0.3380 10000 0.0 0.0698
0.3397 10050 0.0 -
0.3414 10100 0.0 -
0.3431 10150 0.0 -
0.3448 10200 0.0 -
0.3464 10250 0.0022 -
0.3481 10300 0.0 -
0.3498 10350 0.0001 -
0.3515 10400 0.0 -
0.3532 10450 0.0 -
0.3549 10500 0.0 0.0698
0.3566 10550 0.0 -
0.3583 10600 0.0 -
0.3600 10650 0.0 -
0.3617 10700 0.0 -
0.3633 10750 0.0 -
0.3650 10800 0.0 -
0.3667 10850 0.0 -
0.3684 10900 0.0001 -
0.3701 10950 0.0 -
0.3718 11000 0.0 0.0746
0.3735 11050 0.0 -
0.3752 11100 0.0 -
0.3769 11150 0.0001 -
0.3786 11200 0.0 -
0.3802 11250 0.0 -
0.3819 11300 0.0 -
0.3836 11350 0.0 -
0.3853 11400 0.0 -
0.3870 11450 0.0 -
0.3887 11500 0.0 0.0753
0.3904 11550 0.0 -
0.3921 11600 0.0001 -
0.3938 11650 0.0 -
0.3955 11700 0.0 -
0.3971 11750 0.0 -
0.3988 11800 0.0 -
0.4005 11850 0.0 -
0.4022 11900 0.0 -
0.4039 11950 0.0 -
0.4056 12000 0.0 0.0743
0.4073 12050 0.0 -
0.4090 12100 0.0 -
0.4107 12150 0.0 -
0.4124 12200 0.0 -
0.4140 12250 0.0 -
0.4157 12300 0.0 -
0.4174 12350 0.0 -
0.4191 12400 0.0 -
0.4208 12450 0.0 -
0.4225 12500 0.0 0.0733
0.4242 12550 0.0 -
0.4259 12600 0.0 -
0.4276 12650 0.0 -
0.4293 12700 0.0 -
0.4309 12750 0.0 -
0.4326 12800 0.0 -
0.4343 12850 0.0 -
0.4360 12900 0.0 -
0.4377 12950 0.0 -
0.4394 13000 0.0 0.072
0.4411 13050 0.0 -
0.4428 13100 0.0 -
0.4445 13150 0.0 -
0.4462 13200 0.0 -
0.4478 13250 0.0 -
0.4495 13300 0.0 -
0.4512 13350 0.0 -
0.4529 13400 0.0 -
0.4546 13450 0.0 -
0.4563 13500 0.0 0.0753
0.4580 13550 0.0 -
0.4597 13600 0.0 -
0.4614 13650 0.0 -
0.4631 13700 0.0 -
0.4647 13750 0.0 -
0.4664 13800 0.0 -
0.4681 13850 0.0 -
0.4698 13900 0.0 -
0.4715 13950 0.0 -
0.4732 14000 0.0 0.0756
0.4749 14050 0.0 -
0.4766 14100 0.0 -
0.4783 14150 0.0 -
0.4800 14200 0.0 -
0.4816 14250 0.0 -
0.4833 14300 0.0 -
0.4850 14350 0.0 -
0.4867 14400 0.0 -
0.4884 14450 0.0 -
0.4901 14500 0.0 0.0622
0.4918 14550 0.0 -
0.4935 14600 0.0 -
0.4952 14650 0.0 -
0.4969 14700 0.0 -
0.4985 14750 0.0 -
0.5002 14800 0.0 -
0.5019 14850 0.0 -
0.5036 14900 0.0 -
0.5053 14950 0.0 -
0.5070 15000 0.0 0.0676
0.5087 15050 0.0 -
0.5104 15100 0.0 -
0.5121 15150 0.0 -
0.5138 15200 0.0 -
0.5154 15250 0.0 -
0.5171 15300 0.0 -
0.5188 15350 0.0 -
0.5205 15400 0.0 -
0.5222 15450 0.0 -
0.5239 15500 0.0 0.0668
0.5256 15550 0.0 -
0.5273 15600 0.0 -
0.5290 15650 0.0 -
0.5307 15700 0.0 -
0.5323 15750 0.0 -
0.5340 15800 0.0 -
0.5357 15850 0.0 -
0.5374 15900 0.0 -
0.5391 15950 0.0 -
0.5408 16000 0.0 0.0707
0.5425 16050 0.0 -
0.5442 16100 0.0 -
0.5459 16150 0.0 -
0.5476 16200 0.0 -
0.5492 16250 0.0 -
0.5509 16300 0.0 -
0.5526 16350 0.0 -
0.5543 16400 0.0 -
0.5560 16450 0.0 -
0.5577 16500 0.0 0.0644
0.5594 16550 0.0 -
0.5611 16600 0.0 -
0.5628 16650 0.0 -
0.5645 16700 0.0 -
0.5661 16750 0.0 -
0.5678 16800 0.0 -
0.5695 16850 0.0 -
0.5712 16900 0.0 -
0.5729 16950 0.0 -
0.5746 17000 0.0 0.0742
0.5763 17050 0.0 -
0.5780 17100 0.0 -
0.5797 17150 0.0 -
0.5814 17200 0.0 -
0.5830 17250 0.0 -
0.5847 17300 0.0 -
0.5864 17350 0.0 -
0.5881 17400 0.0 -
0.5898 17450 0.0 -
0.5915 17500 0.0 0.0738
0.5932 17550 0.0 -
0.5949 17600 0.0 -
0.5966 17650 0.0 -
0.5983 17700 0.0 -
0.5999 17750 0.0 -
0.6016 17800 0.0 -
0.6033 17850 0.0 -
0.6050 17900 0.0 -
0.6067 17950 0.0 -
0.6084 18000 0.0 0.0725
0.6101 18050 0.0 -
0.6118 18100 0.0 -
0.6135 18150 0.0 -
0.6152 18200 0.0 -
0.6168 18250 0.0 -
0.6185 18300 0.0 -
0.6202 18350 0.0 -
0.6219 18400 0.0 -
0.6236 18450 0.0 -
0.6253 18500 0.0 0.0724
0.6270 18550 0.0 -
0.6287 18600 0.0 -
0.6304 18650 0.0 -
0.6321 18700 0.0 -
0.6337 18750 0.0 -
0.6354 18800 0.0 -
0.6371 18850 0.0 -
0.6388 18900 0.0 -
0.6405 18950 0.0 -
0.6422 19000 0.0 0.0622
0.6439 19050 0.0 -
0.6456 19100 0.0 -
0.6473 19150 0.0 -
0.6490 19200 0.0 -
0.6506 19250 0.0 -
0.6523 19300 0.0 -
0.6540 19350 0.0 -
0.6557 19400 0.0 -
0.6574 19450 0.0 -
0.6591 19500 0.0 0.0754
0.6608 19550 0.0 -
0.6625 19600 0.0 -
0.6642 19650 0.0 -
0.6659 19700 0.0 -
0.6675 19750 0.0 -
0.6692 19800 0.0 -
0.6709 19850 0.0 -
0.6726 19900 0.0 -
0.6743 19950 0.0 -
0.6760 20000 0.0 0.0723
0.6777 20050 0.0 -
0.6794 20100 0.0 -
0.6811 20150 0.0 -
0.6828 20200 0.0 -
0.6844 20250 0.0 -
0.6861 20300 0.0 -
0.6878 20350 0.0 -
0.6895 20400 0.0 -
0.6912 20450 0.0 -
0.6929 20500 0.0 0.0741
0.6946 20550 0.0 -
0.6963 20600 0.0 -
0.6980 20650 0.0 -
0.6997 20700 0.0 -
0.7013 20750 0.0 -
0.7030 20800 0.0 -
0.7047 20850 0.0 -
0.7064 20900 0.0 -
0.7081 20950 0.0 -
0.7098 21000 0.0 0.0733
0.7115 21050 0.0 -
0.7132 21100 0.0 -
0.7149 21150 0.0 -
0.7166 21200 0.0 -
0.7182 21250 0.0 -
0.7199 21300 0.0 -
0.7216 21350 0.0 -
0.7233 21400 0.0 -
0.7250 21450 0.0 -
0.7267 21500 0.0 0.0757
0.7284 21550 0.0 -
0.7301 21600 0.0 -
0.7318 21650 0.0 -
0.7335 21700 0.0 -
0.7351 21750 0.0 -
0.7368 21800 0.0 -
0.7385 21850 0.0 -
0.7402 21900 0.0 -
0.7419 21950 0.0 -
0.7436 22000 0.0 0.0766
0.7453 22050 0.0 -
0.7470 22100 0.0 -
0.7487 22150 0.0 -
0.7504 22200 0.0 -
0.7520 22250 0.0 -
0.7537 22300 0.0 -
0.7554 22350 0.0 -
0.7571 22400 0.0 -
0.7588 22450 0.0 -
0.7605 22500 0.0 0.0757
0.7622 22550 0.0 -
0.7639 22600 0.0 -
0.7656 22650 0.0 -
0.7673 22700 0.0 -
0.7689 22750 0.0 -
0.7706 22800 0.0 -
0.7723 22850 0.0 -
0.7740 22900 0.0 -
0.7757 22950 0.0 -
0.7774 23000 0.0 0.0755
0.7791 23050 0.0 -
0.7808 23100 0.0 -
0.7825 23150 0.0 -
0.7842 23200 0.0 -
0.7858 23250 0.0 -
0.7875 23300 0.0 -
0.7892 23350 0.0 -
0.7909 23400 0.0 -
0.7926 23450 0.0 -
0.7943 23500 0.0 0.076
0.7960 23550 0.0 -
0.7977 23600 0.0 -
0.7994 23650 0.0 -
0.8011 23700 0.0 -
0.8027 23750 0.0 -
0.8044 23800 0.0 -
0.8061 23850 0.0 -
0.8078 23900 0.0 -
0.8095 23950 0.0 -
0.8112 24000 0.0 0.0756
0.8129 24050 0.0 -
0.8146 24100 0.0 -
0.8163 24150 0.0 -
0.8180 24200 0.0 -
0.8196 24250 0.0 -
0.8213 24300 0.0 -
0.8230 24350 0.0 -
0.8247 24400 0.0 -
0.8264 24450 0.0 -
0.8281 24500 0.0 0.0759
0.8298 24550 0.0 -
0.8315 24600 0.0 -
0.8332 24650 0.0 -
0.8349 24700 0.0 -
0.8365 24750 0.0 -
0.8382 24800 0.0 -
0.8399 24850 0.0 -
0.8416 24900 0.0 -
0.8433 24950 0.0 -
0.8450 25000 0.0 0.0762
0.8467 25050 0.0 -
0.8484 25100 0.0 -
0.8501 25150 0.0 -
0.8518 25200 0.0 -
0.8534 25250 0.0 -
0.8551 25300 0.0 -
0.8568 25350 0.0 -
0.8585 25400 0.0 -
0.8602 25450 0.0 -
0.8619 25500 0.0 0.0733
0.8636 25550 0.0 -
0.8653 25600 0.0 -
0.8670 25650 0.0 -
0.8687 25700 0.0 -
0.8703 25750 0.0 -
0.8720 25800 0.0 -
0.8737 25850 0.0 -
0.8754 25900 0.0 -
0.8771 25950 0.0 -
0.8788 26000 0.0 0.0742
0.8805 26050 0.0 -
0.8822 26100 0.0 -
0.8839 26150 0.0 -
0.8856 26200 0.0 -
0.8872 26250 0.0 -
0.8889 26300 0.0 -
0.8906 26350 0.0 -
0.8923 26400 0.0 -
0.8940 26450 0.0 -
0.8957 26500 0.0 0.0756
0.8974 26550 0.0 -
0.8991 26600 0.0 -
0.9008 26650 0.0 -
0.9025 26700 0.0 -
0.9041 26750 0.0 -
0.9058 26800 0.0 -
0.9075 26850 0.0 -
0.9092 26900 0.0 -
0.9109 26950 0.0 -
0.9126 27000 0.0 0.0751
0.9143 27050 0.0 -
0.9160 27100 0.0 -
0.9177 27150 0.0 -
0.9194 27200 0.0 -
0.9210 27250 0.0 -
0.9227 27300 0.0 -
0.9244 27350 0.0 -
0.9261 27400 0.0 -
0.9278 27450 0.0 -
0.9295 27500 0.0 0.075
0.9312 27550 0.0 -
0.9329 27600 0.0 -
0.9346 27650 0.0 -
0.9363 27700 0.0 -
0.9379 27750 0.0 -
0.9396 27800 0.0 -
0.9413 27850 0.0 -
0.9430 27900 0.0 -
0.9447 27950 0.0 -
0.9464 28000 0.0 0.0725
0.9481 28050 0.0 -
0.9498 28100 0.0 -
0.9515 28150 0.0 -
0.9532 28200 0.0 -
0.9548 28250 0.0 -
0.9565 28300 0.0 -
0.9582 28350 0.0 -
0.9599 28400 0.0 -
0.9616 28450 0.0 -
0.9633 28500 0.0 0.0761
0.9650 28550 0.0 -
0.9667 28600 0.0 -
0.9684 28650 0.0 -
0.9701 28700 0.0 -
0.9717 28750 0.0 -
0.9734 28800 0.0 -
0.9751 28850 0.0 -
0.9768 28900 0.0 -
0.9785 28950 0.0 -
0.9802 29000 0.0 0.0759
0.9819 29050 0.0 -
0.9836 29100 0.0 -
0.9853 29150 0.0 -
0.9870 29200 0.0 -
0.9886 29250 0.0 -
0.9903 29300 0.0 -
0.9920 29350 0.0 -
0.9937 29400 0.0 -
0.9954 29450 0.0 -
0.9971 29500 0.0 0.0761
0.9988 29550 0.0 -
  • The row marked with ← (step 2000, validation loss 0.0464) denotes the saved checkpoint.

Framework Versions

  • Python: 3.10.11
  • SetFit: 1.0.1
  • Sentence Transformers: 2.2.2
  • Transformers: 4.25.1
  • PyTorch: 2.1.2
  • Datasets: 2.15.0
  • Tokenizers: 0.13.3
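
To approximate this environment, the listed versions can be pinned at install time (a sketch, assuming the usual PyPI package names):

pip install setfit==1.0.1 sentence-transformers==2.2.2 transformers==4.25.1 torch==2.1.2 datasets==2.15.0 tokenizers==0.13.3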

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}