SetFit with BAAI/bge-small-en-v1.5

This is a SetFit model that can be used for text classification: it predicts the political leaning of a news sentence as left, center, or right. It uses BAAI/bge-small-en-v1.5 as the Sentence Transformer embedding model, with a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
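
A minimal sketch of this two-step procedure with SetFit's Trainer follows. The toy dataset and hyperparameter values are illustrative assumptions, not the actual training setup (the real model was trained on roughly 2,365 labeled sentences; see Training Set Metrics below):

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Toy illustrative data, not the actual training corpus.
train_ds = Dataset.from_dict({
    "text": [
        "Example sentence with a left-leaning framing.",
        "Example sentence with a right-leaning framing.",
        "Example sentence with neutral framing.",
    ],
    "label": ["left", "right", "center"],
})

# Both steps happen inside trainer.train(): contrastive fine-tuning of the
# embedding body, then fitting the LogisticRegression head on its features.
model = SetFitModel.from_pretrained("BAAI/bge-small-en-v1.5")
args = TrainingArguments(batch_size=32, num_epochs=1)
trainer = Trainer(model=model, args=args, train_dataset=train_ds)
trainer.train()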

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: BAAI/bge-small-en-v1.5 (33.4M parameters, F32)
  • Classification head: a scikit-learn LogisticRegression instance
  • Maximum Sequence Length: 512 tokens
  • Number of Classes: 3 (left, center, right)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Model Labels

Each label is illustrated below with examples drawn from the training data:

left
  • 'Tennessee has an annual sales tax-free holiday weekend that begins on the last Friday of July.'
  • 'In what could be construed as an act of treason, President Trump recently ordered such paramilitary groups and right-wing thugs to take up arms and to threaten Democratic-led state governments such as Michigan's in order to force them to "reopen" their state.'
  • 'Trump, not surprisingly, used the speech as an opportunity to attack former President Barack Obama, claiming that he did nothing to promote criminal justice reform when he was in office.'
right
  • 'In the Joe Biden-Bernie Sanders “Unity” platform, Democrats are vowing to provide free, American taxpayer-funded health care to illegal aliens who are able to enroll in former President Obama’s Deferred Action for Childhood Arrivals (DACA) program.'
  • 'The new numbers from Gallup are an unwelcome sight for Democrats after kicking off the week with a disaster caucus in Iowa who and simultaneously anticipating a Trump acquittal in the Senate. Trump will also now have the opportunity to shine in his newfound approval in Tuesday night’s address to the nation while Democrats are in disarray.'
  • 'Though Trump has successfully increased wages by four percent over the last 12 months for America’s blue collar and working class by decreasing foreign competition through a crackdown on illegal immigration, experts have warned that those wage hikes will not continue heading into the 2020 election should current illegal immigration levels keep rising at record levels.'
center
  • 'LeBron James shares thoughts on his Los Angeles house getting vandalized pic twitter com 4RFLK42xhu'
  • 'O’Rourke, a native of the U.S.-Mexican border town El Paso, has blasted Trump’s use of tariffs as a “huge mistake” and has vowed to suspend them on his first day in office.'
  • 'Here are a few people we will be reminding you about in every article that pertains to a film they re tied to '

Evaluation

Metrics

| Label | Accuracy | Precision | Recall | F1     |
|:------|:--------:|:---------:|:------:|:------:|
| all   | 0.7010   | 0.7024    | 0.7010 | 0.7016 |
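
These scores appear to be weighted averages over the three classes. A sketch of how such numbers can be reproduced with scikit-learn, assuming a labeled held-out split (the evaluation data itself is not distributed with this card, so test_texts and test_labels below are placeholders):

from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from setfit import SetFitModel

model = SetFitModel.from_pretrained("JordanTallon/Unifeed")

# Placeholder held-out split; substitute the real evaluation data.
test_texts = ["A held-out news sentence.", "Another held-out sentence."]
test_labels = ["left", "center"]

preds = model.predict(test_texts)
accuracy = accuracy_score(test_labels, preds)
precision, recall, f1, _ = precision_recall_fscore_support(
    test_labels, preds, average="weighted"
)
print(accuracy, precision, recall, f1)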

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("JordanTallon/Unifeed")
# Run inference
preds = model("A Black man, Floyd died in police custody May 25 after a Minneapolis cop kneeled on his neck for more than eight minutes.")
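
The model also accepts a batch of texts, and because the classification head is a scikit-learn LogisticRegression it can expose per-class probabilities. A short sketch (the input texts are illustrative):

# Batch inference
texts = [
    "The Senate passed the bill on a party-line vote.",
    "Officials announced the schedule for the public hearing.",
]
preds = model.predict(texts)        # labels such as 'left', 'center', or 'right'
probs = model.predict_proba(texts)  # per-class probabilities, one row per text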

Training Details

Training Set Metrics

| Training set | Min | Median  | Max |
|:-------------|:---:|:-------:|:---:|
| Word count   | 9   | 32.9560 | 90  |

| Label  | Training Sample Count |
|:-------|:---------------------:|
| center | 777                   |
| left   | 780                   |
| right  | 808                   |

Training Hyperparameters

  • batch_size: (32, 32)
  • num_epochs: (200, 200)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 1
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: True
  • warmup_proportion: 0.1
  • seed: 326
  • run_name: unifeed_bias_training
  • eval_max_steps: -1
  • load_best_model_at_end: True
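
These bullet points mirror the fields of SetFit's TrainingArguments. A sketch of how the same configuration could be reconstructed (this is not the original training script, and the dataset objects are omitted):

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(32, 32),                # (embedding phase, classifier phase)
    num_epochs=(200, 200),
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 1e-05),  # (transformer body, optional differentiable head)
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    use_amp=True,
    warmup_proportion=0.1,
    seed=326,
    load_best_model_at_end=True,
    run_name="unifeed_bias_training",
)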

Training Results

The logs below span several consecutive training runs; the epoch counter resets at the start of each run.

Epoch Step Training Loss Validation Loss
0.0002 1 0.2486 -
1.0 4878 0.0092 0.308
2.0 9756 0.0004 0.3228
3.0 14634 0.0002 0.3326
4.0 19512 0.0002 0.3191
5.0 24390 0.0001 0.3279
6.0 29268 0.0001 0.3384
7.0 34146 0.0001 0.3311
8.0 39024 0.0001 0.3316
0.0068 1 0.0007 -
1.0 148 0.0006 0.3042
2.0 296 0.0006 0.3352
3.0 444 0.0382 0.3059
4.0 592 0.0022 0.3055
5.0 740 0.0044 0.3034
6.0 888 0.0006 0.3185
7.0 1036 0.0005 0.3066
8.0 1184 0.0008 0.3196
9.0 1332 0.0004 0.326
10.0 1480 0.0004 0.352
11.0 1628 0.0005 0.3122
12.0 1776 0.0003 0.3268
13.0 1924 0.0004 0.2928
14.0 2072 0.0004 0.3148
15.0 2220 0.0003 0.3153
16.0 2368 0.0004 0.3385
17.0 2516 0.0004 0.3107
18.0 2664 0.0004 0.3225
19.0 2812 0.0003 0.3073
20.0 2960 0.0003 0.316
21.0 3108 0.0003 0.3053
22.0 3256 0.0004 0.3227
23.0 3404 0.0004 0.3099
24.0 3552 0.0003 0.3043
25.0 3700 0.0003 0.3316
0.0034 1 0.0004 -
1.0 296 0.0003 0.3321
2.0 592 0.0016 0.3202
3.0 888 0.0005 0.3376
4.0 1184 0.0004 0.3167
5.0 1480 0.0003 0.3342
6.0 1776 0.0003 0.3183
7.0 2072 0.0003 0.3086
8.0 2368 0.0003 0.312
9.0 2664 0.0003 0.3169
10.0 2960 0.0003 0.3317
11.0 3256 0.0004 0.3126
12.0 3552 0.0003 0.3003
13.0 3848 0.0003 0.3119
14.0 4144 0.0003 0.316
15.0 4440 0.0002 0.3183
16.0 4736 0.0003 0.313
17.0 5032 0.0003 0.3187
18.0 5328 0.0002 0.3295
19.0 5624 0.0002 0.3487
20.0 5920 0.0003 0.3458
21.0 6216 0.0002 0.331
22.0 6512 0.0002 0.3499
23.0 6808 0.0003 0.3296
24.0 7104 0.0003 0.3097
25.0 7400 0.0003 0.3197
0.0068 1 0.0003 -
1.0 148 0.0003 0.3219
2.0 296 0.0003 0.3185
3.0 444 0.0003 0.3114
4.0 592 0.0003 0.2989
5.0 740 0.0003 0.335
6.0 888 0.0004 0.3132
7.0 1036 0.0003 0.3264
8.0 1184 0.0004 0.3461
9.0 1332 0.0002 0.3185
10.0 1480 0.0002 0.3336
11.0 1628 0.0003 0.3282
12.0 1776 0.0003 0.3206
13.0 1924 0.0002 0.3303
14.0 2072 0.0002 0.3362
15.0 2220 0.0002 0.3382
16.0 2368 0.0002 0.3241
17.0 2516 0.0002 0.3303
18.0 2664 0.0002 0.3301
19.0 2812 0.0002 0.319
20.0 2960 0.0002 0.3304
21.0 3108 0.0002 0.3379
22.0 3256 0.0002 0.3424
23.0 3404 0.0002 0.3273
24.0 3552 0.0002 0.3213
25.0 3700 0.0002 0.3191
0.0068 1 0.0003 -
1.0 148 0.0003 0.3245
2.0 296 0.0002 0.3148
3.0 444 0.0002 0.3174
4.0 592 0.0003 0.3242
5.0 740 0.0003 0.3352
6.0 888 0.0003 0.3112
7.0 1036 0.0003 0.3204
8.0 1184 0.0003 0.3734
9.0 1332 0.0002 0.3383
10.0 1480 0.0003 0.3272
11.0 1628 0.0002 0.3106
12.0 1776 0.0003 0.3307
13.0 1924 0.0003 0.3359
14.0 2072 0.0002 0.3264
15.0 2220 0.0002 0.3254
16.0 2368 0.0002 0.3349
17.0 2516 0.0132 0.3399
18.0 2664 0.0002 0.343
19.0 2812 0.0002 0.3306
20.0 2960 0.0002 0.3472
21.0 3108 0.0002 0.3234
22.0 3256 0.002 0.3281
23.0 3404 0.0002 0.3289
24.0 3552 0.0002 0.2974
25.0 3700 0.0002 0.3153
26.0 3848 0.0002 0.3273
27.0 3996 0.0002 0.313
28.0 4144 0.0002 0.3303
29.0 4292 0.0002 0.3106
30.0 4440 0.0002 0.3155
31.0 4588 0.0002 0.3553
32.0 4736 0.0002 0.3039
33.0 4884 0.0001 0.3133
34.0 5032 0.0002 0.3323
35.0 5180 0.0002 0.3264
36.0 5328 0.0002 0.3133
37.0 5476 0.0002 0.3308
38.0 5624 0.0002 0.3137
39.0 5772 0.0002 0.3062
40.0 5920 0.0002 0.3438
41.0 6068 0.0002 0.3426
42.0 6216 0.0002 0.326
43.0 6364 0.0002 0.322
44.0 6512 0.0002 0.3202
45.0 6660 0.0002 0.3253
46.0 6808 0.0002 0.3272
47.0 6956 0.0002 0.3258
48.0 7104 0.0002 0.3252
49.0 7252 0.0002 0.3233
50.0 7400 0.0002 0.3234
0.0135 1 0.0002 -
1.0 74 0.0002 -
0.0068 1 0.0002 -
1.0 148 0.0002 0.3036
2.0 296 0.0002 0.3555
3.0 444 0.0002 0.3331
4.0 592 0.0002 0.3086
5.0 740 0.0002 0.3036
6.0 888 0.0002 0.3217
7.0 1036 0.0002 0.3416
8.0 1184 0.0002 0.3309
9.0 1332 0.0002 0.3424
10.0 1480 0.0003 0.3655
11.0 1628 0.0002 0.3042
12.0 1776 0.0019 0.326
13.0 1924 0.0002 0.3161
14.0 2072 0.0002 0.3286
15.0 2220 0.0002 0.3563
16.0 2368 0.0002 0.326
17.0 2516 0.0002 0.3114
18.0 2664 0.0002 0.3366
19.0 2812 0.0002 0.329
20.0 2960 0.0002 0.3217
21.0 3108 0.0002 0.325
22.0 3256 0.0002 0.3243
23.0 3404 0.0002 0.3341
24.0 3552 0.0002 0.3237
25.0 3700 0.0002 0.3433
26.0 3848 0.0002 0.3196
27.0 3996 0.0001 0.3372
28.0 4144 0.0001 0.3191
29.0 4292 0.0001 0.328
30.0 4440 0.0002 0.3416
31.0 4588 0.0002 0.3132
32.0 4736 0.0002 0.3429
33.0 4884 0.0002 0.336
34.0 5032 0.0002 0.3507
35.0 5180 0.0001 0.3483
36.0 5328 0.0002 0.3325
37.0 5476 0.0001 0.3406
38.0 5624 0.0003 0.3538
39.0 5772 0.0002 0.3422
40.0 5920 0.0002 0.3359
41.0 6068 0.0002 0.3252
42.0 6216 0.0002 0.326
43.0 6364 0.0002 0.3613
44.0 6512 0.0001 0.332
45.0 6660 0.0002 0.3295
46.0 6808 0.0002 0.3265
47.0 6956 0.0002 0.2982
48.0 7104 0.0002 0.3017
49.0 7252 0.0001 0.309
50.0 7400 0.0001 0.3199
51.0 7548 0.0001 0.325
52.0 7696 0.0002 0.3222
53.0 7844 0.0001 0.3189
54.0 7992 0.0001 0.3329
55.0 8140 0.0001 0.3272
56.0 8288 0.0001 0.3292
57.0 8436 0.0001 0.3283
58.0 8584 0.0001 0.3301
59.0 8732 0.0001 0.3334
60.0 8880 0.0001 0.3144
61.0 9028 0.0002 0.3487
62.0 9176 0.0002 0.3602
63.0 9324 0.0001 0.3056
64.0 9472 0.0001 0.3415
65.0 9620 0.0002 0.3299
66.0 9768 0.0001 0.3254
67.0 9916 0.0001 0.3396
68.0 10064 0.0001 0.3501
69.0 10212 0.0001 0.3275
70.0 10360 0.0001 0.34
71.0 10508 0.0001 0.3351
72.0 10656 0.0001 0.3367
73.0 10804 0.0001 0.3548
74.0 10952 0.0001 0.33
75.0 11100 0.0001 0.3259
76.0 11248 0.0002 0.3283
77.0 11396 0.0001 0.3214
78.0 11544 0.0001 0.324
79.0 11692 0.0001 0.3247
80.0 11840 0.0001 0.3347
81.0 11988 0.0001 0.3292
82.0 12136 0.0002 0.3568
83.0 12284 0.0001 0.324
84.0 12432 0.0001 0.3245
85.0 12580 0.0001 0.3368
86.0 12728 0.0001 0.3372
87.0 12876 0.0001 0.3432
88.0 13024 0.0001 0.3048
89.0 13172 0.0001 0.3395
90.0 13320 0.0001 0.3204
91.0 13468 0.0001 0.3122
92.0 13616 0.0001 0.3372
93.0 13764 0.0001 0.3306
94.0 13912 0.0001 0.3362
95.0 14060 0.0001 0.3386
96.0 14208 0.0001 0.3198
97.0 14356 0.0001 0.3176
98.0 14504 0.0001 0.3604
99.0 14652 0.0001 0.3507
100.0 14800 0.0001 0.3272
0.0023 1 0.0001 -
1.0 444 0.0002 0.3295
2.0 888 0.0001 0.3144
3.0 1332 0.0001 0.3213
4.0 1776 0.0001 0.3362
5.0 2220 0.0001 0.3398
6.0 2664 0.0001 0.3385
7.0 3108 0.0002 0.3406
8.0 3552 0.0001 0.3253
9.0 3996 0.0001 0.3253
10.0 4440 0.0001 0.3119
11.0 4884 0.0001 0.3204
12.0 5328 0.0001 0.3387
13.0 5772 0.0001 0.3387
14.0 6216 0.0001 0.3584
15.0 6660 0.0001 0.3548
16.0 7104 0.0001 0.3314
17.0 7548 0.0001 0.3335
18.0 7992 0.0001 0.3325
19.0 8436 0.0001 0.3545
20.0 8880 0.0001 0.3456
21.0 9324 0.0001 0.3532
22.0 9768 0.0001 0.3524
23.0 10212 0.0001 0.352
24.0 10656 0.0001 0.3502
25.0 11100 0.0 0.3275
0.0034 1 0.0001 -
1.0 296 0.0001 0.3209
2.0 592 0.0001 0.3265
3.0 888 0.0001 0.3414
4.0 1184 0.0001 0.3314
5.0 1480 0.0002 0.3498
6.0 1776 0.0001 0.337
7.0 2072 0.0001 0.3347
8.0 2368 0.0001 0.3494
9.0 2664 0.0001 0.3326
10.0 2960 0.0001 0.3259
11.0 3256 0.0002 0.3443
12.0 3552 0.0001 0.3431
13.0 3848 0.0001 0.324
14.0 4144 0.0001 0.3339
15.0 4440 0.0001 0.3255
16.0 4736 0.0001 0.3379
17.0 5032 0.0001 0.3285
18.0 5328 0.0001 0.3362
19.0 5624 0.0001 0.3319
20.0 5920 0.0001 0.3456
21.0 6216 0.0001 0.329
22.0 6512 0.0001 0.3386
23.0 6808 0.0001 0.3278
24.0 7104 0.0001 0.3078
25.0 7400 0.0001 0.3155
0.0068 1 0.0001 -
1.0 148 0.0001 0.3225
2.0 296 0.0001 0.3526
3.0 444 0.0001 0.3265
4.0 592 0.0001 0.3206
5.0 740 0.0001 0.3126
6.0 888 0.0001 0.3306
7.0 1036 0.0001 0.3189
8.0 1184 0.0001 0.3246
9.0 1332 0.0001 0.3346
10.0 1480 0.0001 0.3528
11.0 1628 0.0001 0.3204
12.0 1776 0.0001 0.34
13.0 1924 0.0001 0.3291
14.0 2072 0.0001 0.3444
15.0 2220 0.0001 0.339
16.0 2368 0.0001 0.3533
17.0 2516 0.0001 0.3288
18.0 2664 0.0001 0.3475
19.0 2812 0.0001 0.3464
20.0 2960 0.0001 0.3351
21.0 3108 0.0001 0.3421
22.0 3256 0.0001 0.3351
23.0 3404 0.0001 0.3416
24.0 3552 0.0001 0.3414
25.0 3700 0.0001 0.3433
26.0 3848 0.0001 0.3339
27.0 3996 0.0001 0.35
28.0 4144 0.0001 0.3215
29.0 4292 0.0001 0.3278
30.0 4440 0.0001 0.3508
31.0 4588 0.0001 0.3356
32.0 4736 0.0001 0.3617
33.0 4884 0.0001 0.3368
34.0 5032 0.0001 0.3551
35.0 5180 0.0001 0.3582
36.0 5328 0.0001 0.333
37.0 5476 0.0 0.3461
38.0 5624 0.0001 0.3515
39.0 5772 0.0001 0.3601
40.0 5920 0.0001 0.347
41.0 6068 0.0001 0.3444
42.0 6216 0.0 0.3609
43.0 6364 0.0 0.3432
44.0 6512 0.0 0.3526
45.0 6660 0.0 0.3382
46.0 6808 0.0 0.353
47.0 6956 0.0001 0.3374
48.0 7104 0.0001 0.327
49.0 7252 0.0001 0.3202
50.0 7400 0.0 0.3386
51.0 7548 0.0001 0.3501
52.0 7696 0.0002 0.3341
53.0 7844 0.0001 0.3024
54.0 7992 0.0001 0.3456
55.0 8140 0.0 0.3323
56.0 8288 0.0 0.3259
57.0 8436 0.0 0.3246
58.0 8584 0.0 0.3341
59.0 8732 0.0 0.3347
60.0 8880 0.0 0.322
61.0 9028 0.0001 0.3323
62.0 9176 0.0 0.3471
63.0 9324 0.0001 0.2913
64.0 9472 0.0 0.3144
65.0 9620 0.0001 0.3184
66.0 9768 0.0 0.3251
67.0 9916 0.0001 0.3342
68.0 10064 0.0 0.3486
69.0 10212 0.0 0.3381
70.0 10360 0.0 0.3161
71.0 10508 0.0 0.3036
72.0 10656 0.0 0.3141
73.0 10804 0.0 0.3307
74.0 10952 0.0 0.3153
75.0 11100 0.0 0.3016
76.0 11248 0.0001 0.3321
77.0 11396 0.0001 0.3194
78.0 11544 0.0001 0.3496
79.0 11692 0.0 0.3218
80.0 11840 0.0 0.3251
81.0 11988 0.0 0.3468
82.0 12136 0.0 0.3803
83.0 12284 0.0 0.3354
84.0 12432 0.0 0.351
85.0 12580 0.0 0.3231
86.0 12728 0.0 0.3027
87.0 12876 0.0 0.3309
88.0 13024 0.0 0.3194
89.0 13172 0.0 0.3611
90.0 13320 0.0 0.3288
91.0 13468 0.0 0.3261
92.0 13616 0.0 0.3268
93.0 13764 0.0 0.3433
94.0 13912 0.0 0.3438
95.0 14060 0.0 0.3288
96.0 14208 0.0 0.3263
97.0 14356 0.0 0.3331
98.0 14504 0.0 0.3334
99.0 14652 0.0 0.319
100.0 14800 0.0 0.3033
101.0 14948 0.0001 0.3051
102.0 15096 0.0 0.3321
103.0 15244 0.0 0.3181
104.0 15392 0.0 0.2943
105.0 15540 0.0 0.3137
106.0 15688 0.0 0.3111
107.0 15836 0.0 0.2968
108.0 15984 0.0 0.3072
109.0 16132 0.0 0.3154
110.0 16280 0.0001 0.3211
111.0 16428 0.0 0.2974
112.0 16576 0.0 0.3057
113.0 16724 0.0 0.296
114.0 16872 0.0 0.3104
115.0 17020 0.0 0.3029
116.0 17168 0.0 0.329
117.0 17316 0.0 0.3275
118.0 17464 0.0 0.3343
119.0 17612 0.0 0.3168
120.0 17760 0.0 0.3208
121.0 17908 0.0 0.2973
122.0 18056 0.0 0.3121
123.0 18204 0.0 0.3049
124.0 18352 0.0 0.3079
125.0 18500 0.0 0.2994
126.0 18648 0.0 0.3189
127.0 18796 0.0 0.3255
128.0 18944 0.0 0.3111
129.0 19092 0.0 0.3182
130.0 19240 0.0 0.356
131.0 19388 0.0 0.3299
132.0 19536 0.0 0.3308
133.0 19684 0.0 0.3379
134.0 19832 0.0 0.3233
135.0 19980 0.0 0.327
136.0 20128 0.0 0.318
137.0 20276 0.0 0.2937
138.0 20424 0.0 0.3039
139.0 20572 0.0 0.3367
140.0 20720 0.0 0.3185
141.0 20868 0.0 0.3441
142.0 21016 0.0 0.3055
143.0 21164 0.0 0.3202
144.0 21312 0.0 0.3144
145.0 21460 0.0 0.3304
146.0 21608 0.0 0.3165
147.0 21756 0.0 0.309
148.0 21904 0.0 0.3086
149.0 22052 0.0 0.2987
150.0 22200 0.0 0.3198
151.0 22348 0.0 0.3372
152.0 22496 0.0 0.3156
153.0 22644 0.0 0.3206
154.0 22792 0.0 0.322
155.0 22940 0.0 0.3445
156.0 23088 0.0 0.3183
157.0 23236 0.0 0.3203
158.0 23384 0.0 0.3337
159.0 23532 0.0 0.3245
160.0 23680 0.0 0.3068
161.0 23828 0.0 0.3199
162.0 23976 0.0 0.3308
163.0 24124 0.0 0.3446
164.0 24272 0.0 0.341
165.0 24420 0.0 0.3155
166.0 24568 0.0 0.3306
167.0 24716 0.0 0.3422
168.0 24864 0.0 0.336
169.0 25012 0.0 0.3271
170.0 25160 0.0 0.3062
171.0 25308 0.0 0.305
172.0 25456 0.0 0.3047
173.0 25604 0.0 0.3281
174.0 25752 0.0 0.3059
175.0 25900 0.0 0.2993
176.0 26048 0.0 0.3206
177.0 26196 0.0 0.3274
178.0 26344 0.0 0.3249
179.0 26492 0.0 0.3049
180.0 26640 0.0 0.3131
181.0 26788 0.0 0.3119
182.0 26936 0.0 0.3457
183.0 27084 0.0 0.3242
184.0 27232 0.0 0.3006
185.0 27380 0.0 0.3054
186.0 27528 0.0 0.3135
187.0 27676 0.0 0.3102
188.0 27824 0.0 0.3394
189.0 27972 0.0 0.3256
190.0 28120 0.0 0.2973
191.0 28268 0.0 0.3124
192.0 28416 0.0 0.321
193.0 28564 0.0 0.3332
194.0 28712 0.0 0.3136
195.0 28860 0.0 0.32
196.0 29008 0.0 0.3486
197.0 29156 0.0 0.3259
198.0 29304 0.0 0.3134
199.0 29452 0.0 0.3437
200.0 29600 0.0 0.3029
  • With load_best_model_at_end set, the checkpoint with the best validation loss was the one saved (shown in bold in the rendered card).

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 2.3.1
  • Transformers: 4.37.2
  • PyTorch: 2.1.0+cu121
  • Datasets: 2.17.1
  • Tokenizers: 0.15.2
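
To approximate this environment, the key packages can be pinned (a suggested command, not an official lockfile):

pip install setfit==1.0.3 sentence-transformers==2.3.1 transformers==4.37.2 datasets==2.17.1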

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}