SetFit with firqaaa/indo-sentence-bert-base

This is a SetFit model that can be used for Text Classification. This SetFit model uses firqaaa/indo-sentence-bert-base as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: firqaaa/indo-sentence-bert-base
  • Classification head: a LogisticRegression instance
  • Number of Classes: 6 (kesedihan, sukacita, cinta, amarah, takut, kejutan)
  • Model size: ~124M parameters (F32, Safetensors)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Labels and example sentences:
kesedihan (sadness)
  • 'Saya merasa agak kecewa, saya rasa harus menyerahkan sesuatu yang tidak menarik hanya untuk memenuhi tenggat waktu'
  • 'Aku merasa seperti aku telah cukup lalai terhadap blogku dan aku hanya mengatakan bahwa kita di sini hidup dan bahagia'
  • 'Aku tahu dan aku selalu terkoyak karenanya karena aku merasa tidak berdaya dan tidak berguna'
sukacita (joy)
  • 'aku mungkin tidak merasa begitu keren'
  • 'saya merasa baik-baik saja'
  • 'saya merasa seperti saya seorang ibu dengan mengorbankan produktivitas'
cinta (love)
  • 'aku merasa mencintaimu'
  • 'aku akan merasa sangat nostalgia di usia yang begitu muda'
  • 'Saya merasa diberkati bahwa saya tinggal di Amerika memiliki keluarga yang luar biasa dan Dorothy Kelsey adalah bagian dari hidup saya'
amarah (anger)
  • 'Aku terlalu memikirkan cara dudukku, suaraku terdengar jika ada makanan di mulutku, dan perasaan bahwa aku harus berjalan ke semua orang agar tidak bersikap kasar'
  • 'aku merasa memberontak sedikit kesal gila terkurung'
  • 'Aku merasakan perasaan itu muncul kembali dari perasaan paranoid dan cemburu yang penuh kebencian yang selalu menyiksaku tanpa henti'
takut (fear)
  • 'aku merasa seperti diserang oleh landak titanium'
  • 'Aku membiarkan diriku memikirkan perilakuku terhadapmu saat kita masih kecil. Aku merasakan campuran aneh antara rasa bersalah dan kekaguman atas ketangguhanmu'
  • 'saya marah karena majikan saya tidak berinvestasi pada kami sama sekali, gaji pelatihan, kenaikan hari libur bank dan rasanya seperti ketidakadilan sehingga saya merasa tidak berdaya'
kejutan (surprise)
  • 'Aku membaca bagian ol feefyefo Aku merasa takjub melihat betapa aku bisa mengoceh dan betapa transparannya aku dalam hidupku'
  • 'saya menemukan seni di sisi lain saya merasa sangat terkesan dengan karya saya'
  • 'aku merasa penasaran, bersemangat dan tidak sabar'

Evaluation

Metrics

  • Accuracy (all labels): 0.718
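
As a rough illustration only, an accuracy figure like this can be recomputed with scikit-learn, assuming a held-out 🤗 Dataset named eval_dataset (hypothetical) with "text" and "label" columns holding the label strings:

from setfit import SetFitModel
from sklearn.metrics import accuracy_score

# Load the fine-tuned SetFit model from the Hub
model = SetFitModel.from_pretrained("firqaaa/indo-setfit-bert-base-p3")

# Predict a label for every evaluation text, then compare with the references
preds = model.predict(eval_dataset["text"])
print(accuracy_score(eval_dataset["label"], preds))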

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("firqaaa/indo-setfit-bert-base-p3")
# Run inference
preds = model("Aku melihat ke dalam dompetku dan aku merasakan hawa dingin")
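
Inputs can also be passed as a batch, and since the head is a LogisticRegression classifier, class probabilities are available as well. A minimal sketch, reusing sentences from the label table above:

# Batch inference: one predicted label per input sentence
preds = model([
    "saya merasa baik-baik saja",
    "aku merasa penasaran, bersemangat dan tidak sabar",
])
print(preds)

# Class probabilities from the LogisticRegression head
probs = model.predict_proba(["Aku melihat ke dalam dompetku dan aku merasakan hawa dingin"])
print(probs)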

Training Details

Training Set Metrics

Training set statistics:

  • Word count — min: 2, median: 16.7928, max: 56

Training samples per label:

  • kesedihan: 300
  • sukacita: 300
  • cinta: 300
  • amarah: 300
  • takut: 300
  • kejutan: 300
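
For reference, the word-count statistics above can be recomputed from the training split, assuming a 🤗 Dataset named train_dataset (hypothetical) with a "text" column:

import numpy as np

# Whitespace-tokenized word count per training sentence
word_counts = [len(text.split()) for text in train_dataset["text"]]
print(min(word_counts), float(np.median(word_counts)), max(word_counts))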

Training Hyperparameters

  • batch_size: (128, 128)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: True
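
These values correspond to SetFit's TrainingArguments. A minimal training sketch under that assumption, showing only a subset of the listed hyperparameters and using hypothetical train_dataset/eval_dataset splits with "text" and "label" columns:

from setfit import SetFitModel, Trainer, TrainingArguments

# Start from the base Indonesian Sentence Transformer;
# SetFit attaches a LogisticRegression head by default
model = SetFitModel.from_pretrained("firqaaa/indo-sentence-bert-base")

args = TrainingArguments(
    batch_size=(128, 128),
    num_epochs=(1, 1),
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    seed=42,
    # CosineSimilarityLoss is SetFit's default loss, matching the setting above
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
)
trainer.train()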

Training Results

Epoch Step Training Loss Validation Loss
0.0000 1 0.2927 -
0.0024 50 0.2605 -
0.0047 100 0.2591 -
0.0071 150 0.2638 -
0.0095 200 0.245 -
0.0119 250 0.226 -
0.0142 300 0.222 -
0.0166 350 0.1968 -
0.0190 400 0.1703 -
0.0213 450 0.1703 -
0.0237 500 0.1587 -
0.0261 550 0.1087 -
0.0284 600 0.1203 -
0.0308 650 0.0844 -
0.0332 700 0.0696 -
0.0356 750 0.0606 -
0.0379 800 0.0333 -
0.0403 850 0.0453 -
0.0427 900 0.033 -
0.0450 950 0.0142 -
0.0474 1000 0.004 -
0.0498 1050 0.0097 -
0.0521 1100 0.0065 -
0.0545 1150 0.0081 -
0.0569 1200 0.0041 -
0.0593 1250 0.0044 -
0.0616 1300 0.0013 -
0.0640 1350 0.0024 -
0.0664 1400 0.001 -
0.0687 1450 0.0012 -
0.0711 1500 0.0013 -
0.0735 1550 0.0006 -
0.0759 1600 0.0033 -
0.0782 1650 0.0006 -
0.0806 1700 0.0013 -
0.0830 1750 0.0008 -
0.0853 1800 0.0006 -
0.0877 1850 0.0008 -
0.0901 1900 0.0004 -
0.0924 1950 0.0005 -
0.0948 2000 0.0004 -
0.0972 2050 0.0002 -
0.0996 2100 0.0002 -
0.1019 2150 0.0003 -
0.1043 2200 0.0006 -
0.1067 2250 0.0005 -
0.1090 2300 0.0003 -
0.1114 2350 0.0018 -
0.1138 2400 0.0003 -
0.1161 2450 0.0002 -
0.1185 2500 0.0018 -
0.1209 2550 0.0003 -
0.1233 2600 0.0008 -
0.1256 2650 0.0002 -
0.1280 2700 0.0007 -
0.1304 2750 0.006 -
0.1327 2800 0.0002 -
0.1351 2850 0.0001 -
0.1375 2900 0.0001 -
0.1399 2950 0.0001 -
0.1422 3000 0.0001 -
0.1446 3050 0.0001 -
0.1470 3100 0.0001 -
0.1493 3150 0.0001 -
0.1517 3200 0.0002 -
0.1541 3250 0.0003 -
0.1564 3300 0.0004 -
0.1588 3350 0.0001 -
0.1612 3400 0.0001 -
0.1636 3450 0.0014 -
0.1659 3500 0.0005 -
0.1683 3550 0.0003 -
0.1707 3600 0.0001 -
0.1730 3650 0.0001 -
0.1754 3700 0.0001 -
0.1778 3750 0.0001 -
0.1801 3800 0.0001 -
0.1825 3850 0.0001 -
0.1849 3900 0.0001 -
0.1873 3950 0.0001 -
0.1896 4000 0.0001 -
0.1920 4050 0.0001 -
0.1944 4100 0.0003 -
0.1967 4150 0.0006 -
0.1991 4200 0.0001 -
0.2015 4250 0.0 -
0.2038 4300 0.0 -
0.2062 4350 0.0001 -
0.2086 4400 0.0 -
0.2110 4450 0.0 -
0.2133 4500 0.0001 -
0.2157 4550 0.0002 -
0.2181 4600 0.0003 -
0.2204 4650 0.0018 -
0.2228 4700 0.0003 -
0.2252 4750 0.0145 -
0.2276 4800 0.0001 -
0.2299 4850 0.0006 -
0.2323 4900 0.0001 -
0.2347 4950 0.0007 -
0.2370 5000 0.0001 -
0.2394 5050 0.0 -
0.2418 5100 0.0 -
0.2441 5150 0.0001 -
0.2465 5200 0.0003 -
0.2489 5250 0.0 -
0.2513 5300 0.0 -
0.2536 5350 0.0 -
0.2560 5400 0.0 -
0.2584 5450 0.0004 -
0.2607 5500 0.0 -
0.2631 5550 0.0 -
0.2655 5600 0.0 -
0.2678 5650 0.0 -
0.2702 5700 0.0 -
0.2726 5750 0.0002 -
0.2750 5800 0.0 -
0.2773 5850 0.0 -
0.2797 5900 0.0 -
0.2821 5950 0.0 -
0.2844 6000 0.0 -
0.2868 6050 0.0 -
0.2892 6100 0.0 -
0.2916 6150 0.0 -
0.2939 6200 0.0 -
0.2963 6250 0.0 -
0.2987 6300 0.0001 -
0.3010 6350 0.0003 -
0.3034 6400 0.0048 -
0.3058 6450 0.0 -
0.3081 6500 0.0 -
0.3105 6550 0.0 -
0.3129 6600 0.0 -
0.3153 6650 0.0 -
0.3176 6700 0.0 -
0.3200 6750 0.0 -
0.3224 6800 0.0 -
0.3247 6850 0.0 -
0.3271 6900 0.0 -
0.3295 6950 0.0 -
0.3318 7000 0.0 -
0.3342 7050 0.0 -
0.3366 7100 0.0 -
0.3390 7150 0.0011 -
0.3413 7200 0.0002 -
0.3437 7250 0.0 -
0.3461 7300 0.0 -
0.3484 7350 0.0001 -
0.3508 7400 0.0001 -
0.3532 7450 0.0002 -
0.3556 7500 0.0 -
0.3579 7550 0.0 -
0.3603 7600 0.0 -
0.3627 7650 0.0 -
0.3650 7700 0.0 -
0.3674 7750 0.0 -
0.3698 7800 0.0001 -
0.3721 7850 0.0 -
0.3745 7900 0.0 -
0.3769 7950 0.0 -
0.3793 8000 0.0 -
0.3816 8050 0.0 -
0.3840 8100 0.0 -
0.3864 8150 0.0 -
0.3887 8200 0.0 -
0.3911 8250 0.0 -
0.3935 8300 0.0 -
0.3958 8350 0.0 -
0.3982 8400 0.0 -
0.4006 8450 0.0 -
0.4030 8500 0.0 -
0.4053 8550 0.0001 -
0.4077 8600 0.0001 -
0.4101 8650 0.0008 -
0.4124 8700 0.0001 -
0.4148 8750 0.0 -
0.4172 8800 0.0 -
0.4196 8850 0.0001 -
0.4219 8900 0.0 -
0.4243 8950 0.0 -
0.4267 9000 0.0 -
0.4290 9050 0.0 -
0.4314 9100 0.0 -
0.4338 9150 0.0 -
0.4361 9200 0.0 -
0.4385 9250 0.0 -
0.4409 9300 0.0 -
0.4433 9350 0.0 -
0.4456 9400 0.0 -
0.4480 9450 0.0 -
0.4504 9500 0.0 -
0.4527 9550 0.0 -
0.4551 9600 0.0 -
0.4575 9650 0.0 -
0.4598 9700 0.0 -
0.4622 9750 0.0001 -
0.4646 9800 0.0 -
0.4670 9850 0.0 -
0.4693 9900 0.0 -
0.4717 9950 0.0 -
0.4741 10000 0.0 -
0.4764 10050 0.0 -
0.4788 10100 0.0006 -
0.4812 10150 0.0 -
0.4835 10200 0.0 -
0.4859 10250 0.0 -
0.4883 10300 0.0 -
0.4907 10350 0.0 -
0.4930 10400 0.0 -
0.4954 10450 0.0 -
0.4978 10500 0.0 -
0.5001 10550 0.0 -
0.5025 10600 0.0 -
0.5049 10650 0.0 -
0.5073 10700 0.0 -
0.5096 10750 0.0 -
0.5120 10800 0.0 -
0.5144 10850 0.0 -
0.5167 10900 0.0 -
0.5191 10950 0.0 -
0.5215 11000 0.0 -
0.5238 11050 0.0 -
0.5262 11100 0.0 -
0.5286 11150 0.0 -
0.5310 11200 0.0 -
0.5333 11250 0.0 -
0.5357 11300 0.0 -
0.5381 11350 0.0 -
0.5404 11400 0.0 -
0.5428 11450 0.0 -
0.5452 11500 0.0 -
0.5475 11550 0.0 -
0.5499 11600 0.0 -
0.5523 11650 0.0001 -
0.5547 11700 0.0 -
0.5570 11750 0.0043 -
0.5594 11800 0.0 -
0.5618 11850 0.0 -
0.5641 11900 0.0 -
0.5665 11950 0.0 -
0.5689 12000 0.0 -
0.5713 12050 0.0 -
0.5736 12100 0.0 -
0.5760 12150 0.0 -
0.5784 12200 0.0 -
0.5807 12250 0.0029 -
0.5831 12300 0.0 -
0.5855 12350 0.0 -
0.5878 12400 0.0 -
0.5902 12450 0.0 -
0.5926 12500 0.0 -
0.5950 12550 0.0 -
0.5973 12600 0.0 -
0.5997 12650 0.0 -
0.6021 12700 0.0 -
0.6044 12750 0.0 -
0.6068 12800 0.0 -
0.6092 12850 0.0 -
0.6115 12900 0.0 -
0.6139 12950 0.0 -
0.6163 13000 0.0 -
0.6187 13050 0.0 -
0.6210 13100 0.0 -
0.6234 13150 0.0001 -
0.6258 13200 0.0 -
0.6281 13250 0.0 -
0.6305 13300 0.0 -
0.6329 13350 0.0 -
0.6353 13400 0.0001 -
0.6376 13450 0.0 -
0.6400 13500 0.0 -
0.6424 13550 0.0 -
0.6447 13600 0.0 -
0.6471 13650 0.0 -
0.6495 13700 0.0 -
0.6518 13750 0.0 -
0.6542 13800 0.0 -
0.6566 13850 0.0 -
0.6590 13900 0.0 -
0.6613 13950 0.0 -
0.6637 14000 0.0 -
0.6661 14050 0.0 -
0.6684 14100 0.0 -
0.6708 14150 0.0 -
0.6732 14200 0.0 -
0.6755 14250 0.0 -
0.6779 14300 0.0 -
0.6803 14350 0.0 -
0.6827 14400 0.0 -
0.6850 14450 0.0 -
0.6874 14500 0.0 -
0.6898 14550 0.0 -
0.6921 14600 0.0 -
0.6945 14650 0.0 -
0.6969 14700 0.0 -
0.6993 14750 0.0 -
0.7016 14800 0.0 -
0.7040 14850 0.0 -
0.7064 14900 0.0 -
0.7087 14950 0.0 -
0.7111 15000 0.0 -
0.7135 15050 0.0 -
0.7158 15100 0.0 -
0.7182 15150 0.0 -
0.7206 15200 0.0 -
0.7230 15250 0.0 -
0.7253 15300 0.0 -
0.7277 15350 0.0 -
0.7301 15400 0.0 -
0.7324 15450 0.0 -
0.7348 15500 0.0 -
0.7372 15550 0.0 -
0.7395 15600 0.0 -
0.7419 15650 0.0 -
0.7443 15700 0.0 -
0.7467 15750 0.0 -
0.7490 15800 0.0 -
0.7514 15850 0.0 -
0.7538 15900 0.0 -
0.7561 15950 0.0 -
0.7585 16000 0.0 -
0.7609 16050 0.0 -
0.7633 16100 0.0 -
0.7656 16150 0.0 -
0.7680 16200 0.0 -
0.7704 16250 0.0 -
0.7727 16300 0.0 -
0.7751 16350 0.0 -
0.7775 16400 0.0 -
0.7798 16450 0.0 -
0.7822 16500 0.0 -
0.7846 16550 0.0 -
0.7870 16600 0.0 -
0.7893 16650 0.0 -
0.7917 16700 0.0 -
0.7941 16750 0.0 -
0.7964 16800 0.0 -
0.7988 16850 0.0 -
0.8012 16900 0.0 -
0.8035 16950 0.0 -
0.8059 17000 0.0 -
0.8083 17050 0.0 -
0.8107 17100 0.0 -
0.8130 17150 0.0 -
0.8154 17200 0.0 -
0.8178 17250 0.0 -
0.8201 17300 0.0 -
0.8225 17350 0.0 -
0.8249 17400 0.0 -
0.8272 17450 0.0 -
0.8296 17500 0.0 -
0.8320 17550 0.0 -
0.8344 17600 0.0 -
0.8367 17650 0.0 -
0.8391 17700 0.0 -
0.8415 17750 0.0 -
0.8438 17800 0.0 -
0.8462 17850 0.0 -
0.8486 17900 0.0 -
0.8510 17950 0.0 -
0.8533 18000 0.0 -
0.8557 18050 0.0 -
0.8581 18100 0.0 -
0.8604 18150 0.0 -
0.8628 18200 0.0 -
0.8652 18250 0.0 -
0.8675 18300 0.0 -
0.8699 18350 0.0 -
0.8723 18400 0.0 -
0.8747 18450 0.0 -
0.8770 18500 0.0 -
0.8794 18550 0.0 -
0.8818 18600 0.0 -
0.8841 18650 0.0 -
0.8865 18700 0.0 -
0.8889 18750 0.0 -
0.8912 18800 0.0 -
0.8936 18850 0.0 -
0.8960 18900 0.0 -
0.8984 18950 0.0 -
0.9007 19000 0.0 -
0.9031 19050 0.0 -
0.9055 19100 0.0 -
0.9078 19150 0.0 -
0.9102 19200 0.0 -
0.9126 19250 0.0 -
0.9150 19300 0.0 -
0.9173 19350 0.0 -
0.9197 19400 0.0 -
0.9221 19450 0.0 -
0.9244 19500 0.0 -
0.9268 19550 0.0 -
0.9292 19600 0.0 -
0.9315 19650 0.0 -
0.9339 19700 0.0 -
0.9363 19750 0.0 -
0.9387 19800 0.0 -
0.9410 19850 0.0 -
0.9434 19900 0.0 -
0.9458 19950 0.0 -
0.9481 20000 0.0 -
0.9505 20050 0.0 -
0.9529 20100 0.0 -
0.9552 20150 0.0 -
0.9576 20200 0.0 -
0.9600 20250 0.0 -
0.9624 20300 0.0 -
0.9647 20350 0.0 -
0.9671 20400 0.0 -
0.9695 20450 0.0 -
0.9718 20500 0.0 -
0.9742 20550 0.0 -
0.9766 20600 0.0 -
0.9790 20650 0.0 -
0.9813 20700 0.0 -
0.9837 20750 0.0 -
0.9861 20800 0.0 -
0.9884 20850 0.0 -
0.9908 20900 0.0 -
0.9932 20950 0.0 -
0.9955 21000 0.0 -
0.9979 21050 0.0 -
1.0 21094 - 0.2251
  • The final row (epoch 1.0, step 21094) denotes the saved checkpoint.

Framework Versions

  • Python: 3.10.13
  • SetFit: 1.0.3
  • Sentence Transformers: 2.2.2
  • Transformers: 4.36.2
  • PyTorch: 2.1.2+cu121
  • Datasets: 2.16.1
  • Tokenizers: 0.15.0
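
To approximate this environment, the Python package versions above can be pinned at install time (Python 3.10 and a CUDA 12.1 build of PyTorch 2.1.2 are assumed to be installed separately):

pip install setfit==1.0.3 sentence-transformers==2.2.2 transformers==4.36.2 datasets==2.16.1 tokenizers==0.15.0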

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}