
SetFit with firqaaa/indo-sentence-bert-base for indonlu/smsa

Authors

Group 3:

  • Muhammad Guntur Arfianto (20/459272/PA/19933)
  • Putri Iqlima Miftahuddini (23/531392/NUGM/01467)
  • Alan Kurniawan (23/531301/NUGM/01382)

This is a SetFit model for text classification. It uses firqaaa/indo-sentence-bert-base as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.

The dataset used for fine-tuning this model is indonlu, specifically its SmSA (sentiment analysis) subset.
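
For context, the sketch below shows how a 16-shot training split of this kind can be drawn with SetFit's sample_dataset helper. It is an illustration rather than the authors' script; the indonlu/smsa config name and the use of the validation split for evaluation are assumptions.

from datasets import load_dataset
from setfit import sample_dataset

# Load the SmSA subset of IndoNLU (config name "smsa" assumed)
dataset = load_dataset("indonlu", "smsa")

# Sample 16 examples per class for few-shot fine-tuning,
# matching the "Training Set Metrics" table further below
train_dataset = sample_dataset(dataset["train"], label_column="label", num_samples=16)
eval_dataset = dataset["validation"]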

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: firqaaa/indo-sentence-bert-base
  • Classification head: a LogisticRegression instance
  • Number of Classes: 3
  • Training Dataset: indonlu (SmSA subset)
  • Language: Indonesian

Model Sources

  • Repository: SetFit on GitHub (https://github.com/huggingface/setfit)
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Model Labels

Label 2
  • 'hampir semua musala di stasiun jalur ke bogor kondisi nya juga terlalu sempit dan fasilitas wudhu yang kurang . bahkan sekelas stasiun besar bogor .'
  • 'tangkap saja pak si penyanyi gadungan itu . kerjaan nya cuma fitnah di media sosial saja .'
  • 'saya di cgv marvel city sby mau verifikasi sms redam , tapi di informasi telkomsel trobel , menyebalkan !'
Label 1
  • 'bapak berkumis lebat itu menyebrang menggunakan zebra cross'
  • 'kaitan kalung cantik bahan perak / silver 925'
  • 'duo red bull mendominasi latihan bebas pertama f1 gp singapura'
Label 0
  • 'jokowi sayang dan cinta kepada rakyat nya'
  • 'nyaman banget kalau lagi nongkrong kenyang di warung upnormal . mulai dari pilihan menu nya yang serius banget digarap , dari pelayan2 nya yang kece , sampai ke interior nya yang super . rekomendasi banget deh kalau mau mengerjakan tugas , arisan , ulang tahun , reunian di sini .'
  • 'rasanya lumayan . sambel nya juga enak . apalagi disajikan 3 macam model begitu . terus banyak pilihan sih sebenarnya mau makan apa di sini . mau gurame , mau kakap , bawal , kerang , cumi , udang . macem-macem deh . asal jangan pesan ikan kembung saja . tidak ada di sini .'

Evaluation

Metrics

Label Accuracy Precision Recall F1
all 0.7677 0.7677 0.7677 0.7677
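
All four scores are identical, which is what micro-averaged precision, recall, and F1 reduce to in single-label multi-class classification; the averaging choice is inferred here, not documented. As a sketch only, scores of this kind can be recomputed with scikit-learn (the "text" and "label" column names are assumed from the SmSA subset):

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def evaluate(model, dataset):
    # "text" and "label" column names are assumptions based on the SmSA subset
    preds = [int(p) for p in model.predict(dataset["text"])]
    labels = dataset["label"]
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="micro"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }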

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("TRUEnder/setfit-indosentencebert-indonlusmsa-16-shot")
# Run inference
preds = model("liverpool sukses di kandang tottenham")
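
predict returns integer class ids and predict_proba returns per-class probabilities. The id-to-sentiment mapping below is inferred from the examples in the Model Labels section (0 ≈ positive, 1 ≈ neutral, 2 ≈ negative) and is not stored in the model, so treat it as an assumption:

# Map class ids to sentiment names (mapping inferred from the examples above)
id2label = {0: "positive", 1: "neutral", 2: "negative"}

texts = [
    "liverpool sukses di kandang tottenham",
    "pelayanan nya lambat dan tidak ramah",   # made-up example input
]
preds = model.predict(texts)           # integer class ids
probs = model.predict_proba(texts)     # probability per class
for text, pred in zip(texts, preds):
    print(f"{text} -> {id2label[int(pred)]}")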

Training Details

Training Set Metrics

Label Training Sample Count
0 16
1 16
2 16

Training Hyperparameters

  • batch_size: (16, 2)
  • num_epochs: (6, 16)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: True
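
These values map one-to-one onto SetFit's TrainingArguments, where paired values are the (embedding phase, classifier phase) settings. Below is a minimal training sketch using them, reusing train_dataset and eval_dataset from the sampling example above; the exact training script is not published, so this is an assumption-laden reconstruction.

from setfit import SetFitModel, Trainer, TrainingArguments
from sentence_transformers.losses import CosineSimilarityLoss

# Start from the Indonesian Sentence-BERT body; a LogisticRegression head is added by default
model = SetFitModel.from_pretrained("firqaaa/indo-sentence-bert-base")

args = TrainingArguments(
    batch_size=(16, 2),
    num_epochs=(6, 16),
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    warmup_proportion=0.1,
    seed=42,
    load_best_model_at_end=True,
    # remaining listed hyperparameters (max_steps, distance_metric, margin, etc.)
    # match SetFit's defaults and are omitted here
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,   # from the sampling sketch above
    eval_dataset=eval_dataset,
    metric="accuracy",
)
trainer.train()
metrics = trainer.evaluate()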

Training Results (per epoch)

Epoch Step Training Loss Validation Loss
1.0 96 0.0009 0.1923
2.0 192 0.0002 0.1977
3.0 288 0.0002 0.2011
4.0 384 0.0002 0.203
5.0 480 0.0001 0.2042
6.0 576 0.0001 0.2046
  • The saved checkpoint is the row with the lowest validation loss (epoch 1.0, step 96), since load_best_model_at_end is enabled.

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.3.0+cu121
  • Datasets: 2.19.2
  • Tokenizers: 0.19.1
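
To approximate this environment, the listed versions can be pinned at install time (a convenience suggestion, not part of the original card; the CUDA-specific PyTorch build must be installed from the appropriate PyTorch index):

pip install "setfit==1.0.3" "sentence-transformers==3.0.1" "transformers==4.41.2" "datasets==2.19.2" "tokenizers==0.19.1"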

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}