Faeze committed on
Commit: e1f50b1
1 Parent(s): 6cb583b

Update README.md

Files changed (1)
  1. README.md +63 -653
README.md CHANGED
@@ -6,16 +6,25 @@ tags:
6
  - text-classification
7
  - generated_from_setfit_trainer
8
  metrics:
9
- - metric
 
10
  widget:
11
- - text: Damn, my condolences to you bro
12
- - text: No Friday Im booked all day
13
- - text: Im sorry.
14
- - text: Hiding in the bush
15
- - text: '*"The conservative party is a cult." Says the group that bans words and follows
16
- socialism.??*'
17
  pipeline_tag: text-classification
18
- inference: true
19
  base_model: sentence-transformers/paraphrase-mpnet-base-v2
20
  model-index:
21
  - name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
@@ -29,669 +38,70 @@ model-index:
29
  split: test
30
  metrics:
31
  - type: metric
32
- value: 0.6947118450822154
33
  name: Metric
 
 
 
34
  ---
35
 
36
- # SetFit with sentence-transformers/paraphrase-mpnet-base-v2
37
 
38
- This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
39
 
40
- The model has been trained using an efficient few-shot learning technique that involves:
41
 
42
- 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
43
- 2. Training a classification head with features from the fine-tuned Sentence Transformer.
44
-
45
- ## Model Details
46
-
47
- ### Model Description
48
- - **Model Type:** SetFit
49
- - **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
50
- - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
51
- - **Maximum Sequence Length:** 512 tokens
52
- - **Number of Classes:** 8 classes
53
- <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
54
- <!-- - **Language:** Unknown -->
55
- <!-- - **License:** Unknown -->
56
-
57
- ### Model Sources
58
-
59
- - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
60
- - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
61
- - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
62
-
63
- ### Model Labels
64
- | Label | Examples |
65
- |:------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
66
- | 1 | <ul><li>'@Josh Collins "Ben 0" lmao don\'t forget the facts, Ben has more wins than that'</li><li>'poop siht are the fake news'</li><li>'Thank god these fire chiefs are being heard. People have no idea that they have been trying to meet up with the Prime Minister even before this bushfire crisis trying to alert the public of the devastating impacts of climate change.'</li></ul> |
67
- | 3 | <ul><li>'Perfectly nailed by Ms.Zainab Sikander. Proud !'</li><li>"You're so sincere Dia about people's life."</li><li>'No words to express my gratitude to this hero.'</li></ul> |
68
- | 6 | <ul><li>'I accept that.'</li><li>'@Viji same here'</li><li>'Facing same problem'</li></ul> |
69
- | 5 | <ul><li>"@Rhynni Yeah thanks for asking, Your profile picture actually caught my eyes, Where are you from if you wouldn't mind me asking?"</li><li>'For what what did they do?'</li><li>'Aditya Jagtap who?'</li></ul> |
70
- | 2 | <ul><li>'Or the save the world were gonna die people .......... No !!! the police joined in'</li><li>'No, I don\'t think I am missing the point at all. When they say "40% of people are obese" that\'s based on BMI, which is an inherently flawed measure by almost any standards. When you say "obesity is estimated to cost whatever," there\'s a lots of conflation of correlation and causation in that calculation. Diseases often correlated with obesity are not always caused by obesity. Either way, my point still stands. Weight should not be considered independently from all other measures of health, it\'s important to consider all the factors.'</li><li>"This is a scam under the guise of socialist action. Climate change is caused mainly by geothermal activity, hence can't be stopped."</li></ul> |
71
- | 4 | <ul><li>'https://www.gov.uk/guidance/high-consequence-infectious-diseases-hcid#status-of-covid-19 Please somebody explain this to me. It makes absolutely no sense.'</li><li>"Look, you're obviously interested in this, so why don't you go an get a degree in climate science? Im sure the OU do one."</li><li>'All airports need to be stopped'</li></ul> |
72
- | 0 | <ul><li>'Oh ... Following the same drama.'</li><li>'1st'</li><li>'Breaking news: England just left the EU!'</li></ul> |
73
- | 7 | <ul><li>'Oh no, I did not mean it that way, it was completely misunderstood what I was saying. Didnt mean to offend you, sorry!'</li><li>'Sorry, really.'</li><li>"It's my fault, I shouldn't have done that, sorryyy!"</li></ul> |
74
-
75
- ## Evaluation
76
-
77
- ### Metrics
78
- | Label | Metric |
79
- |:--------|:-------|
80
- | **all** | 0.6947 |
81
-
82
- ## Uses
83
-
84
- ### Direct Use for Inference
85
-
86
- First install the SetFit library:
87
-
88
- ```bash
89
- pip install setfit
90
- ```
91
-
92
- Then you can load this model and run inference.
93
-
94
- ```python
95
- from setfit import SetFitModel
96
-
97
- # Download from the 🤗 Hub
98
- model = SetFitModel.from_pretrained("CrisisNarratives/setfit-8classes-single_label")
99
- # Run inference
100
- preds = model("Im sorry.")
101
- ```
102
-
103
- <!--
104
- ### Downstream Use
105
-
106
- *List how someone could finetune this model on their own dataset.*
107
- -->
108
-
109
- <!--
110
- ### Out-of-Scope Use
111
-
112
- *List how the model may foreseeably be misused and address what users ought not to do with the model.*
113
- -->
114
-
115
- <!--
116
- ## Bias, Risks and Limitations
117
-
118
- *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
119
- -->
120
-
121
- <!--
122
- ### Recommendations
123
 
124
- *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
125
- -->
126
 
127
- ## Training Details
128
 
129
- ### Training Set Metrics
130
- | Training set | Min | Median | Max |
131
- |:-------------|:----|:--------|:-----|
132
- | Word count | 1 | 25.3789 | 1681 |
133
 
134
- | Label | Training Sample Count |
135
- |:------|:----------------------|
136
- | 0 | 156 |
137
- | 1 | 145 |
138
- | 2 | 52 |
139
- | 3 | 46 |
140
- | 4 | 63 |
141
- | 5 | 35 |
142
- | 6 | 37 |
143
- | 7 | 7 |
144
 
145
- ### Training Hyperparameters
146
- - batch_size: (16, 16)
147
- - num_epochs: (3, 3)
148
- - max_steps: -1
149
- - sampling_strategy: oversampling
150
- - num_iterations: 40
151
- - body_learning_rate: (1.752e-05, 1.752e-05)
152
- - head_learning_rate: 1.752e-05
153
- - loss: CosineSimilarityLoss
154
- - distance_metric: cosine_distance
155
- - margin: 0.25
156
- - end_to_end: False
157
- - use_amp: False
158
- - warmup_proportion: 0.1
159
- - seed: 30
160
- - eval_max_steps: -1
161
- - load_best_model_at_end: False
162
 
163
- ### Training Results
164
- | Epoch | Step | Training Loss | Validation Loss |
165
- |:------:|:----:|:-------------:|:---------------:|
166
- | 0.0004 | 1 | 0.4094 | - |
167
- | 0.0185 | 50 | 0.3207 | - |
168
- | 0.0370 | 100 | 0.2635 | - |
169
- | 0.0555 | 150 | 0.2347 | - |
170
- | 0.0739 | 200 | 0.2686 | - |
171
- | 0.0924 | 250 | 0.2575 | - |
172
- | 0.1109 | 300 | 0.1983 | - |
173
- | 0.1294 | 350 | 0.2387 | - |
174
- | 0.1479 | 400 | 0.2002 | - |
175
- | 0.1664 | 450 | 0.2112 | - |
176
- | 0.1848 | 500 | 0.0913 | - |
177
- | 0.2033 | 550 | 0.1715 | - |
178
- | 0.2218 | 600 | 0.0686 | - |
179
- | 0.2403 | 650 | 0.0166 | - |
180
- | 0.2588 | 700 | 0.0128 | - |
181
- | 0.2773 | 750 | 0.0102 | - |
182
- | 0.2957 | 800 | 0.0071 | - |
183
- | 0.3142 | 850 | 0.0012 | - |
184
- | 0.3327 | 900 | 0.0016 | - |
185
- | 0.3512 | 950 | 0.0035 | - |
186
- | 0.3697 | 1000 | 0.0012 | - |
187
- | 0.3882 | 1050 | 0.0003 | - |
188
- | 0.4067 | 1100 | 0.001 | - |
189
- | 0.4251 | 1150 | 0.0025 | - |
190
- | 0.4436 | 1200 | 0.001 | - |
191
- | 0.4621 | 1250 | 0.0006 | - |
192
- | 0.4806 | 1300 | 0.0006 | - |
193
- | 0.4991 | 1350 | 0.0004 | - |
194
- | 0.5176 | 1400 | 0.0012 | - |
195
- | 0.5360 | 1450 | 0.0051 | - |
196
- | 0.5545 | 1500 | 0.0009 | - |
197
- | 0.5730 | 1550 | 0.0003 | - |
198
- | 0.5915 | 1600 | 0.0004 | - |
199
- | 0.6100 | 1650 | 0.0009 | - |
200
- | 0.6285 | 1700 | 0.0002 | - |
201
- | 0.6470 | 1750 | 0.0003 | - |
202
- | 0.6654 | 1800 | 0.0005 | - |
203
- | 0.6839 | 1850 | 0.0003 | - |
204
- | 0.7024 | 1900 | 0.0003 | - |
205
- | 0.7209 | 1950 | 0.0005 | - |
206
- | 0.7394 | 2000 | 0.0004 | - |
207
- | 0.7579 | 2050 | 0.0008 | - |
208
- | 0.7763 | 2100 | 0.0009 | - |
209
- | 0.7948 | 2150 | 0.0002 | - |
210
- | 0.8133 | 2200 | 0.0002 | - |
211
- | 0.8318 | 2250 | 0.0002 | - |
212
- | 0.8503 | 2300 | 0.0008 | - |
213
- | 0.8688 | 2350 | 0.0002 | - |
214
- | 0.8872 | 2400 | 0.0002 | - |
215
- | 0.9057 | 2450 | 0.0003 | - |
216
- | 0.9242 | 2500 | 0.0013 | - |
217
- | 0.9427 | 2550 | 0.0003 | - |
218
- | 0.9612 | 2600 | 0.0002 | - |
219
- | 0.9797 | 2650 | 0.0002 | - |
220
- | 0.9982 | 2700 | 0.0003 | - |
221
- | 1.0166 | 2750 | 0.0002 | - |
222
- | 1.0351 | 2800 | 0.0008 | - |
223
- | 1.0536 | 2850 | 0.0001 | - |
224
- | 1.0721 | 2900 | 0.0004 | - |
225
- | 1.0906 | 2950 | 0.0001 | - |
226
- | 1.1091 | 3000 | 0.0001 | - |
227
- | 1.1275 | 3050 | 0.0002 | - |
228
- | 1.1460 | 3100 | 0.0002 | - |
229
- | 1.1645 | 3150 | 0.0002 | - |
230
- | 1.1830 | 3200 | 0.0001 | - |
231
- | 1.2015 | 3250 | 0.0001 | - |
232
- | 1.2200 | 3300 | 0.0001 | - |
233
- | 1.2384 | 3350 | 0.0041 | - |
234
- | 1.2569 | 3400 | 0.0002 | - |
235
- | 1.2754 | 3450 | 0.0001 | - |
236
- | 1.2939 | 3500 | 0.0001 | - |
237
- | 1.3124 | 3550 | 0.0002 | - |
238
- | 1.3309 | 3600 | 0.0 | - |
239
- | 1.3494 | 3650 | 0.0001 | - |
240
- | 1.3678 | 3700 | 0.0001 | - |
241
- | 1.3863 | 3750 | 0.0002 | - |
242
- | 1.4048 | 3800 | 0.0001 | - |
243
- | 1.4233 | 3850 | 0.0 | - |
244
- | 1.4418 | 3900 | 0.0001 | - |
245
- | 1.4603 | 3950 | 0.0001 | - |
246
- | 1.4787 | 4000 | 0.0001 | - |
247
- | 1.4972 | 4050 | 0.0001 | - |
248
- | 1.5157 | 4100 | 0.0001 | - |
249
- | 1.5342 | 4150 | 0.0001 | - |
250
- | 1.5527 | 4200 | 0.0001 | - |
251
- | 1.5712 | 4250 | 0.0001 | - |
252
- | 1.5896 | 4300 | 0.0001 | - |
253
- | 1.6081 | 4350 | 0.0 | - |
254
- | 1.6266 | 4400 | 0.0001 | - |
255
- | 1.6451 | 4450 | 0.0019 | - |
256
- | 1.6636 | 4500 | 0.0001 | - |
257
- | 1.6821 | 4550 | 0.0003 | - |
258
- | 1.7006 | 4600 | 0.0002 | - |
259
- | 1.7190 | 4650 | 0.0001 | - |
260
- | 1.7375 | 4700 | 0.0001 | - |
261
- | 1.7560 | 4750 | 0.0002 | - |
262
- | 1.7745 | 4800 | 0.0001 | - |
263
- | 1.7930 | 4850 | 0.0001 | - |
264
- | 1.8115 | 4900 | 0.0003 | - |
265
- | 1.8299 | 4950 | 0.056 | - |
266
- | 1.8484 | 5000 | 0.0001 | - |
267
- | 1.8669 | 5050 | 0.0001 | - |
268
- | 1.8854 | 5100 | 0.0001 | - |
269
- | 1.9039 | 5150 | 0.0001 | - |
270
- | 1.9224 | 5200 | 0.0 | - |
271
- | 1.9409 | 5250 | 0.0001 | - |
272
- | 1.9593 | 5300 | 0.0001 | - |
273
- | 1.9778 | 5350 | 0.0001 | - |
274
- | 1.9963 | 5400 | 0.0002 | - |
275
- | 2.0148 | 5450 | 0.0 | - |
276
- | 2.0333 | 5500 | 0.0001 | - |
277
- | 2.0518 | 5550 | 0.0 | - |
278
- | 2.0702 | 5600 | 0.0004 | - |
279
- | 2.0887 | 5650 | 0.0001 | - |
280
- | 2.1072 | 5700 | 0.0001 | - |
281
- | 2.1257 | 5750 | 0.0001 | - |
282
- | 2.1442 | 5800 | 0.0001 | - |
283
- | 2.1627 | 5850 | 0.0001 | - |
284
- | 2.1811 | 5900 | 0.0 | - |
285
- | 2.1996 | 5950 | 0.0001 | - |
286
- | 2.2181 | 6000 | 0.0001 | - |
287
- | 2.2366 | 6050 | 0.0001 | - |
288
- | 2.2551 | 6100 | 0.0001 | - |
289
- | 2.2736 | 6150 | 0.0001 | - |
290
- | 2.2921 | 6200 | 0.0 | - |
291
- | 2.3105 | 6250 | 0.0001 | - |
292
- | 2.3290 | 6300 | 0.0 | - |
293
- | 2.3475 | 6350 | 0.0001 | - |
294
- | 2.3660 | 6400 | 0.0001 | - |
295
- | 2.3845 | 6450 | 0.0001 | - |
296
- | 2.4030 | 6500 | 0.0 | - |
297
- | 2.4214 | 6550 | 0.0001 | - |
298
- | 2.4399 | 6600 | 0.0001 | - |
299
- | 2.4584 | 6650 | 0.0 | - |
300
- | 2.4769 | 6700 | 0.0 | - |
301
- | 2.4954 | 6750 | 0.0002 | - |
302
- | 2.5139 | 6800 | 0.0001 | - |
303
- | 2.5323 | 6850 | 0.0001 | - |
304
- | 2.5508 | 6900 | 0.0001 | - |
305
- | 2.5693 | 6950 | 0.0001 | - |
306
- | 2.5878 | 7000 | 0.0 | - |
307
- | 2.6063 | 7050 | 0.0001 | - |
308
- | 2.6248 | 7100 | 0.0001 | - |
309
- | 2.6433 | 7150 | 0.0001 | - |
310
- | 2.6617 | 7200 | 0.0001 | - |
311
- | 2.6802 | 7250 | 0.0001 | - |
312
- | 2.6987 | 7300 | 0.0003 | - |
313
- | 2.7172 | 7350 | 0.0001 | - |
314
- | 2.7357 | 7400 | 0.0 | - |
315
- | 2.7542 | 7450 | 0.0 | - |
316
- | 2.7726 | 7500 | 0.0 | - |
317
- | 2.7911 | 7550 | 0.0001 | - |
318
- | 2.8096 | 7600 | 0.0001 | - |
319
- | 2.8281 | 7650 | 0.0001 | - |
320
- | 2.8466 | 7700 | 0.0001 | - |
321
- | 2.8651 | 7750 | 0.0001 | - |
322
- | 2.8835 | 7800 | 0.0001 | - |
323
- | 2.9020 | 7850 | 0.0001 | - |
324
- | 2.9205 | 7900 | 0.0002 | - |
325
- | 2.9390 | 7950 | 0.0001 | - |
326
- | 2.9575 | 8000 | 0.0 | - |
327
- | 2.9760 | 8050 | 0.0 | - |
328
- | 2.9945 | 8100 | 0.0001 | - |
329
- | 0.0004 | 1 | 0.0001 | - |
330
- | 0.0185 | 50 | 0.0001 | - |
331
- | 0.0370 | 100 | 0.0001 | - |
332
- | 0.0555 | 150 | 0.0001 | - |
333
- | 0.0739 | 200 | 0.0001 | - |
334
- | 0.0924 | 250 | 0.0001 | - |
335
- | 0.1109 | 300 | 0.0001 | - |
336
- | 0.1294 | 350 | 0.0001 | - |
337
- | 0.1479 | 400 | 0.0001 | - |
338
- | 0.1664 | 450 | 0.0005 | - |
339
- | 0.1848 | 500 | 0.0007 | - |
340
- | 0.2033 | 550 | 0.0003 | - |
341
- | 0.2218 | 600 | 0.0003 | - |
342
- | 0.2403 | 650 | 0.0 | - |
343
- | 0.2588 | 700 | 0.0001 | - |
344
- | 0.2773 | 750 | 0.0001 | - |
345
- | 0.2957 | 800 | 0.0002 | - |
346
- | 0.3142 | 850 | 0.0 | - |
347
- | 0.3327 | 900 | 0.0001 | - |
348
- | 0.3512 | 950 | 0.0044 | - |
349
- | 0.3697 | 1000 | 0.0001 | - |
350
- | 0.3882 | 1050 | 0.0004 | - |
351
- | 0.4067 | 1100 | 0.0006 | - |
352
- | 0.4251 | 1150 | 0.0012 | - |
353
- | 0.4436 | 1200 | 0.0002 | - |
354
- | 0.4621 | 1250 | 0.0001 | - |
355
- | 0.4806 | 1300 | 0.0 | - |
356
- | 0.4991 | 1350 | 0.0001 | - |
357
- | 0.5176 | 1400 | 0.0003 | - |
358
- | 0.5360 | 1450 | 0.0001 | - |
359
- | 0.5545 | 1500 | 0.0001 | - |
360
- | 0.5730 | 1550 | 0.0002 | - |
361
- | 0.5915 | 1600 | 0.0001 | - |
362
- | 0.6100 | 1650 | 0.0002 | - |
363
- | 0.6285 | 1700 | 0.0 | - |
364
- | 0.6470 | 1750 | 0.0001 | - |
365
- | 0.6654 | 1800 | 0.0001 | - |
366
- | 0.6839 | 1850 | 0.0001 | - |
367
- | 0.7024 | 1900 | 0.0001 | - |
368
- | 0.7209 | 1950 | 0.0017 | - |
369
- | 0.7394 | 2000 | 0.0001 | - |
370
- | 0.7579 | 2050 | 0.0002 | - |
371
- | 0.7763 | 2100 | 0.0002 | - |
372
- | 0.7948 | 2150 | 0.0003 | - |
373
- | 0.8133 | 2200 | 0.0001 | - |
374
- | 0.8318 | 2250 | 0.0001 | - |
375
- | 0.8503 | 2300 | 0.0002 | - |
376
- | 0.8688 | 2350 | 0.0 | - |
377
- | 0.8872 | 2400 | 0.0001 | - |
378
- | 0.9057 | 2450 | 0.0001 | - |
379
- | 0.9242 | 2500 | 0.0002 | - |
380
- | 0.9427 | 2550 | 0.0001 | - |
381
- | 0.9612 | 2600 | 0.0 | - |
382
- | 0.9797 | 2650 | 0.0 | - |
383
- | 0.9982 | 2700 | 0.0001 | - |
384
- | 1.0166 | 2750 | 0.0001 | - |
385
- | 1.0351 | 2800 | 0.0001 | - |
386
- | 1.0536 | 2850 | 0.0 | - |
387
- | 1.0721 | 2900 | 0.0 | - |
388
- | 1.0906 | 2950 | 0.0001 | - |
389
- | 1.1091 | 3000 | 0.0 | - |
390
- | 1.1275 | 3050 | 0.0001 | - |
391
- | 1.1460 | 3100 | 0.0001 | - |
392
- | 1.1645 | 3150 | 0.0 | - |
393
- | 1.1830 | 3200 | 0.0 | - |
394
- | 1.2015 | 3250 | 0.0 | - |
395
- | 1.2200 | 3300 | 0.0 | - |
396
- | 1.2384 | 3350 | 0.0002 | - |
397
- | 1.2569 | 3400 | 0.0001 | - |
398
- | 1.2754 | 3450 | 0.0 | - |
399
- | 1.2939 | 3500 | 0.0001 | - |
400
- | 1.3124 | 3550 | 0.0001 | - |
401
- | 1.3309 | 3600 | 0.0 | - |
402
- | 1.3494 | 3650 | 0.0 | - |
403
- | 1.3678 | 3700 | 0.0 | - |
404
- | 1.3863 | 3750 | 0.0001 | - |
405
- | 1.4048 | 3800 | 0.0 | - |
406
- | 1.4233 | 3850 | 0.0 | - |
407
- | 1.4418 | 3900 | 0.0 | - |
408
- | 1.4603 | 3950 | 0.0 | - |
409
- | 1.4787 | 4000 | 0.0001 | - |
410
- | 1.4972 | 4050 | 0.0 | - |
411
- | 1.5157 | 4100 | 0.0 | - |
412
- | 1.5342 | 4150 | 0.0 | - |
413
- | 1.5527 | 4200 | 0.0001 | - |
414
- | 1.5712 | 4250 | 0.0001 | - |
415
- | 1.5896 | 4300 | 0.0 | - |
416
- | 1.6081 | 4350 | 0.0 | - |
417
- | 1.6266 | 4400 | 0.0001 | - |
418
- | 1.6451 | 4450 | 0.0 | - |
419
- | 1.6636 | 4500 | 0.0001 | - |
420
- | 1.6821 | 4550 | 0.0001 | - |
421
- | 1.7006 | 4600 | 0.0001 | - |
422
- | 1.7190 | 4650 | 0.0 | - |
423
- | 1.7375 | 4700 | 0.0 | - |
424
- | 1.7560 | 4750 | 0.0 | - |
425
- | 1.7745 | 4800 | 0.0 | - |
426
- | 1.7930 | 4850 | 0.0001 | - |
427
- | 1.8115 | 4900 | 0.0001 | - |
428
- | 1.8299 | 4950 | 0.0 | - |
429
- | 1.8484 | 5000 | 0.0001 | - |
430
- | 1.8669 | 5050 | 0.0 | - |
431
- | 1.8854 | 5100 | 0.0 | - |
432
- | 1.9039 | 5150 | 0.0 | - |
433
- | 1.9224 | 5200 | 0.0 | - |
434
- | 1.9409 | 5250 | 0.0 | - |
435
- | 1.9593 | 5300 | 0.0 | - |
436
- | 1.9778 | 5350 | 0.0 | - |
437
- | 1.9963 | 5400 | 0.0 | - |
438
- | 2.0148 | 5450 | 0.0 | - |
439
- | 2.0333 | 5500 | 0.0 | - |
440
- | 2.0518 | 5550 | 0.0 | - |
441
- | 2.0702 | 5600 | 0.0001 | - |
442
- | 2.0887 | 5650 | 0.0 | - |
443
- | 2.1072 | 5700 | 0.0 | - |
444
- | 2.1257 | 5750 | 0.0 | - |
445
- | 2.1442 | 5800 | 0.0 | - |
446
- | 2.1627 | 5850 | 0.0001 | - |
447
- | 2.1811 | 5900 | 0.0 | - |
448
- | 2.1996 | 5950 | 0.0 | - |
449
- | 2.2181 | 6000 | 0.0 | - |
450
- | 2.2366 | 6050 | 0.0 | - |
451
- | 2.2551 | 6100 | 0.0 | - |
452
- | 2.2736 | 6150 | 0.0001 | - |
453
- | 2.2921 | 6200 | 0.0 | - |
454
- | 2.3105 | 6250 | 0.0 | - |
455
- | 2.3290 | 6300 | 0.0 | - |
456
- | 2.3475 | 6350 | 0.0 | - |
457
- | 2.3660 | 6400 | 0.0 | - |
458
- | 2.3845 | 6450 | 0.0 | - |
459
- | 2.4030 | 6500 | 0.0 | - |
460
- | 2.4214 | 6550 | 0.0 | - |
461
- | 2.4399 | 6600 | 0.0 | - |
462
- | 2.4584 | 6650 | 0.0 | - |
463
- | 2.4769 | 6700 | 0.0 | - |
464
- | 2.4954 | 6750 | 0.0001 | - |
465
- | 2.5139 | 6800 | 0.0001 | - |
466
- | 2.5323 | 6850 | 0.0 | - |
467
- | 2.5508 | 6900 | 0.0 | - |
468
- | 2.5693 | 6950 | 0.0 | - |
469
- | 2.5878 | 7000 | 0.0 | - |
470
- | 2.6063 | 7050 | 0.0 | - |
471
- | 2.6248 | 7100 | 0.0 | - |
472
- | 2.6433 | 7150 | 0.0001 | - |
473
- | 2.6617 | 7200 | 0.0 | - |
474
- | 2.6802 | 7250 | 0.0 | - |
475
- | 2.6987 | 7300 | 0.0001 | - |
476
- | 2.7172 | 7350 | 0.0 | - |
477
- | 2.7357 | 7400 | 0.0 | - |
478
- | 2.7542 | 7450 | 0.0 | - |
479
- | 2.7726 | 7500 | 0.0 | - |
480
- | 2.7911 | 7550 | 0.0 | - |
481
- | 2.8096 | 7600 | 0.0 | - |
482
- | 2.8281 | 7650 | 0.0 | - |
483
- | 2.8466 | 7700 | 0.0001 | - |
484
- | 2.8651 | 7750 | 0.0 | - |
485
- | 2.8835 | 7800 | 0.0001 | - |
486
- | 2.9020 | 7850 | 0.0 | - |
487
- | 2.9205 | 7900 | 0.0001 | - |
488
- | 2.9390 | 7950 | 0.0001 | - |
489
- | 2.9575 | 8000 | 0.0 | - |
490
- | 2.9760 | 8050 | 0.0 | - |
491
- | 2.9945 | 8100 | 0.0 | - |
492
- | 0.0004 | 1 | 0.0 | - |
493
- | 0.0185 | 50 | 0.0 | - |
494
- | 0.0370 | 100 | 0.0 | - |
495
- | 0.0555 | 150 | 0.0 | - |
496
- | 0.0739 | 200 | 0.0 | - |
497
- | 0.0924 | 250 | 0.0 | - |
498
- | 0.1109 | 300 | 0.0 | - |
499
- | 0.1294 | 350 | 0.0005 | - |
500
- | 0.1479 | 400 | 0.0002 | - |
501
- | 0.1664 | 450 | 0.0001 | - |
502
- | 0.1848 | 500 | 0.0009 | - |
503
- | 0.2033 | 550 | 0.1068 | - |
504
- | 0.2218 | 600 | 0.0 | - |
505
- | 0.2403 | 650 | 0.0 | - |
506
- | 0.2588 | 700 | 0.0 | - |
507
- | 0.2773 | 750 | 0.0374 | - |
508
- | 0.2957 | 800 | 0.0001 | - |
509
- | 0.3142 | 850 | 0.0 | - |
510
- | 0.3327 | 900 | 0.0 | - |
511
- | 0.3512 | 950 | 0.0 | - |
512
- | 0.3697 | 1000 | 0.0001 | - |
513
- | 0.3882 | 1050 | 0.0 | - |
514
- | 0.4067 | 1100 | 0.0001 | - |
515
- | 0.4251 | 1150 | 0.0002 | - |
516
- | 0.4436 | 1200 | 0.0001 | - |
517
- | 0.4621 | 1250 | 0.0012 | - |
518
- | 0.4806 | 1300 | 0.0 | - |
519
- | 0.4991 | 1350 | 0.0001 | - |
520
- | 0.5176 | 1400 | 0.0001 | - |
521
- | 0.5360 | 1450 | 0.0 | - |
522
- | 0.5545 | 1500 | 0.0001 | - |
523
- | 0.5730 | 1550 | 0.0 | - |
524
- | 0.5915 | 1600 | 0.0267 | - |
525
- | 0.6100 | 1650 | 0.0001 | - |
526
- | 0.6285 | 1700 | 0.0 | - |
527
- | 0.6470 | 1750 | 0.0 | - |
528
- | 0.6654 | 1800 | 0.0 | - |
529
- | 0.6839 | 1850 | 0.0 | - |
530
- | 0.7024 | 1900 | 0.0 | - |
531
- | 0.7209 | 1950 | 0.0 | - |
532
- | 0.7394 | 2000 | 0.0 | - |
533
- | 0.7579 | 2050 | 0.0001 | - |
534
- | 0.7763 | 2100 | 0.0 | - |
535
- | 0.7948 | 2150 | 0.0001 | - |
536
- | 0.8133 | 2200 | 0.0001 | - |
537
- | 0.8318 | 2250 | 0.0 | - |
538
- | 0.8503 | 2300 | 0.0001 | - |
539
- | 0.8688 | 2350 | 0.1116 | - |
540
- | 0.8872 | 2400 | 0.0042 | - |
541
- | 0.9057 | 2450 | 0.0001 | - |
542
- | 0.9242 | 2500 | 0.0006 | - |
543
- | 0.9427 | 2550 | 0.0 | - |
544
- | 0.9612 | 2600 | 0.0615 | - |
545
- | 0.9797 | 2650 | 0.0002 | - |
546
- | 0.9982 | 2700 | 0.0 | - |
547
- | 1.0166 | 2750 | 0.0003 | - |
548
- | 1.0351 | 2800 | 0.0001 | - |
549
- | 1.0536 | 2850 | 0.0 | - |
550
- | 1.0721 | 2900 | 0.0 | - |
551
- | 1.0906 | 2950 | 0.0 | - |
552
- | 1.1091 | 3000 | 0.0 | - |
553
- | 1.1275 | 3050 | 0.0001 | - |
554
- | 1.1460 | 3100 | 0.0 | - |
555
- | 1.1645 | 3150 | 0.0 | - |
556
- | 1.1830 | 3200 | 0.0 | - |
557
- | 1.2015 | 3250 | 0.0 | - |
558
- | 1.2200 | 3300 | 0.0 | - |
559
- | 1.2384 | 3350 | 0.0 | - |
560
- | 1.2569 | 3400 | 0.0 | - |
561
- | 1.2754 | 3450 | 0.0 | - |
562
- | 1.2939 | 3500 | 0.0 | - |
563
- | 1.3124 | 3550 | 0.0 | - |
564
- | 1.3309 | 3600 | 0.0 | - |
565
- | 1.3494 | 3650 | 0.0 | - |
566
- | 1.3678 | 3700 | 0.0 | - |
567
- | 1.3863 | 3750 | 0.0 | - |
568
- | 1.4048 | 3800 | 0.0003 | - |
569
- | 1.4233 | 3850 | 0.0 | - |
570
- | 1.4418 | 3900 | 0.0001 | - |
571
- | 1.4603 | 3950 | 0.0 | - |
572
- | 1.4787 | 4000 | 0.0001 | - |
573
- | 1.4972 | 4050 | 0.0 | - |
574
- | 1.5157 | 4100 | 0.0 | - |
575
- | 1.5342 | 4150 | 0.0 | - |
576
- | 1.5527 | 4200 | 0.0 | - |
577
- | 1.5712 | 4250 | 0.0 | - |
578
- | 1.5896 | 4300 | 0.0 | - |
579
- | 1.6081 | 4350 | 0.0 | - |
580
- | 1.6266 | 4400 | 0.0 | - |
581
- | 1.6451 | 4450 | 0.0 | - |
582
- | 1.6636 | 4500 | 0.0 | - |
583
- | 1.6821 | 4550 | 0.0001 | - |
584
- | 1.7006 | 4600 | 0.0 | - |
585
- | 1.7190 | 4650 | 0.0 | - |
586
- | 1.7375 | 4700 | 0.0 | - |
587
- | 1.7560 | 4750 | 0.0 | - |
588
- | 1.7745 | 4800 | 0.0 | - |
589
- | 1.7930 | 4850 | 0.0 | - |
590
- | 1.8115 | 4900 | 0.0 | - |
591
- | 1.8299 | 4950 | 0.0 | - |
592
- | 1.8484 | 5000 | 0.0 | - |
593
- | 1.8669 | 5050 | 0.0 | - |
594
- | 1.8854 | 5100 | 0.0 | - |
595
- | 1.9039 | 5150 | 0.0 | - |
596
- | 1.9224 | 5200 | 0.0 | - |
597
- | 1.9409 | 5250 | 0.0 | - |
598
- | 1.9593 | 5300 | 0.0 | - |
599
- | 1.9778 | 5350 | 0.0 | - |
600
- | 1.9963 | 5400 | 0.0 | - |
601
- | 2.0148 | 5450 | 0.0 | - |
602
- | 2.0333 | 5500 | 0.0 | - |
603
- | 2.0518 | 5550 | 0.0 | - |
604
- | 2.0702 | 5600 | 0.0001 | - |
605
- | 2.0887 | 5650 | 0.0 | - |
606
- | 2.1072 | 5700 | 0.0 | - |
607
- | 2.1257 | 5750 | 0.0 | - |
608
- | 2.1442 | 5800 | 0.0001 | - |
609
- | 2.1627 | 5850 | 0.0 | - |
610
- | 2.1811 | 5900 | 0.0 | - |
611
- | 2.1996 | 5950 | 0.0 | - |
612
- | 2.2181 | 6000 | 0.0 | - |
613
- | 2.2366 | 6050 | 0.0 | - |
614
- | 2.2551 | 6100 | 0.0 | - |
615
- | 2.2736 | 6150 | 0.0 | - |
616
- | 2.2921 | 6200 | 0.0 | - |
617
- | 2.3105 | 6250 | 0.0 | - |
618
- | 2.3290 | 6300 | 0.0 | - |
619
- | 2.3475 | 6350 | 0.0 | - |
620
- | 2.3660 | 6400 | 0.0 | - |
621
- | 2.3845 | 6450 | 0.0 | - |
622
- | 2.4030 | 6500 | 0.0 | - |
623
- | 2.4214 | 6550 | 0.0 | - |
624
- | 2.4399 | 6600 | 0.0 | - |
625
- | 2.4584 | 6650 | 0.0 | - |
626
- | 2.4769 | 6700 | 0.0 | - |
627
- | 2.4954 | 6750 | 0.0 | - |
628
- | 2.5139 | 6800 | 0.0001 | - |
629
- | 2.5323 | 6850 | 0.0 | - |
630
- | 2.5508 | 6900 | 0.0 | - |
631
- | 2.5693 | 6950 | 0.0 | - |
632
- | 2.5878 | 7000 | 0.0 | - |
633
- | 2.6063 | 7050 | 0.0 | - |
634
- | 2.6248 | 7100 | 0.0 | - |
635
- | 2.6433 | 7150 | 0.0 | - |
636
- | 2.6617 | 7200 | 0.0 | - |
637
- | 2.6802 | 7250 | 0.0 | - |
638
- | 2.6987 | 7300 | 0.0 | - |
639
- | 2.7172 | 7350 | 0.0 | - |
640
- | 2.7357 | 7400 | 0.0 | - |
641
- | 2.7542 | 7450 | 0.0 | - |
642
- | 2.7726 | 7500 | 0.0 | - |
643
- | 2.7911 | 7550 | 0.0 | - |
644
- | 2.8096 | 7600 | 0.0 | - |
645
- | 2.8281 | 7650 | 0.0 | - |
646
- | 2.8466 | 7700 | 0.0 | - |
647
- | 2.8651 | 7750 | 0.0 | - |
648
- | 2.8835 | 7800 | 0.0 | - |
649
- | 2.9020 | 7850 | 0.0 | - |
650
- | 2.9205 | 7900 | 0.0 | - |
651
- | 2.9390 | 7950 | 0.0 | - |
652
- | 2.9575 | 8000 | 0.0 | - |
653
- | 2.9760 | 8050 | 0.0 | - |
654
- | 2.9945 | 8100 | 0.0 | - |
655
 
656
- ### Framework Versions
657
- - Python: 3.9.16
658
- - SetFit: 1.0.1
659
- - Sentence Transformers: 2.2.2
660
- - Transformers: 4.35.0
661
- - PyTorch: 2.1.0+cu121
662
- - Datasets: 2.14.6
663
- - Tokenizers: 0.14.1
664
 
665
- ## Citation
 
 
 
 
 
666
 
667
- ### BibTeX
668
- ```bibtex
669
- @article{https://doi.org/10.48550/arxiv.2209.11055,
670
- doi = {10.48550/ARXIV.2209.11055},
671
- url = {https://arxiv.org/abs/2209.11055},
672
- author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
673
- keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
674
- title = {Efficient Few-Shot Learning Without Prompts},
675
- publisher = {arXiv},
676
- year = {2022},
677
- copyright = {Creative Commons Attribution 4.0 International}
678
- }
679
- ```
680
 
681
- <!--
682
- ## Glossary
683
 
684
- *Clearly define terms in order to be accessible across audiences.*
685
- -->
686
 
687
- <!--
688
- ## Model Card Authors
689
 
690
- *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
691
- -->
692
 
693
- <!--
694
- ## Model Card Contact
695
 
696
- *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
697
- -->
 
6
  - text-classification
7
  - generated_from_setfit_trainer
8
  metrics:
9
+ - f1
10
+ - accuracy
11
  widget:
12
+ - text: >-
13
+ A combined 20 million people per year die of smoking and hunger, so
14
+ authorities can't seem to feed people and they allow you to buy cigarettes
15
+ but we are facing another lockdown for a virus that has a 99.5% survival
16
+ rate!!! THINK PEOPLE. LOOK AT IT LOGICALLY WITH YOUR OWN EYES.
17
+ - text: >-
18
+ Scientists do not agree on the consequences of climate change, nor is there
19
+ any consensus on that subject. The predictions on that from are just
20
+ ascientific speculation. Bring on the warming."
21
+ - text: >-
22
+ If Tam is our "top doctor"....I am going back to leaches and voodoo...just
23
+ as much science in that as the crap she spouts
24
+ - text: "Can she skip school by herself and sit infront of parliament? \r\n Fake emotions and just a good actor."
25
+ - text: my dad had huge ones..so they may be real..
26
  pipeline_tag: text-classification
27
+ inference: false
28
  base_model: sentence-transformers/paraphrase-mpnet-base-v2
29
  model-index:
30
  - name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
 
38
  split: test
39
  metrics:
40
  - type: metric
41
+ value: 0.688144336139226
42
  name: Metric
43
+ license: mit
44
+ language:
45
+ - en
46
  ---
47
 
48
+ # Computational Analysis of Communicative Acts for Understanding Crisis News Comment Discourses
49
 
50
+ The official trained models for **"Computational Analysis of Communicative Acts for Understanding Crisis News Comment Discourses"**.
51
 
52
+ This model is based on **SetFit** ([SetFit: Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)) and uses the **sentence-transformers/paraphrase-mpnet-base-v2** pretrained model. It has been fine-tuned on our **crisis narratives dataset**.
53
 
54
+ ---
55
 
56
+ ### Model Information
 
57
 
58
+ - **Architecture:** SetFit with sentence-transformers/paraphrase-mpnet-base-v2
59
+ - **Task:** Single-label classification of communicative act actions
60
+ - **Classes:** (a label-mapping sketch follows this list)
61
+ - `informing statement`
62
+ - `challenge`
63
+ - `rejection`
64
+ - `appreciation`
65
+ - `request`
66
+ - `question`
67
+ - `acceptance`
68
+ - `apology`
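+
+ The snippet below is a minimal sketch (not part of the linked notebook) of mapping the integer labels returned by the model back to the class names above; it assumes ids 0-7 follow the order of this list, which should be verified against the model's own label mapping.
+
+ ```python
+ from setfit import SetFitModel
+
+ # Assumed id-to-name mapping: label ids 0-7 taken in the order of the class list above.
+ ID2ACT = {
+     0: "informing statement",
+     1: "challenge",
+     2: "rejection",
+     3: "appreciation",
+     4: "request",
+     5: "question",
+     6: "acceptance",
+     7: "apology",
+ }
+
+ model = SetFitModel.from_pretrained("CrisisNarratives/setfit-8classes-single_label")
+ pred = model(["I'm sorry."])[0]   # integer class id
+ print(ID2ACT[int(pred)])          # expected: "apology" if the assumed ordering holds
+ ```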
69
 
70
+ ---
 
 
 
71
 
72
+ ### How to Use the Model
73
 
74
+ You can find the code to fine-tune this model and detailed instructions in the following GitHub repository:
75
+ [Acts in Crisis Narratives - SetFit Fine-Tuning Notebook](https://github.com/Aalto-CRAI-CIS/Acts-in-crisis-narratives/blob/main/few_shot_learning/SetFit.ipynb)
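+
+ As a rough orientation, the sketch below shows what a fine-tuning run with the SetFit 1.0 `Trainer` API could look like, using the batch size (16) and epoch count (3) reported for this model; the tiny inline dataset is a placeholder, and the linked notebook remains the authoritative recipe.
+
+ ```python
+ from datasets import Dataset
+ from setfit import SetFitModel, Trainer, TrainingArguments
+
+ # Placeholder data; the real crisis narratives dataset is prepared in the linked notebook.
+ train_ds = Dataset.from_dict({"text": ["I'm sorry.", "Why though?"], "label": [7, 5]})
+ eval_ds = Dataset.from_dict({"text": ["No words to express my gratitude."], "label": [3]})
+
+ model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
+
+ args = TrainingArguments(
+     batch_size=16,
+     num_epochs=3,
+     sampling_strategy="oversampling",
+ )
+
+ trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
+ trainer.train()
+ print(trainer.evaluate())
+ ```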
76
 
77
+ #### Steps to Load and Use the Model:
78
 
79
+ 1. Install the SetFit library:
80
+ ```bash
81
+ pip install setfit
82
+ ```
83
+
84
+ 2. Load the model and run inference:
85
+ ```python
86
+ from setfit import SetFitModel
87
 
88
+ # Download from the 🤗 Hub
89
+ model = SetFitModel.from_pretrained("CrisisNarratives/setfit-8classes-single_label")
90
+
91
+ # Run inference
92
+ preds = model("I'm sorry.")
93
+ ```
94
 
95
+ For detailed instructions, refer to the GitHub repository linked above.
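+
+ If class probabilities are needed instead of hard labels, the following sketch should also work (it assumes the `predict` / `predict_proba` helpers exposed by recent SetFit releases, which route through the logistic-regression head):
+
+ ```python
+ from setfit import SetFitModel
+
+ model = SetFitModel.from_pretrained("CrisisNarratives/setfit-8classes-single_label")
+
+ comments = ["I'm sorry.", "No words to express my gratitude to this hero."]
+ preds = model.predict(comments)        # one hard label per comment
+ probs = model.predict_proba(comments)  # per-class probabilities from the classification head
+ print(preds)
+ print(probs)
+ ```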
96
 
97
+ ---
 
98
 
99
+ ### Citation
 
100
 
101
+ If you use this model in your work, please cite:
 
102
 
103
+ ##### TO BE ADDED.
 
104
 
105
+ ### Questions or Feedback?
 
106
 
107
+ For questions or feedback, please reach out via our [contact form](mailto:[email protected]).