---
tags:
- distilbert
- health
- tweet
datasets:
- custom-phm-tweets
metrics:
- accuracy
base_model: distilbert-base-uncased
model-index:
- name: distilbert-phmtweets-sutd
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: custom-phm-tweets
      type: labelled
    metrics:
    - type: accuracy
      value: 0.877
      name: Accuracy
---

# distilbert-phmtweets-sutd

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) for text classification, identifying public health events mentioned in tweets. The project is based on the Emory University paper [Detection of Personal Health Mentions in Social Media](https://arxiv.org/pdf/1802.09130v2.pdf), which worked with this [custom dataset](https://github.com/emory-irlab/PHM2017).

It achieves the following results on the evaluation set:
- Accuracy: 0.877

## Usage

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the tokenizer and the fine-tuned classifier from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("dibsondivya/distilbert-phmtweets-sutd")
model = AutoModelForSequenceClassification.from_pretrained("dibsondivya/distilbert-phmtweets-sutd")
```
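A minimal inference sketch follows. The example tweet is made up, and the label names should be read from the checkpoint's `model.config.id2label` mapping rather than assumed:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("dibsondivya/distilbert-phmtweets-sutd")
model = AutoModelForSequenceClassification.from_pretrained("dibsondivya/distilbert-phmtweets-sutd")

# Illustrative input; check model.config.id2label for the label names
# actually stored in this checkpoint.
inputs = tokenizer("Been coughing all week, I think I have the flu", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class_id = logits.argmax(dim=-1).item()
print(predicted_class_id, model.config.id2label.get(predicted_class_id))
```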


## Model Evaluation Results
- Validation set accuracy: 0.8708661417322835
- Test set accuracy: 0.8772961058045555
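The evaluation script itself is not part of this card; the sketch below shows how such an accuracy figure could be computed with this model. Here `texts` and `labels` are placeholders for your own labelled tweets, not the original validation or test splits:

```python
import torch

def accuracy(texts, labels, tokenizer, model):
    """Fraction of tweets whose predicted class matches the gold label."""
    correct = 0
    for text, label in zip(texts, labels):
        inputs = tokenizer(text, truncation=True, return_tensors="pt")
        with torch.no_grad():
            pred = model(**inputs).logits.argmax(dim=-1).item()
        correct += int(pred == label)
    return correct / len(labels)
```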

## Reference for distilbert-base-uncased Model
```bibtex
@article{Sanh2019DistilBERTAD,
  title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
  author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.01108}
}
```