---
license: mit
datasets:
- dair-ai/emotion
language:
- en
metrics:
- accuracy
pipeline_tag: text-classification
---

## Model Description

BERT is a Transformer-based bidirectional encoder architecture pre-trained with a masked language modeling (MLM) objective.

[bert-base-uncased-emotion-fituned](https://huggingface.co/sonia12138/bert-base-uncased-emotion-fituned) is `bert-base-uncased` fine-tuned on the emotion dataset with the Hugging Face Trainer, using the following training parameters:
```
    num_train_epochs=8,              
    train_batch_size=32,  
    eval_batch_size=64,   
    warmup_steps=500,                
    weight_decay=0.01
```
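
For reference, these values roughly map onto the current `TrainingArguments` API as in the sketch below. This is a minimal reconstruction, not the original training script: the argument names (`output_dir`, the `per_device_*` batch sizes) and the tokenization step are assumptions for illustration.

```python
# Minimal fine-tuning sketch (assumed setup, not the exact original script)
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("dair-ai/emotion")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], padding="max_length", truncation=True)

encoded = dataset.map(tokenize, batched=True)

# The emotion dataset has six classes
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=6)

args = TrainingArguments(
    output_dir="bert-base-uncased-emotion-fituned",  # hypothetical output directory
    num_train_epochs=8,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    warmup_steps=500,
    weight_decay=0.01,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
)
trainer.train()
```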

## Dataset
[emotion](https://huggingface.co/datasets/dair-ai/emotion)
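
The dataset can be loaded directly from the Hub; the snippet below is purely illustrative and just inspects the splits and the six emotion labels.

```python
from datasets import load_dataset

emotion = load_dataset("dair-ai/emotion")
print(emotion)                                   # train / validation / test splits
print(emotion["train"].features["label"].names)  # ['sadness', 'joy', 'love', 'anger', 'fear', 'surprise']
```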

## Model Performance Comparison on the Emotion Dataset
| Model | Accuracy (%) | Recall (%) | F1 Score (%) |
| ------------ | -------- | ------ | -------- |
| Bert-base-uncased-emotion **(SOTA)** | 92.6 | 87.9 | 88.2 |
| **Bert-base-uncased-emotion-fituned** | 92.9 | 88.0 | 88.5 |

## How to Use the Model
```python
from transformers import pipeline

# Returns a score for each of the six emotion labels per input text
classifier = pipeline("text-classification",
                      model="sonia12138/bert-base-uncased-emotion-fituned",
                      return_all_scores=True)
prediction = classifier("I love using transformers. The best part is the wide range of support and it's easy to use.")
print(prediction)
```
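With `return_all_scores=True`, the pipeline returns a list of `{'label', 'score'}` dicts for every input (on recent `transformers` versions, `top_k=None` is the equivalent option and `return_all_scores` may emit a deprecation warning). An illustrative way to pick out the highest-scoring label:

```python
# prediction is a list (one entry per input) of per-label score dicts
scores = prediction[0]
best = max(scores, key=lambda s: s["score"])
print(best["label"], round(best["score"], 4))
```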
## Model Sources

- **Repository:** [lyrics-emo-bert](https://github.com/SoniaWang121/lyrics-emo-bert/tree/main)

## Eval Results
```
{
  'eval_accuracy': 0.929,
  'eval_f1': 0.9405920712282673,
  'eval_loss': 0.15769127011299133,
  'eval_loss': 0.37796708941459656,
  'eval_runtime': 8.0514,
  'eval_samples_per_second': 248.403,
  'eval_steps_per_second': 3.974,
}
```
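
These metrics come from the Trainer's evaluation loop. Below is a hedged sketch of a `compute_metrics` callback that would produce accuracy and F1 values of this kind; the averaging mode for F1 (`"weighted"`) is an assumption, as the card does not state it.

```python
# Illustrative compute_metrics callback (assumed, not copied from the original repo)
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),
    }

# Passed as Trainer(..., compute_metrics=compute_metrics);
# trainer.evaluate() then reports eval_* values like those shown above.
```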

## Compute Infrastructure
### Hardware
NVIDIA GeForce RTX 4090
### Software
Ubuntu 22.04.1
## Model Card Authors

[Xiaohan Wang](https://github.com/SoniaWang121), [Kun Peng](https://github.com/Eric-Pk)