Pretrained checkpoint: microsoft/deberta-large
## Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- prompt_format: sentence aspect - sentiment
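The hyperparameters above can be sketched as a plain configuration dict using Hugging Face `TrainingArguments`-style key names (the mapping to those keys is an assumption; the card does not state the exact training script), together with a hypothetical helper illustrating the `sentence aspect - sentiment` prompt format:

```python
# Training configuration from the list above, expressed with
# transformers.TrainingArguments-style key names (assumed, not
# confirmed by the card).
training_config = {
    "learning_rate": 2e-05,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 16,
    "adam_beta1": 0.9,            # Adam betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 5,
}

def build_prompt(sentence: str, aspect: str, sentiment: str) -> str:
    """Hypothetical rendering of the 'sentence aspect - sentiment' format."""
    return f"{sentence} {aspect} - {sentiment}"

print(build_prompt("The battery life is great.", "battery life", "positive"))
# → The battery life is great. battery life - positive
```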
## Training results
| Epoch | Train loss | Subtask 3 F1 | Subtask 3 precision | Subtask 3 recall | Subtask 4 accuracy |
|---|---|---|---|---|---|
| 1 | 355.13 | 0.8389 | 0.9331 | 0.7620 | 0.8361 |
| 2 | 171.32 | 0.9083 | 0.9451 | 0.8741 | 0.8810 |
| 3 | 96.63 | 0.9191 | 0.9292 | 0.9093 | 0.8985 |
| 4 | 46.02 | 0.9207 | 0.9235 | 0.9180 | 0.8849 |
| 5 | 18.88 | 0.9239 | 0.9359 | 0.9122 | 0.9015 |
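As a sanity check on the table, the reported subtask 3 F1 is the harmonic mean of the reported precision and recall. A minimal sketch for the epoch-5 row:

```python
# F1 as the harmonic mean of precision and recall, checked against
# the epoch-5 subtask 3 figures reported above.
def f1_score(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

p, r = 0.9359, 0.9122  # epoch-5 precision and recall
print(round(f1_score(p, r), 4))  # → 0.9239, matching the reported F1
```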