qfrodicio committed
Commit aca1777
1 Parent(s): 00fadc7

update model card README.md

Files changed (1)
  1. README.md +22 -16
README.md CHANGED
@@ -3,10 +3,10 @@ license: mit
  tags:
  - generated_from_trainer
  metrics:
+ - accuracy
  - precision
  - recall
  - f1
- - accuracy
  model-index:
  - name: roberta-finetuned-gesture-prediction-5-classes
    results: []
@@ -19,11 +19,11 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.4016
- - Precision: 0.6731
- - Recall: 0.7778
- - F1: 0.7216
- - Accuracy: 0.8912
+ - Loss: 0.5974
+ - Accuracy: 0.8778
+ - Precision: 0.8775
+ - Recall: 0.8778
+ - F1: 0.8771
 
  ## Model description
 
@@ -42,22 +42,28 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - learning_rate: 8.933445816612466e-05
- - train_batch_size: 32
- - eval_batch_size: 32
+ - learning_rate: 2e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 4
+ - num_epochs: 10
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | 1.2011 | 1.0 | 36 | 0.7200 | 0.4537 | 0.5636 | 0.5027 | 0.7552 |
- | 0.473 | 2.0 | 72 | 0.5067 | 0.6552 | 0.7333 | 0.6921 | 0.8627 |
- | 0.2919 | 3.0 | 108 | 0.4244 | 0.6267 | 0.7394 | 0.6784 | 0.8721 |
- | 0.1748 | 4.0 | 144 | 0.4016 | 0.6731 | 0.7778 | 0.7216 | 0.8912 |
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
+ | 1.4556 | 1.0 | 71 | 0.9405 | 0.6561 | 0.6129 | 0.6561 | 0.5981 |
+ | 0.7207 | 2.0 | 142 | 0.5276 | 0.8442 | 0.8463 | 0.8442 | 0.8406 |
+ | 0.4005 | 3.0 | 213 | 0.4997 | 0.8662 | 0.8719 | 0.8662 | 0.8640 |
+ | 0.2417 | 4.0 | 284 | 0.4764 | 0.8729 | 0.8731 | 0.8729 | 0.8725 |
+ | 0.1757 | 5.0 | 355 | 0.5135 | 0.8812 | 0.8827 | 0.8812 | 0.8810 |
+ | 0.1398 | 6.0 | 426 | 0.5266 | 0.8710 | 0.8710 | 0.8710 | 0.8704 |
+ | 0.0937 | 7.0 | 497 | 0.5438 | 0.8799 | 0.8801 | 0.8799 | 0.8792 |
+ | 0.07 | 8.0 | 568 | 0.5759 | 0.8769 | 0.8770 | 0.8769 | 0.8766 |
+ | 0.0552 | 9.0 | 639 | 0.6035 | 0.8745 | 0.8741 | 0.8745 | 0.8738 |
+ | 0.0478 | 10.0 | 710 | 0.5974 | 0.8778 | 0.8775 | 0.8778 | 0.8771 |
 
 
  ### Framework versions
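
For context on how the updated hyperparameters and metric list could fit together, here is a minimal `Trainer` sketch. It is an illustration under stated assumptions, not the author's training script: the dataset is not named in the card (listed as "None"), the label count is taken from the "5 classes" in the model name, and the weighted metric averaging and per-epoch evaluation are inferred from the results table rather than stated.

```python
# Sketch of a Trainer setup consistent with the hyperparameters and metrics in the
# updated card. Dataset, label count, metric averaging, and evaluation strategy are
# assumptions; adjust them to the actual training data and task.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

NUM_LABELS = 5  # assumption: "5 classes" from the model name

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=NUM_LABELS
)

train_dataset = None  # placeholder: the card does not name the training dataset
eval_dataset = None   # placeholder: the card does not name the evaluation dataset


def compute_metrics(eval_pred):
    # Weighted averaging is an assumption; it is consistent with accuracy == recall
    # in the reported results.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }


training_args = TrainingArguments(
    output_dir="roberta-finetuned-gesture-prediction-5-classes",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: the table reports one row per epoch
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
# trainer.train()  # uncomment once real datasets are supplied
```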
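
Since the card describes a fine-tuned sequence classifier, a short inference sketch may also help. The Hub id below is hypothetical, inferred from the commit author and the model name in `model-index`; the predicted label names come from the model's own config, which this card does not list.

```python
from transformers import pipeline

# Hypothetical repo id; replace with the actual published model if it differs.
classifier = pipeline(
    "text-classification",
    model="qfrodicio/roberta-finetuned-gesture-prediction-5-classes",
)
print(classifier("Let me show you how this works."))
```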