Daniil Larionov committed on
Commit
a0924ff
1 Parent(s): 993b752

update model card README.md

Files changed (1)
  1. README.md +28 -28
README.md CHANGED
@@ -13,31 +13,31 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [./ruBert-base/](https://huggingface.co/./ruBert-base/) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.1537
-- Causator Precision: 0.7677
+- Loss: 0.1723
+- Causator Precision: 0.8539
 - Causator Recall: 0.8352
-- Causator F1: 0.8
+- Causator F1: 0.8444
 - Causator Number: 91
-- Expiriencer Precision: 0.9048
-- Expiriencer Recall: 0.9694
-- Expiriencer F1: 0.9360
-- Expiriencer Number: 98
-- Instrument Precision: 0.6
+- Expiriencer Precision: 0.9259
+- Expiriencer Recall: 0.9740
+- Expiriencer F1: 0.9494
+- Expiriencer Number: 77
+- Instrument Precision: 0.375
 - Instrument Recall: 1.0
-- Instrument F1: 0.7500
-- Instrument Number: 6
+- Instrument F1: 0.5455
+- Instrument Number: 3
 - Other Precision: 0.0
 - Other Recall: 0.0
 - Other F1: 0.0
 - Other Number: 1
-- Predicate Precision: 0.9137
-- Predicate Recall: 0.9845
-- Predicate F1: 0.9478
-- Predicate Number: 129
-- Overall Precision: 0.8612
-- Overall Recall: 0.9354
-- Overall F1: 0.8968
-- Overall Accuracy: 0.9661
+- Predicate Precision: 0.9352
+- Predicate Recall: 0.9902
+- Predicate F1: 0.9619
+- Predicate Number: 102
+- Overall Precision: 0.8916
+- Overall Recall: 0.9307
+- Overall F1: 0.9107
+- Overall Accuracy: 0.9667
 
 ## Model description
 
@@ -69,16 +69,16 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Causator Precision | Causator Recall | Causator F1 | Causator Number | Expiriencer Precision | Expiriencer Recall | Expiriencer F1 | Expiriencer Number | Instrument Precision | Instrument Recall | Instrument F1 | Instrument Number | Other Precision | Other Recall | Other F1 | Other Number | Predicate Precision | Predicate Recall | Predicate F1 | Predicate Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:---------------:|:------------:|:--------:|:------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
-| 0.3043 | 1.0 | 56 | 0.3538 | 0.75 | 0.6264 | 0.6826 | 91 | 0.7981 | 0.8469 | 0.8218 | 98 | 0.0 | 0.0 | 0.0 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.8741 | 0.9690 | 0.9191 | 129 | 0.8204 | 0.8154 | 0.8179 | 0.9142 |
-| 0.2664 | 2.0 | 112 | 0.1961 | 0.8784 | 0.7143 | 0.7879 | 91 | 0.9175 | 0.9082 | 0.9128 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9398 | 0.9690 | 0.9542 | 129 | 0.9076 | 0.8769 | 0.8920 | 0.9399 |
-| 0.0373 | 3.0 | 168 | 0.1275 | 0.8706 | 0.8132 | 0.8409 | 91 | 0.9223 | 0.9694 | 0.9453 | 98 | 0.625 | 0.8333 | 0.7143 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9338 | 0.9845 | 0.9585 | 129 | 0.9066 | 0.9262 | 0.9163 | 0.9641 |
-| 0.0496 | 4.0 | 224 | 0.1683 | 0.8 | 0.8352 | 0.8172 | 91 | 0.9143 | 0.9796 | 0.9458 | 98 | 0.6667 | 1.0 | 0.8 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9270 | 0.9845 | 0.9549 | 129 | 0.8815 | 0.9385 | 0.9091 | 0.9608 |
-| 0.0529 | 5.0 | 280 | 0.1526 | 0.7917 | 0.8352 | 0.8128 | 91 | 0.8991 | 1.0 | 0.9469 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9203 | 0.9845 | 0.9513 | 129 | 0.8697 | 0.9446 | 0.9056 | 0.9627 |
-| 0.0419 | 6.0 | 336 | 0.1402 | 0.7755 | 0.8352 | 0.8042 | 91 | 0.8962 | 0.9694 | 0.9314 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9203 | 0.9845 | 0.9513 | 129 | 0.8636 | 0.9354 | 0.8981 | 0.9651 |
-| 0.0156 | 7.0 | 392 | 0.1498 | 0.8105 | 0.8462 | 0.8280 | 91 | 0.9048 | 0.9694 | 0.9360 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9137 | 0.9845 | 0.9478 | 129 | 0.8739 | 0.9385 | 0.9050 | 0.9661 |
-| 0.0066 | 8.0 | 448 | 0.1509 | 0.7835 | 0.8352 | 0.8085 | 91 | 0.9057 | 0.9796 | 0.9412 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9137 | 0.9845 | 0.9478 | 129 | 0.8665 | 0.9385 | 0.9010 | 0.9680 |
-| 0.0084 | 9.0 | 504 | 0.1548 | 0.7755 | 0.8352 | 0.8042 | 91 | 0.9048 | 0.9694 | 0.9360 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9137 | 0.9845 | 0.9478 | 129 | 0.8636 | 0.9354 | 0.8981 | 0.9656 |
-| 0.0083 | 10.0 | 560 | 0.1537 | 0.7677 | 0.8352 | 0.8 | 91 | 0.9048 | 0.9694 | 0.9360 | 98 | 0.6 | 1.0 | 0.7500 | 6 | 0.0 | 0.0 | 0.0 | 1 | 0.9137 | 0.9845 | 0.9478 | 129 | 0.8612 | 0.9354 | 0.8968 | 0.9661 |
+| 0.2552 | 1.0 | 56 | 0.3471 | 0.8841 | 0.6703 | 0.7625 | 91 | 0.8421 | 0.8312 | 0.8366 | 77 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9259 | 0.9804 | 0.9524 | 102 | 0.8893 | 0.8212 | 0.8539 | 0.9203 |
+| 0.2385 | 2.0 | 112 | 0.1608 | 0.9103 | 0.7802 | 0.8402 | 91 | 0.9375 | 0.9740 | 0.9554 | 77 | 0.2857 | 0.6667 | 0.4 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9519 | 0.9706 | 0.9612 | 102 | 0.9182 | 0.9015 | 0.9098 | 0.9554 |
+| 0.0367 | 3.0 | 168 | 0.1311 | 0.8902 | 0.8022 | 0.8439 | 91 | 0.9375 | 0.9740 | 0.9554 | 77 | 0.4286 | 1.0 | 0.6 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9709 | 0.9804 | 0.9756 | 102 | 0.9228 | 0.9161 | 0.9194 | 0.9673 |
+| 0.0494 | 4.0 | 224 | 0.1507 | 0.7812 | 0.8242 | 0.8021 | 91 | 0.9241 | 0.9481 | 0.9359 | 77 | 0.4286 | 1.0 | 0.6 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9524 | 0.9804 | 0.9662 | 102 | 0.8746 | 0.9161 | 0.8948 | 0.9637 |
+| 0.0699 | 5.0 | 280 | 0.1830 | 0.8276 | 0.7912 | 0.8090 | 91 | 0.8941 | 0.9870 | 0.9383 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.875 | 0.9197 | 0.8968 | 0.9560 |
+| 0.0352 | 6.0 | 336 | 0.1994 | 0.7857 | 0.8462 | 0.8148 | 91 | 0.9048 | 0.9870 | 0.9441 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9266 | 0.9902 | 0.9573 | 102 | 0.8595 | 0.9380 | 0.8970 | 0.9572 |
+| 0.0186 | 7.0 | 392 | 0.1657 | 0.8652 | 0.8462 | 0.8556 | 91 | 0.9146 | 0.9740 | 0.9434 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.8920 | 0.9343 | 0.9127 | 0.9673 |
+| 0.0052 | 8.0 | 448 | 0.1716 | 0.8556 | 0.8462 | 0.8508 | 91 | 0.9259 | 0.9740 | 0.9494 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.8920 | 0.9343 | 0.9127 | 0.9673 |
+| 0.0094 | 9.0 | 504 | 0.1715 | 0.8444 | 0.8352 | 0.8398 | 91 | 0.9259 | 0.9740 | 0.9494 | 77 | 0.4286 | 1.0 | 0.6 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.8916 | 0.9307 | 0.9107 | 0.9667 |
+| 0.0078 | 10.0 | 560 | 0.1723 | 0.8539 | 0.8352 | 0.8444 | 91 | 0.9259 | 0.9740 | 0.9494 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.8916 | 0.9307 | 0.9107 | 0.9667 |
 
 
 ### Framework versions
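
The per-role F1 values in this card are the harmonic mean of the listed precision and recall. A quick sanity check (plain Python, not part of the card itself) against the updated final-epoch numbers:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; defined as 0.0 when both are 0."""
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Final-epoch values from the updated card (rounded to 4 decimals)
print(round(f1(0.8539, 0.8352), 4))  # Causator   -> 0.8444
print(round(f1(0.375, 1.0), 4))      # Instrument -> 0.5455
print(f1(0.0, 0.0))                  # Other      -> 0.0
```

Because the precision and recall in the card are themselves rounded, recomputed F1 can differ from the reported value in the last decimal place for some rows.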