mireiaplalis committed ef0631a (parent: 669681d): Training complete

README.md ADDED
---
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: roberta-basefinetuned-ner-cadec
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-basefinetuned-ner-cadec

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3874
- Precision: 0.4370
- Recall: 0.4719
- F1: 0.4538
- Accuracy: 0.8849
- Adr Precision: 0.3917
- Adr Recall: 0.4477
- Adr F1: 0.4178
- Disease Precision: 0.0
- Disease Recall: 0.0
- Disease F1: 0.0
- Drug Precision: 0.7184
- Drug Recall: 0.7576
- Drug F1: 0.7375
- Finding Precision: 0.1389
- Finding Recall: 0.1111
- Finding F1: 0.1235
- Symptom Precision: 0.2353
- Symptom Recall: 0.1481
- Symptom F1: 0.1818
- B-adr Precision: 0.6259
- B-adr Recall: 0.6488
- B-adr F1: 0.6371
- B-disease Precision: 0.0
- B-disease Recall: 0.0
- B-disease F1: 0.0
- B-drug Precision: 0.8589
- B-drug Recall: 0.8485
- B-drug F1: 0.8537
- B-finding Precision: 0.4
- B-finding Recall: 0.1778
- B-finding F1: 0.2462
- B-symptom Precision: 0.2667
- B-symptom Recall: 0.16
- B-symptom F1: 0.2
- I-adr Precision: 0.3877
- I-adr Recall: 0.4305
- I-adr F1: 0.4079
- I-disease Precision: 0.0
- I-disease Recall: 0.0
- I-disease F1: 0.0
- I-drug Precision: 0.7456
- I-drug Recall: 0.7636
- I-drug F1: 0.7545
- I-finding Precision: 0.1429
- I-finding Recall: 0.125
- I-finding F1: 0.1333
- I-symptom Precision: 0.5
- I-symptom Recall: 0.1
- I-symptom F1: 0.1667
- Macro Avg F1: 0.3399
- Weighted Avg F1: 0.5527

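The entity-level scores above (Adr, Disease, Drug, Finding, Symptom) are the kind of numbers produced by seqeval over BIO-tagged sequences, while the B-*/I-* rows read like a per-tag classification report. A minimal sketch of the entity-level computation, assuming seqeval is installed and using made-up tag sequences rather than the actual evaluation data:

```python
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

# Illustrative placeholder sequences in the BIO scheme used above;
# the real evaluation set is not part of this card.
references  = [["B-drug", "I-drug", "O", "B-adr", "I-adr", "O"]]
predictions = [["B-drug", "I-drug", "O", "B-adr", "O", "O"]]

print(precision_score(references, predictions))
print(recall_score(references, predictions))
print(f1_score(references, predictions))
print(classification_report(references, predictions))  # per-entity-type breakdown
```

In this convention an entity only counts as correct when both its type and its full span match, which helps explain why the entity-level Adr scores are lower than the per-tag B-adr scores.
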
## Model description

More information needed

## Intended uses & limitations

More information needed

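No usage notes are provided, but the checkpoint should load as a standard Transformers token-classification model. A minimal sketch; the repository id below is an assumption pieced together from the commit author and model name, and the example sentence is made up:

```python
from transformers import pipeline

# Hypothetical hub id (author + model name); replace with the actual repository id.
ner = pipeline(
    "token-classification",
    model="mireiaplalis/roberta-basefinetuned-ner-cadec",
    aggregation_strategy="simple",  # merge B-/I- word pieces into whole entity spans
)

print(ner("I had severe headaches and nausea after taking Lipitor."))
```
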
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

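As a rough sketch, these settings map onto the Hugging Face `TrainingArguments` roughly as follows; this is a reconstruction, not the original training script, and the output directory is a placeholder. The Adam betas and epsilon listed above match the `TrainingArguments` defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list above; not the original script.
training_args = TrainingArguments(
    output_dir="roberta-basefinetuned-ner-cadec",  # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
)
```
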
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | B-adr Precision | B-adr Recall | B-adr F1 | B-disease Precision | B-disease Recall | B-disease F1 | B-drug Precision | B-drug Recall | B-drug F1 | B-finding Precision | B-finding Recall | B-finding F1 | B-symptom Precision | B-symptom Recall | B-symptom F1 | I-adr Precision | I-adr Recall | I-adr F1 | I-disease Precision | I-disease Recall | I-disease F1 | I-drug Precision | I-drug Recall | I-drug F1 | I-finding Precision | I-finding Recall | I-finding F1 | I-symptom Precision | I-symptom Recall | I-symptom F1 | Macro Avg F1 | Weighted Avg F1 |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:-------------:|:----------:|:------:|:-----------------:|:--------------:|:----------:|:--------------:|:-----------:|:-------:|:-----------------:|:--------------:|:----------:|:-----------------:|:--------------:|:----------:|:---------------:|:------------:|:--------:|:-------------------:|:----------------:|:------------:|:----------------:|:-------------:|:---------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:---------------:|:------------:|:--------:|:-------------------:|:----------------:|:------------:|:----------------:|:-------------:|:---------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:------------:|:---------------:|
| No log | 1.0 | 127 | 0.5344 | 0.3114 | 0.2247 | 0.2611 | 0.8487 | 0.1715 | 0.1505 | 0.1603 | 0.0 | 0.0 | 0.0 | 0.98 | 0.5939 | 0.7396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5933 | 0.2380 | 0.3397 | 0.0 | 0.0 | 0.0 | 1.0 | 0.5939 | 0.7452 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1456 | 0.1347 | 0.1399 | 0.0 | 0.0 | 0.0 | 0.98 | 0.5939 | 0.7396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1965 | 0.3329 |
| No log | 2.0 | 254 | 0.4494 | 0.3603 | 0.2946 | 0.3242 | 0.8676 | 0.2676 | 0.2440 | 0.2553 | 0.0 | 0.0 | 0.0 | 0.6519 | 0.6242 | 0.6378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5703 | 0.4280 | 0.4890 | 0.0 | 0.0 | 0.0 | 1.0 | 0.6182 | 0.7640 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2759 | 0.2296 | 0.2506 | 0.0 | 0.0 | 0.0 | 0.7342 | 0.7030 | 0.7183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2222 | 0.4204 |
| No log | 3.0 | 381 | 0.4357 | 0.3508 | 0.3758 | 0.3629 | 0.8628 | 0.2656 | 0.3431 | 0.2994 | 0.0 | 0.0 | 0.0 | 0.7451 | 0.6909 | 0.7170 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5393 | 0.5662 | 0.5524 | 0.0 | 0.0 | 0.0 | 0.9375 | 0.7273 | 0.8191 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2402 | 0.3113 | 0.2712 | 0.0 | 0.0 | 0.0 | 0.7550 | 0.6909 | 0.7215 | 1.0 | 0.0312 | 0.0606 | 0.0 | 0.0 | 0.0 | 0.2425 | 0.4573 |
| 0.5429 | 4.0 | 508 | 0.4086 | 0.4501 | 0.4170 | 0.4329 | 0.8819 | 0.3612 | 0.3890 | 0.3746 | 0.0 | 0.0 | 0.0 | 0.7922 | 0.7394 | 0.7649 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5988 | 0.5816 | 0.5901 | 0.0 | 0.0 | 0.0 | 0.9209 | 0.7758 | 0.8421 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3793 | 0.3642 | 0.3716 | 0.0 | 0.0 | 0.0 | 0.82 | 0.7455 | 0.7810 | 1.0 | 0.0312 | 0.0606 | 0.0 | 0.0 | 0.0 | 0.2645 | 0.5113 |
| 0.5429 | 5.0 | 635 | 0.3806 | 0.4225 | 0.4457 | 0.4338 | 0.8797 | 0.3398 | 0.4165 | 0.3743 | 0.0 | 0.0 | 0.0 | 0.7805 | 0.7758 | 0.7781 | 0.2 | 0.0222 | 0.0400 | 0.5 | 0.0370 | 0.0690 | 0.5844 | 0.6180 | 0.6007 | 0.0 | 0.0 | 0.0 | 0.8535 | 0.8121 | 0.8323 | 0.5 | 0.0222 | 0.0426 | 0.5 | 0.04 | 0.0741 | 0.3346 | 0.4018 | 0.3651 | 0.1667 | 0.0769 | 0.1053 | 0.8153 | 0.7758 | 0.7950 | 0.2 | 0.0312 | 0.0541 | 0.0 | 0.0 | 0.0 | 0.2869 | 0.5170 |
| 0.5429 | 6.0 | 762 | 0.3902 | 0.3860 | 0.4419 | 0.4121 | 0.8738 | 0.3329 | 0.4239 | 0.3729 | 0.0 | 0.0 | 0.0 | 0.6349 | 0.7273 | 0.6780 | 0.0833 | 0.0222 | 0.0351 | 0.4 | 0.0741 | 0.125 | 0.5832 | 0.6526 | 0.6159 | 0.0 | 0.0 | 0.0 | 0.7886 | 0.8364 | 0.8118 | 0.3333 | 0.0444 | 0.0784 | 0.4 | 0.08 | 0.1333 | 0.3198 | 0.3996 | 0.3553 | 0.0588 | 0.0769 | 0.0667 | 0.6910 | 0.7455 | 0.7172 | 0.1 | 0.0312 | 0.0476 | 0.0 | 0.0 | 0.0 | 0.2826 | 0.5099 |
| 0.5429 | 7.0 | 889 | 0.3776 | 0.4149 | 0.4594 | 0.4360 | 0.8795 | 0.3595 | 0.4367 | 0.3944 | 0.0 | 0.0 | 0.0 | 0.6949 | 0.7455 | 0.7193 | 0.125 | 0.0667 | 0.0870 | 0.3636 | 0.1481 | 0.2105 | 0.6094 | 0.6468 | 0.6276 | 0.0 | 0.0 | 0.0 | 0.8405 | 0.8303 | 0.8354 | 0.4167 | 0.1111 | 0.1754 | 0.4 | 0.16 | 0.2286 | 0.3443 | 0.4150 | 0.3764 | 0.0 | 0.0 | 0.0 | 0.7326 | 0.7636 | 0.7478 | 0.1905 | 0.125 | 0.1509 | 0.0 | 0.0 | 0.0 | 0.3142 | 0.5330 |
| 0.3019 | 8.0 | 1016 | 0.3892 | 0.4108 | 0.4657 | 0.4365 | 0.8781 | 0.3488 | 0.4404 | 0.3893 | 0.0 | 0.0 | 0.0 | 0.75 | 0.7636 | 0.7568 | 0.16 | 0.0889 | 0.1143 | 0.2727 | 0.1111 | 0.1579 | 0.5928 | 0.6679 | 0.6282 | 0.0 | 0.0 | 0.0 | 0.8625 | 0.8364 | 0.8492 | 0.4375 | 0.1556 | 0.2295 | 0.3 | 0.12 | 0.1714 | 0.3357 | 0.4172 | 0.3720 | 0.0 | 0.0 | 0.0 | 0.7875 | 0.7636 | 0.7754 | 0.1667 | 0.0938 | 0.1200 | 0.0 | 0.0 | 0.0 | 0.3146 | 0.5366 |
| 0.3019 | 9.0 | 1143 | 0.3872 | 0.4463 | 0.4719 | 0.4587 | 0.8845 | 0.3939 | 0.4495 | 0.4199 | 0.0 | 0.0 | 0.0 | 0.7530 | 0.7576 | 0.7553 | 0.1333 | 0.0889 | 0.1067 | 0.2667 | 0.1481 | 0.1905 | 0.6309 | 0.6430 | 0.6369 | 0.0 | 0.0 | 0.0 | 0.8571 | 0.8364 | 0.8466 | 0.4375 | 0.1556 | 0.2295 | 0.3077 | 0.16 | 0.2105 | 0.3893 | 0.4349 | 0.4108 | 0.0 | 0.0 | 0.0 | 0.7764 | 0.7576 | 0.7669 | 0.16 | 0.125 | 0.1404 | 0.6667 | 0.1 | 0.1739 | 0.3416 | 0.5540 |
| 0.3019 | 10.0 | 1270 | 0.3874 | 0.4370 | 0.4719 | 0.4538 | 0.8849 | 0.3917 | 0.4477 | 0.4178 | 0.0 | 0.0 | 0.0 | 0.7184 | 0.7576 | 0.7375 | 0.1389 | 0.1111 | 0.1235 | 0.2353 | 0.1481 | 0.1818 | 0.6259 | 0.6488 | 0.6371 | 0.0 | 0.0 | 0.0 | 0.8589 | 0.8485 | 0.8537 | 0.4 | 0.1778 | 0.2462 | 0.2667 | 0.16 | 0.2 | 0.3877 | 0.4305 | 0.4079 | 0.0 | 0.0 | 0.0 | 0.7456 | 0.7636 | 0.7545 | 0.1429 | 0.125 | 0.1333 | 0.5 | 0.1 | 0.1667 | 0.3399 | 0.5527 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
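To check that a local environment matches these versions, a quick sketch:

```python
import datasets, tokenizers, torch, transformers

# Expected: 4.35.2, 2.1.0+cu118, 2.15.0, 0.15.0
print(transformers.__version__, torch.__version__, datasets.__version__, tokenizers.__version__)
```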