Update README.md

README.md CHANGED
@@ -23,14 +23,31 @@ pipeline_tag: token-classification
 This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased).
 It achieves the following results on the evaluation set:
 - Loss: 0.0881
-- Loc
--
--
--
--
--
--
--
+- Loc
+- Precision: 0.9282034236330398
+- Recall: 0.9378673383711167
+- F1: 0.9330103575008353
+- Number: 5955
+- Misc
+- Precision: 0.8336608897623727
+- Recall: 0.9219521833629718
+- F1: 0.8755864139613436
+- Number: 5061
+- Org
+- Precision: 0.9351851851851852
+- Recall: 0.9370832125253696
+- F1: 0.9361332367849385
+- Number: 3449
+- Per
+- Precision: 0.9728037566034045
+- Recall: 0.9543186180422265
+- F1: 0.9634725317314214
+- Number: 5210
+- Overall
+- Precision: 0.9145
+- Recall: 0.9380
+- F1: 0.9261
+- Accuracy: 0.9912

 ## Model description

@@ -59,11 +76,12 @@ The following hyperparameters were used during training:

 ### Training results

-| Training Loss | Epoch | Step | Validation Loss |
-|:---:|:---:|:---:|:---:|
-| 0.1 | 1.0 | 5795 | 0.0943 |
-| 0.0578 | 2.0 | 11590 | 0.0881 |
+| Training Loss | Epoch | Step | Validation Loss | Loc Precision | Loc Recall | Loc F1 | Loc Number | Misc Precision | Misc Recall | Misc F1 | Misc Number | Org Precision | Org Recall | Org F1 | Org Number | Per Precision | Per Recall | Per F1 | Per Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
+| 0.1 | 1.0 | 5795 | 0.0943 | 0.9075 | 0.9429 | 0.9249 | 5955 | 0.8320 | 0.8965 | 0.8630 | 5061 | 0.9151 | 0.9287 | 0.9219 | 3449 | 0.9683 | 0.9499 | 0.9590 | 5210 | 0.9039 | 0.9303 | 0.9169 | 0.9901 |
+| 0.0578 | 2.0 | 11590 | 0.0881 | 0.9282 | 0.9379 | 0.9330 | 5955 | 0.8337 | 0.9220 | 0.8756 | 5061 | 0.9352 | 0.9371 | 0.9361 | 3449 | 0.9728 | 0.9543 | 0.9635 | 5210 | 0.9145 | 0.9380 | 0.9261 | 0.9912 |

+* All values in the table above are rounded to the nearest ten-thousandth.

 ### Framework versions

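The per-entity blocks added in this change (precision, recall, F1, and a support count per Loc / Misc / Org / Per label, plus overall scores and accuracy) match the layout produced by the seqeval metric for IOB-tagged sequences. The card does not say which tool computed them, so the snippet below is a minimal sketch assuming a seqeval-based evaluation through the `evaluate` library; the label sequences are toy data, not this model's outputs.

```python
# Minimal sketch: reproducing the per-entity metric layout with seqeval.
# Assumption: the card's metrics come from a seqeval-style evaluation;
# requires `pip install evaluate seqeval`.
import evaluate

seqeval = evaluate.load("seqeval")

# Toy IOB2 label sequences; a real run would use the model's predictions
# and the gold labels of the evaluation set instead.
predictions = [["B-PER", "I-PER", "O", "B-LOC", "O", "B-ORG"]]
references  = [["B-PER", "I-PER", "O", "B-LOC", "O", "B-MISC"]]

results = seqeval.compute(predictions=predictions, references=references)

# `results` holds one dict per entity type, e.g.
# results["PER"] == {"precision": ..., "recall": ..., "f1": ..., "number": ...},
# plus overall_precision, overall_recall, overall_f1 and overall_accuracy --
# the same quantities reported in the table above.
print(results)
```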
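Since the card is tagged `token-classification`, the checkpoint can presumably be loaded with the transformers pipeline API. The repository id in the sketch below is a placeholder, not this model's actual Hub path.

```python
# Minimal usage sketch for a token-classification checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="user/bert-base-cased-finetuned-ner",  # placeholder repo id, not the real path
    aggregation_strategy="simple",               # merge word pieces into whole entity spans
)

# Each returned dict carries an entity_group (e.g. PER, LOC), a score, and the span text.
print(ner("Margaret Hamilton led the software engineering division at MIT."))
```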