DunnBC22 committed
Commit 47c306a
1 Parent(s): 8e374dc

Update README.md

Files changed (1)
  1. README.md +185 -44
README.md CHANGED
@@ -11,6 +11,10 @@ language:
 - en
 metrics:
 - seqeval
 pipeline_tag: token-classification
 ---

@@ -19,45 +23,182 @@ pipeline_tag: token-classification
 This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the twitter_pos_vcb dataset.
 It achieves the following results on the evaluation set:
 - Loss: 0.0533
- - ''': {'precision': 0.9580645161290322, 'recall': 0.9519230769230769, 'f1': 0.954983922829582, 'number': 312}
- - B: {'precision': 0.9658270558694287, 'recall': 0.9655240037652966, 'f1': 0.9656755060411109, 'number': 25496}
- - Bd: {'precision': 0.9630099728014506, 'recall': 0.9572819033886085, 'f1': 0.9601373949200036, 'number': 5548}
- - Bg: {'precision': 0.9836065573770492, 'recall': 0.9853434575313438, 'f1': 0.9844742413549753, 'number': 5663}
- - Bn: {'precision': 0.9182209469153515, 'recall': 0.9116809116809117, 'f1': 0.9149392423159399, 'number': 2106}
- - Bp: {'precision': 0.9672037914691943, 'recall': 0.9663488856619736, 'f1': 0.9667761495704902, 'number': 15839}
- - Br: {'precision': 0.94, 'recall': 0.8785046728971962, 'f1': 0.9082125603864735, 'number': 107}
- - Bs: {'precision': 0.9848484848484849, 'recall': 0.9701492537313433, 'f1': 0.9774436090225564, 'number': 67}
- - Bz: {'precision': 0.9865819209039548, 'recall': 0.9850167459897762, 'f1': 0.9857987121813531, 'number': 5673}
- - C: {'precision': 0.9993461203138623, 'recall': 0.9993461203138623, 'f1': 0.9993461203138623, 'number': 4588}
- - D: {'precision': 0.9876836325864372, 'recall': 0.9895926256318763, 'f1': 0.988637207575195, 'number': 6726}
- - Dt: {'precision': 1.0, 'recall': 0.8, 'f1': 0.888888888888889, 'number': 15}
- - H: {'precision': 0.9487382595903587, 'recall': 0.9305216426193119, 'f1': 0.9395416596626883, 'number': 9010}
- - J: {'precision': 0.9803528468323978, 'recall': 0.980588754311382, 'f1': 0.9804707863816818, 'number': 12467}
- - Jr: {'precision': 0.9400386847195358, 'recall': 0.9818181818181818, 'f1': 0.9604743083003953, 'number': 495}
- - Js: {'precision': 0.9612141652613828, 'recall': 0.991304347826087, 'f1': 0.9760273972602741, 'number': 575}
- - N: {'precision': 0.9795543362923471, 'recall': 0.9793769083475651, 'f1': 0.9794656142847902, 'number': 38646}
- - Np: {'precision': 0.9330242966751918, 'recall': 0.9278334128119536, 'f1': 0.9304216147286205, 'number': 6291}
- - Nps: {'precision': 0.75, 'recall': 0.23076923076923078, 'f1': 0.3529411764705882, 'number': 26}
- - Ns: {'precision': 0.9691858990616282, 'recall': 0.9773657289002557, 'f1': 0.9732586272762003, 'number': 7820}
- - O: {'precision': 0.9984323288625675, 'recall': 0.999302649930265, 'f1': 0.9988672998170254, 'number': 5736}
- - Os: {'precision': 1.0, 'recall': 0.9952267303102625, 'f1': 0.9976076555023923, 'number': 419}
- - P: {'precision': 0.9887869520897044, 'recall': 0.9918200408997955, 'f1': 0.9903011740684022, 'number': 2934}
- - Rb: {'precision': 0.9971910112359551, 'recall': 0.9983929288871033, 'f1': 0.9977916081108211, 'number': 2489}
- - Rl: {'precision': 1.0, 'recall': 0.9997228381374723, 'f1': 0.9998613998613999, 'number': 3608}
- - Rp: {'precision': 0.9979960600502683, 'recall': 0.9980638586956522, 'f1': 0.9980299582215278, 'number': 29440}
- - Rp$: {'precision': 0.9975770162686051, 'recall': 0.9972318339100346, 'f1': 0.9974043952240872, 'number': 5780}
- - Sr: {'precision': 0.9998923110058152, 'recall': 0.9998384752059442, 'f1': 0.9998653923812088, 'number': 18573}
- - T: {'precision': 0.9987569919204475, 'recall': 0.9984811874352779, 'f1': 0.9986190706345371, 'number': 28970}
- - W: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
- - X: {'precision': 0.9466666666666667, 'recall': 0.9594594594594594, 'f1': 0.9530201342281879, 'number': 74}
- - Ym: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5}
- - ' ': {'precision': 0.9951481772882245, 'recall': 0.9949524745984923, 'f1': 0.9950503163208444, 'number': 15255}
- - '`': {'precision': 0.9540229885057471, 'recall': 0.9595375722543352, 'f1': 0.956772334293948, 'number': 173}
-
- - Overall Precision: 0.9828
- - Overall Recall: 0.9820
- - Overall F1: 0.9824
- - Overall Accuracy: 0.9860

 ## Model description

@@ -86,11 +227,11 @@ The following hyperparameters were used during training:

 ### Training results

- | Training Loss | Epoch | Step | Validation Loss | ' | B | Bd | Bg | Bn | Bp | Br | Bs | Bz | C | D | Dt | H | J | Jr | Js | N | Np | Nps | Ns | O | Os | P | Rb | Rl | Rp | Rp$ | Sr | T | W | X | Ym | ' ' | '`' | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
- | 0.0617 | 1.0 | 7477 | 0.0595 | {'precision': 0.9331210191082803, 'recall': 0.9391025641025641, 'f1': 0.9361022364217252, 'number': 312} | {'precision': 0.9562652403051994, 'recall': 0.9536397866331974, 'f1': 0.9549507089273791, 'number': 25496} | {'precision': 0.9716325380424573, 'recall': 0.9322278298485941, 'f1': 0.9515223990433264, 'number': 5548} | {'precision': 0.9810585944414941, 'recall': 0.9786332332685855, 'f1': 0.9798444130127298, 'number': 5663} | {'precision': 0.8725314183123878, 'recall': 0.9230769230769231, 'f1': 0.8970927549607752, 'number': 2106} | {'precision': 0.9555667442885015, 'recall': 0.9585832438916598, 'f1': 0.9570726172465961, 'number': 15839} | {'precision': 0.8878504672897196, 'recall': 0.8878504672897196, 'f1': 0.8878504672897196, 'number': 107} | {'precision': 0.8589743589743589, 'recall': 1.0, 'f1': 0.9241379310344827, 'number': 67} | {'precision': 0.9792873442162542, 'recall': 0.9834302838004583, 'f1': 0.9813544415127529, 'number': 5673} | {'precision': 0.998475277717273, 'recall': 0.999128160418483, 'f1': 0.9988016123760759, 'number': 4588} | {'precision': 0.9818369757826344, 'recall': 0.9885518881950639, 'f1': 0.9851829900726033, 'number': 6726} | {'precision': 1.0, 'recall': 0.8, 'f1': 0.888888888888889, 'number': 15} | {'precision': 0.9391025641025641, 'recall': 0.9105438401775805, 'f1': 0.9246027273751831, 'number': 9010} | {'precision': 0.9706609264131388, 'recall': 0.9765781663591883, 'f1': 0.973610555777689, 'number': 12467} | {'precision': 0.9211538461538461, 'recall': 0.9676767676767677, 'f1': 0.9438423645320195, 'number': 495} | {'precision': 0.9226973684210527, 'recall': 0.9756521739130435, 'f1': 0.94843617920541, 'number': 575} | {'precision': 0.9754043126684636, 'recall': 0.9738394659214408, 'f1': 0.9746212611679399, 'number': 38646} | {'precision': 0.9158227848101266, 'recall': 0.9200445080273406, 'f1': 0.9179287923241615, 'number': 6291} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | {'precision': 0.965710643722116, 'recall': 0.9687979539641943, 'f1': 0.9672518353016277, 'number': 7820} | {'precision': 0.9972154542290289, 'recall': 0.9989539748953975, 'f1': 0.9980839574986937, 'number': 5736} | {'precision': 1.0, 'recall': 0.9928400954653938, 'f1': 0.9964071856287425, 'number': 419} | {'precision': 0.9771428571428571, 'recall': 0.99079754601227, 'f1': 0.9839228295819935, 'number': 2934} | {'precision': 0.9947874899759422, 'recall': 0.9967858577742065, 'f1': 0.99578567128236, 'number': 2489} | {'precision': 1.0, 'recall': 0.9997228381374723, 'f1': 0.9998613998613999, 'number': 3608} | {'precision': 0.9970126960418223, 'recall': 0.9976222826086957, 'f1': 0.9973173961764407, 'number': 29440} | {'precision': 0.9973994452149791, 'recall': 0.9953287197231834, 'f1': 0.9963630065812261, 'number': 5780} | {'precision': 0.9998384665087228, 'recall': 0.9997846336079255, 'f1': 0.9998115493336922, 'number': 18573} | {'precision': 0.9976884595480421, 'recall': 0.9982050396962375, 'f1': 0.9979466827711155, 'number': 28970} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.8860759493670886, 'recall': 0.9459459459459459, 'f1': 0.9150326797385621, 'number': 74} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.9936347529365444, 'recall': 0.9925925925925926, 'f1': 0.9931133993572506, 'number': 15255} | {'precision': 0.9540229885057471, 'recall': 0.9595375722543352, 'f1': 0.956772334293948, 'number': 173} | 0.9779 | 0.9772 | 0.9775 | 0.9821 |
- | 0.0407 | 2.0 | 14954 | 0.0531 | {'precision': 0.9605263157894737, 'recall': 0.9358974358974359, 'f1': 0.948051948051948, 'number': 312} | {'precision': 0.9599141295862608, 'recall': 0.9645826796360213, 'f1': 0.9622427419985915, 'number': 25496} | {'precision': 0.9673732718894009, 'recall': 0.9459264599855803, 'f1': 0.95652966372004, 'number': 5548} | {'precision': 0.9833863556026865, 'recall': 0.9825180999470245, 'f1': 0.9829520360392192, 'number': 5663} | {'precision': 0.8920402561756633, 'recall': 0.9259259259259259, 'f1': 0.9086672879776329, 'number': 2106} | {'precision': 0.972785622593068, 'recall': 0.9568785908201275, 'f1': 0.9647665425379548, 'number': 15839} | {'precision': 0.9591836734693877, 'recall': 0.8785046728971962, 'f1': 0.9170731707317074, 'number': 107} | {'precision': 0.9428571428571428, 'recall': 0.9850746268656716, 'f1': 0.9635036496350364, 'number': 67} | {'precision': 0.988999290276792, 'recall': 0.9825489159175039, 'f1': 0.9857635511539481, 'number': 5673} | {'precision': 0.999128350403138, 'recall': 0.9993461203138623, 'f1': 0.9992372234935164, 'number': 4588} | {'precision': 0.9854900799526207, 'recall': 0.9895926256318763, 'f1': 0.9875370919881307, 'number': 6726} | {'precision': 1.0, 'recall': 0.8, 'f1': 0.888888888888889, 'number': 15} | {'precision': 0.9498016997167139, 'recall': 0.9302996670366259, 'f1': 0.9399495374264086, 'number': 9010} | {'precision': 0.9775892428365616, 'recall': 0.9797064249618994, 'f1': 0.978646688834582, 'number': 12467} | {'precision': 0.9124767225325885, 'recall': 0.98989898989899, 'f1': 0.9496124031007753, 'number': 495} | {'precision': 0.948073701842546, 'recall': 0.9843478260869565, 'f1': 0.9658703071672354, 'number': 575} | {'precision': 0.9787719343718411, 'recall': 0.9771257051182528, 'f1': 0.9779481269504189, 'number': 38646} | {'precision': 0.9252336448598131, 'recall': 0.9284692417739628, 'f1': 0.9268486194858775, 'number': 6291} | {'precision': 0.5, 'recall': 0.23076923076923078, 'f1': 0.3157894736842105, 'number': 26} | {'precision': 0.9653734361177808, 'recall': 0.9768542199488491, 'f1': 0.9710798957605034, 'number': 7820} | {'precision': 0.9975635224504003, 'recall': 0.999302649930265, 'f1': 0.9984323288625675, 'number': 5736} | {'precision': 0.9928571428571429, 'recall': 0.9952267303102625, 'f1': 0.9940405244338498, 'number': 419} | {'precision': 0.9861205145565335, 'recall': 0.9928425357873211, 'f1': 0.9894701086956521, 'number': 2934} | {'precision': 0.9971910112359551, 'recall': 0.9983929288871033, 'f1': 0.9977916081108211, 'number': 2489} | {'precision': 1.0, 'recall': 0.9997228381374723, 'f1': 0.9998613998613999, 'number': 3608} | {'precision': 0.9986068164055864, 'recall': 0.9982336956521739, 'f1': 0.9984202211690363, 'number': 29440} | {'precision': 0.9963718037318591, 'recall': 0.9977508650519031, 'f1': 0.997060857538036, 'number': 5780} | {'precision': 0.9999461555029076, 'recall': 0.9998923168039627, 'f1': 0.9999192354287252, 'number': 18573} | {'precision': 0.9985499240436404, 'recall': 0.9983431135657577, 'f1': 0.9984465080954189, 'number': 28970} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.9113924050632911, 'recall': 0.972972972972973, 'f1': 0.9411764705882353, 'number': 74} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.9948932827026319, 'recall': 0.9961324156014422, 'f1': 0.9955124635592387, 'number': 15255} | {'precision': 0.9651162790697675, 'recall': 0.9595375722543352, 'f1': 0.9623188405797102, 'number': 173} | 0.9817 | 0.9808 | 0.9813 | 0.9850 |
- | 0.0246 | 3.0 | 22431 | 0.0533 | {'precision': 0.9580645161290322, 'recall': 0.9519230769230769, 'f1': 0.954983922829582, 'number': 312} | {'precision': 0.9658270558694287, 'recall': 0.9655240037652966, 'f1': 0.9656755060411109, 'number': 25496} | {'precision': 0.9630099728014506, 'recall': 0.9572819033886085, 'f1': 0.9601373949200036, 'number': 5548} | {'precision': 0.9836065573770492, 'recall': 0.9853434575313438, 'f1': 0.9844742413549753, 'number': 5663} | {'precision': 0.9182209469153515, 'recall': 0.9116809116809117, 'f1': 0.9149392423159399, 'number': 2106} | {'precision': 0.9672037914691943, 'recall': 0.9663488856619736, 'f1': 0.9667761495704902, 'number': 15839} | {'precision': 0.94, 'recall': 0.8785046728971962, 'f1': 0.9082125603864735, 'number': 107} | {'precision': 0.9848484848484849, 'recall': 0.9701492537313433, 'f1': 0.9774436090225564, 'number': 67} | {'precision': 0.9865819209039548, 'recall': 0.9850167459897762, 'f1': 0.9857987121813531, 'number': 5673} | {'precision': 0.9993461203138623, 'recall': 0.9993461203138623, 'f1': 0.9993461203138623, 'number': 4588} | {'precision': 0.9876836325864372, 'recall': 0.9895926256318763, 'f1': 0.988637207575195, 'number': 6726} | {'precision': 1.0, 'recall': 0.8, 'f1': 0.888888888888889, 'number': 15} | {'precision': 0.9487382595903587, 'recall': 0.9305216426193119, 'f1': 0.9395416596626883, 'number': 9010} | {'precision': 0.9803528468323978, 'recall': 0.980588754311382, 'f1': 0.9804707863816818, 'number': 12467} | {'precision': 0.9400386847195358, 'recall': 0.9818181818181818, 'f1': 0.9604743083003953, 'number': 495} | {'precision': 0.9612141652613828, 'recall': 0.991304347826087, 'f1': 0.9760273972602741, 'number': 575} | {'precision': 0.9795543362923471, 'recall': 0.9793769083475651, 'f1': 0.9794656142847902, 'number': 38646} | {'precision': 0.9330242966751918, 'recall': 0.9278334128119536, 'f1': 0.9304216147286205, 'number': 6291} | {'precision': 0.75, 'recall': 0.23076923076923078, 'f1': 0.3529411764705882, 'number': 26} | {'precision': 0.9691858990616282, 'recall': 0.9773657289002557, 'f1': 0.9732586272762003, 'number': 7820} | {'precision': 0.9984323288625675, 'recall': 0.999302649930265, 'f1': 0.9988672998170254, 'number': 5736} | {'precision': 1.0, 'recall': 0.9952267303102625, 'f1': 0.9976076555023923, 'number': 419} | {'precision': 0.9887869520897044, 'recall': 0.9918200408997955, 'f1': 0.9903011740684022, 'number': 2934} | {'precision': 0.9971910112359551, 'recall': 0.9983929288871033, 'f1': 0.9977916081108211, 'number': 2489} | {'precision': 1.0, 'recall': 0.9997228381374723, 'f1': 0.9998613998613999, 'number': 3608} | {'precision': 0.9979960600502683, 'recall': 0.9980638586956522, 'f1': 0.9980299582215278, 'number': 29440} | {'precision': 0.9975770162686051, 'recall': 0.9972318339100346, 'f1': 0.9974043952240872, 'number': 5780} | {'precision': 0.9998923110058152, 'recall': 0.9998384752059442, 'f1': 0.9998653923812088, 'number': 18573} | {'precision': 0.9987569919204475, 'recall': 0.9984811874352779, 'f1': 0.9986190706345371, 'number': 28970} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.9466666666666667, 'recall': 0.9594594594594594, 'f1': 0.9530201342281879, 'number': 74} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.9951481772882245, 'recall': 0.9949524745984923, 'f1': 0.9950503163208444, 'number': 15255} | {'precision': 0.9540229885057471, 'recall': 0.9595375722543352, 'f1': 0.956772334293948, 'number': 173} | 0.9828 | 0.9820 | 0.9824 | 0.9860 |


 ### Framework versions

 - en
 metrics:
 - seqeval
+ - accuracy
+ - f1
+ - recall
+ - precision
 pipeline_tag: token-classification
 ---

 This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the twitter_pos_vcb dataset.
 It achieves the following results on the evaluation set:
 - Loss: 0.0533
+ - ''':
+   - Precision: 0.9580645161290322
+   - Recall: 0.9519230769230769
+   - F1: 0.954983922829582
+   - Number: 312
+ - B:
+   - Precision: 0.9658270558694287
+   - Recall: 0.9655240037652966
+   - F1: 0.9656755060411109
+   - Number: 25496
+ - Bd:
+   - Precision: 0.9630099728014506
+   - Recall: 0.9572819033886085
+   - F1: 0.9601373949200036
+   - Number: 5548
+ - Bg:
+   - Precision: 0.9836065573770492
+   - Recall: 0.9853434575313438
+   - F1: 0.9844742413549753
+   - Number: 5663
+ - Bn:
+   - Precision: 0.9182209469153515
+   - Recall: 0.9116809116809117
+   - F1: 0.9149392423159399
+   - Number: 2106
+ - Bp:
+   - Precision: 0.9672037914691943
+   - Recall: 0.9663488856619736
+   - F1: 0.9667761495704902
+   - Number: 15839
+ - Br:
+   - Precision: 0.94
+   - Recall: 0.8785046728971962
+   - F1: 0.9082125603864735
+   - Number: 107
+ - Bs:
+   - Precision: 0.9848484848484849
+   - Recall: 0.9701492537313433
+   - F1: 0.9774436090225564
+   - Number: 67
+ - Bz:
+   - Precision: 0.9865819209039548
+   - Recall: 0.9850167459897762
+   - F1: 0.9857987121813531
+   - Number: 5673
+ - C:
+   - Precision: 0.9993461203138623
+   - Recall: 0.9993461203138623
+   - F1: 0.9993461203138623
+   - Number: 4588
+ - D:
+   - Precision: 0.9876836325864372
+   - Recall: 0.9895926256318763
+   - F1: 0.988637207575195
+   - Number: 6726
+ - Dt:
+   - Precision: 1.0
+   - Recall: 0.8
+   - F1: 0.888888888888889
+   - Number: 15
+ - H:
+   - Precision: 0.9487382595903587
+   - Recall: 0.9305216426193119
+   - F1: 0.9395416596626883
+   - Number: 9010
+ - J:
+   - Precision: 0.9803528468323978
+   - Recall: 0.980588754311382
+   - F1: 0.9804707863816818
+   - Number: 12467
+ - Jr:
+   - Precision: 0.9400386847195358
+   - Recall: 0.9818181818181818
+   - F1: 0.9604743083003953
+   - Number: 495
+ - Js:
+   - Precision: 0.9612141652613828
+   - Recall: 0.991304347826087
+   - F1: 0.9760273972602741
+   - Number: 575
+ - N:
+   - Precision: 0.9795543362923471
+   - Recall: 0.9793769083475651
+   - F1: 0.9794656142847902
+   - Number: 38646
+ - Np:
+   - Precision: 0.9330242966751918
+   - Recall: 0.9278334128119536
+   - F1: 0.9304216147286205
+   - Number: 6291
+ - Nps:
+   - Precision: 0.75
+   - Recall: 0.23076923076923078
+   - F1: 0.3529411764705882
+   - Number: 26
+ - Ns:
+   - Precision: 0.9691858990616282
+   - Recall: 0.9773657289002557
+   - F1: 0.9732586272762003
+   - Number: 7820
+ - O:
+   - Precision: 0.9984323288625675
+   - Recall: 0.999302649930265
+   - F1: 0.9988672998170254
+   - Number: 5736
+ - Os:
+   - Precision: 1.0
+   - Recall: 0.9952267303102625
+   - F1: 0.9976076555023923
+   - Number: 419
+ - P:
+   - Precision: 0.9887869520897044
+   - Recall: 0.9918200408997955
+   - F1: 0.9903011740684022
+   - Number: 2934
+ - Rb:
+   - Precision: 0.9971910112359551
+   - Recall: 0.9983929288871033
+   - F1: 0.9977916081108211
+   - Number: 2489
+ - Rl:
+   - Precision: 1.0
+   - Recall: 0.9997228381374723
+   - F1: 0.9998613998613999
+   - Number: 3608
+ - Rp:
+   - Precision: 0.9979960600502683
+   - Recall: 0.9980638586956522
+   - F1: 0.9980299582215278
+   - Number: 29440
+ - Rp$:
+   - Precision: 0.9975770162686051
+   - Recall: 0.9972318339100346
+   - F1: 0.9974043952240872
+   - Number: 5780
+ - Sr:
+   - Precision: 0.9998923110058152
+   - Recall: 0.9998384752059442
+   - F1: 0.9998653923812088
+   - Number: 18573
+ - T:
+   - Precision: 0.9987569919204475
+   - Recall: 0.9984811874352779
+   - F1: 0.9986190706345371
+   - Number: 28970
+ - W:
+   - Precision: 0.0
+   - Recall: 0.0
+   - F1: 0.0
+   - Number: 1
+ - X:
+   - Precision: 0.9466666666666667
+   - Recall: 0.9594594594594594
+   - F1: 0.9530201342281879
+   - Number: 74
+ - Ym:
+   - Precision: 0.0
+   - Recall: 0.0
+   - F1: 0.0
+   - Number: 5
+ - ' ':
+   - Precision: 0.9951481772882245
+   - Recall: 0.9949524745984923
+   - F1: 0.9950503163208444
+   - Number: 15255
+ - '`':
+   - Precision: 0.9540229885057471
+   - Recall: 0.9595375722543352
+   - F1: 0.956772334293948
+   - Number: 173
+ - Overall:
+   - Precision: 0.9828
+   - Recall: 0.9820
+   - F1: 0.9824
+   - Accuracy: 0.9860
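The per-tag numbers above come from seqeval-style evaluation: for each tag, precision is correct predictions over predicted occurrences, recall is correct predictions over gold occurrences (the "Number" field is the gold support), and F1 is their harmonic mean. As a rough illustration, here is a minimal stdlib sketch of that computation; it is a simplified per-token approximation (seqeval proper groups consecutive identical tags into spans, so its counts can differ), and the toy tag sequences are invented, not drawn from twitter_pos_vcb:

```python
from collections import Counter

def per_tag_metrics(gold, pred):
    """Per-tag precision/recall/F1, per-token approximation of seqeval's report.

    gold, pred: equal-length lists of POS tags (one per token).
    Returns {tag: (precision, recall, f1, support)}.
    """
    gold_counts = Counter(gold)   # the 'Number' column: gold support per tag
    pred_counts = Counter(pred)
    tp = Counter(g for g, p in zip(gold, pred) if g == p)  # correct per tag
    metrics = {}
    for tag in gold_counts | pred_counts:
        p = tp[tag] / pred_counts[tag] if pred_counts[tag] else 0.0
        r = tp[tag] / gold_counts[tag] if gold_counts[tag] else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        metrics[tag] = (p, r, f1, gold_counts[tag])
    return metrics

# Toy example: four gold 'N' tokens, one of them mis-tagged as 'B'.
gold = ["N", "N", "V", "N", "N"]
pred = ["N", "B", "V", "N", "N"]
print(per_tag_metrics(gold, pred)["N"])  # precision 1.0, recall 0.75, support 4
```

This also explains the 0.0 rows above for W and Ym: with one and five gold tokens respectively, a single systematic confusion zeroes out both precision and recall.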

 ## Model description


 ### Training results

+ | Training Loss | Epoch | Step | Validation Loss | ''' Precision | ''' Recall | ''' F1 | ''' Number | B Precision | B Recall | B F1 | B Number | Bd Precision | Bd Recall | Bd F1 | Bd Number | Bg Precision | Bg Recall | Bg F1 | Bg Number | Bn Precision | Bn Recall | Bn F1 | Bn Number | Bp Precision | Bp Recall | Bp F1 | Bp Number | Br Precision | Br Recall | Br F1 | Br Number | Bs Precision | Bs Recall | Bs F1 | Bs Number | Bz Precision | Bz Recall | Bz F1 | Bz Number | C Precision | C Recall | C F1 | C Number | D Precision | D Recall | D F1 | D Number | Dt Precision | Dt Recall | Dt F1 | Dt Number | H Precision | H Recall | H F1 | H Number | J Precision | J Recall | J F1 | J Number | Jr Precision | Jr Recall | Jr F1 | Jr Number | Js Precision | Js Recall | Js F1 | Js Number | N Precision | N Recall | N F1 | N Number | Np Precision | Np Recall | Np F1 | Np Number | Nps Precision | Nps Recall | Nps F1 | Nps Number | Ns Precision | Ns Recall | Ns F1 | Ns Number | O Precision | O Recall | O F1 | O Number | Os Precision | Os Recall | Os F1 | Os Number | P Precision | P Recall | P F1 | P Number | Rb Precision | Rb Recall | Rb F1 | Rb Number | Rl Precision | Rl Recall | Rl F1 | Rl Number | Rp Precision | Rp Recall | Rp F1 | Rp Number | Rp$ Precision | Rp$ Recall | Rp$ F1 | Rp$ Number | Sr Precision | Sr Recall | Sr F1 | Sr Number | T Precision | T Recall | T F1 | T Number | W Precision | W Recall | W F1 | W Number | X Precision | X Recall | X F1 | X Number | Ym Precision | Ym Recall | Ym F1 | Ym Number | ' ' Precision | ' ' Recall | ' ' F1 | ' ' Number | '`' Precision | '`' Recall | '`' F1 | '`' Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
+ | 0.0617 | 1.0 | 7477 | 0.0595 | 0.9331 | 0.9391 | 0.9361 | 312 | 0.9563 | 0.9536 | 0.9550 | 25496 | 0.9716 | 0.9322 | 0.9515 | 5548 | 0.9811 | 0.9786 | 0.9798 | 5663 | 0.8725 | 0.9231 | 0.8971 | 2106 | 0.9556 | 0.9586 | 0.9571 | 15839 | 0.8879 | 0.8879 | 0.8879 | 107 | 0.8590 | 1.0 | 0.9241 | 67 | 0.9793 | 0.9834 | 0.9814 | 5673 | 0.9985 | 0.9991 | 0.9988 | 4588 | 0.9818 | 0.9886 | 0.9852 | 6726 | 1.0 | 0.8 | 0.8889 | 15 | 0.9391 | 0.9105 | 0.9246 | 9010 | 0.9707 | 0.9766 | 0.9736 | 12467 | 0.9212 | 0.9677 | 0.9438 | 495 | 0.9227 | 0.9757 | 0.9484 | 575 | 0.9754 | 0.9738 | 0.9746 | 38646 | 0.9158 | 0.9200 | 0.9179 | 6291 | 0.0 | 0.0 | 0.0 | 26 | 0.9657 | 0.9688 | 0.9673 | 7820 | 0.9972 | 0.9990 | 0.9981 | 5736 | 1.0 | 0.9928 | 0.9964 | 419 | 0.9771 | 0.9908 | 0.9839 | 2934 | 0.9948 | 0.9968 | 0.9958 | 2489 | 1.0 | 0.9997 | 0.9999 | 3608 | 0.9970 | 0.9976 | 0.9973 | 29440 | 0.9974 | 0.9954 | 0.9964 | 5780 | 0.9998 | 0.9998 | 0.9998 | 18573 | 0.9977 | 0.9982 | 0.9979 | 28970 | 0.0 | 0.0 | 0.0 | 1 | 0.8861 | 0.9459 | 0.9150 | 74 | 0.0 | 0.0 | 0.0 | 5 | 0.9936 | 0.9926 | 0.9931 | 15255 | 0.9540 | 0.9595 | 0.9568 | 173 | 0.9779 | 0.9772 | 0.9775 | 0.9821 |
+ | 0.0407 | 2.0 | 14954 | 0.0531 | 0.9605 | 0.9359 | 0.9481 | 312 | 0.9599 | 0.9646 | 0.9622 | 25496 | 0.9674 | 0.9459 | 0.9565 | 5548 | 0.9834 | 0.9825 | 0.9830 | 5663 | 0.8920 | 0.9259 | 0.9087 | 2106 | 0.9728 | 0.9569 | 0.9648 | 15839 | 0.9592 | 0.8785 | 0.9171 | 107 | 0.9429 | 0.9851 | 0.9635 | 67 | 0.9890 | 0.9825 | 0.9858 | 5673 | 0.9991 | 0.9993 | 0.9992 | 4588 | 0.9855 | 0.9896 | 0.9875 | 6726 | 1.0 | 0.8 | 0.8889 | 15 | 0.9498 | 0.9303 | 0.9399 | 9010 | 0.9776 | 0.9797 | 0.9786 | 12467 | 0.9125 | 0.9899 | 0.9496 | 495 | 0.9481 | 0.9843 | 0.9659 | 575 | 0.9788 | 0.9771 | 0.9779 | 38646 | 0.9252 | 0.9285 | 0.9268 | 6291 | 0.5 | 0.2308 | 0.3158 | 26 | 0.9654 | 0.9769 | 0.9711 | 7820 | 0.9976 | 0.9993 | 0.9984 | 5736 | 0.9929 | 0.9952 | 0.9940 | 419 | 0.9861 | 0.9928 | 0.9895 | 2934 | 0.9972 | 0.9984 | 0.9978 | 2489 | 1.0 | 0.9997 | 0.9999 | 3608 | 0.9986 | 0.9982 | 0.9984 | 29440 | 0.9964 | 0.9978 | 0.9971 | 5780 | 0.9999 | 0.9999 | 0.9999 | 18573 | 0.9985 | 0.9983 | 0.9984 | 28970 | 0.0 | 0.0 | 0.0 | 1 | 0.9114 | 0.9730 | 0.9412 | 74 | 0.0 | 0.0 | 0.0 | 5 | 0.9949 | 0.9961 | 0.9955 | 15255 | 0.9651 | 0.9595 | 0.9623 | 173 | 0.9817 | 0.9808 | 0.9813 | 0.9850 |
+ | 0.0246 | 3.0 | 22431 | 0.0533 | 0.9581 | 0.9519 | 0.9550 | 312 | 0.9658 | 0.9655 | 0.9657 | 25496 | 0.9630 | 0.9573 | 0.9601 | 5548 | 0.9836 | 0.9853 | 0.9845 | 5663 | 0.9182 | 0.9117 | 0.9149 | 2106 | 0.9672 | 0.9663 | 0.9668 | 15839 | 0.94 | 0.8785 | 0.9082 | 107 | 0.9848 | 0.9701 | 0.9774 | 67 | 0.9866 | 0.9850 | 0.9858 | 5673 | 0.9993 | 0.9993 | 0.9993 | 4588 | 0.9877 | 0.9896 | 0.9886 | 6726 | 1.0 | 0.8 | 0.8889 | 15 | 0.9487 | 0.9305 | 0.9395 | 9010 | 0.9804 | 0.9806 | 0.9805 | 12467 | 0.9400 | 0.9818 | 0.9605 | 495 | 0.9612 | 0.9913 | 0.9760 | 575 | 0.9796 | 0.9794 | 0.9795 | 38646 | 0.9330 | 0.9278 | 0.9304 | 6291 | 0.75 | 0.2308 | 0.3529 | 26 | 0.9692 | 0.9774 | 0.9733 | 7820 | 0.9984 | 0.9993 | 0.9989 | 5736 | 1.0 | 0.9952 | 0.9976 | 419 | 0.9888 | 0.9918 | 0.9903 | 2934 | 0.9972 | 0.9984 | 0.9978 | 2489 | 1.0 | 0.9997 | 0.9999 | 3608 | 0.9980 | 0.9981 | 0.9981 | 29440 | 0.9976 | 0.9972 | 0.9974 | 5780 | 0.9999 | 0.9998 | 0.9999 | 18573 | 0.9988 | 0.9985 | 0.9986 | 28970 | 0.0 | 0.0 | 0.0 | 1 | 0.9467 | 0.9595 | 0.9530 | 74 | 0.0 | 0.0 | 0.0 | 5 | 0.9951 | 0.9950 | 0.9951 | 15255 | 0.9540 | 0.9595 | 0.9568 | 173 | 0.9828 | 0.9820 | 0.9824 | 0.9860 |
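Since bert-base-cased splits tweets into wordpieces, a fine-tuned token classifier emits one prediction per subword rather than per word. A common convention (an assumption here, not stated in this card) is to score only each word's first subword and carry its tag for the whole word. A minimal stdlib sketch of that word-level decoding step, with invented toy tags:

```python
def word_tags_from_subwords(subword_preds, word_ids):
    """Collapse subword-level predictions to word-level tags.

    subword_preds: predicted tag per subword (special tokens included).
    word_ids: word index per subword, None for special tokens
              (mirrors the word_ids() mapping a subword tokenizer provides).
    Keeps the prediction made on each word's first subword.
    """
    tags = []
    prev = None
    for pred, wid in zip(subword_preds, word_ids):
        if wid is None or wid == prev:  # special token or continuation piece
            continue
        tags.append(pred)
        prev = wid
    return tags

# Toy example: subwords "[CLS] gon ##na rain [SEP]" cover words "gonna", "rain".
preds = ["O", "V", "V", "N", "O"]
wids = [None, 0, 0, 1, None]
print(word_tags_from_subwords(preds, wids))  # ['V', 'N']
```

Scoring at the first subword keeps the predicted sequence aligned one-to-one with the gold word-level tags, which is what the seqeval numbers above assume.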


 ### Framework versions