sreejith8100 committed
Commit d01ae8f
1 Parent(s): 6d9a5c2

End of training

Files changed (2):
  1. README.md +27 -25
  2. pytorch_model.bin +1 -1
README.md CHANGED
@@ -14,14 +14,16 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.6801
- - Answer: {'precision': 0.6865671641791045, 'recall': 0.796044499381953, 'f1': 0.7372638809387521, 'number': 809}
- - Header: {'precision': 0.30714285714285716, 'recall': 0.36134453781512604, 'f1': 0.33204633204633205, 'number': 119}
- - Question: {'precision': 0.7743634767339772, 'recall': 0.828169014084507, 'f1': 0.8003629764065335, 'number': 1065}
- - Overall Precision: 0.7077
- - Overall Recall: 0.7873
- - Overall F1: 0.7454
- - Overall Accuracy: 0.8029
 
  ## Model description
 
@@ -50,23 +52,23 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 1.7872 | 1.0 | 10 | 1.5976 | {'precision': 0.020486555697823303, 'recall': 0.019777503090234856, 'f1': 0.02012578616352201, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.2535014005602241, 'recall': 0.1699530516431925, 'f1': 0.20348510399100617, 'number': 1065} | 0.1318 | 0.0988 | 0.1130 | 0.3743 |
- | 1.4377 | 2.0 | 20 | 1.2582 | {'precision': 0.20262869660460023, 'recall': 0.22867737948084055, 'f1': 0.21486643437862948, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.4521497919556172, 'recall': 0.612206572769953, 'f1': 0.5201435979258078, 'number': 1065} | 0.3553 | 0.4200 | 0.3849 | 0.5972 |
- | 1.0609 | 3.0 | 30 | 0.9282 | {'precision': 0.4720496894409938, 'recall': 0.5636588380716935, 'f1': 0.5138028169014084, 'number': 809} | {'precision': 0.08, 'recall': 0.01680672268907563, 'f1': 0.02777777777777778, 'number': 119} | {'precision': 0.5755947812739831, 'recall': 0.704225352112676, 'f1': 0.6334459459459459, 'number': 1065} | 0.5266 | 0.6061 | 0.5636 | 0.7029 |
- | 0.8126 | 4.0 | 40 | 0.7805 | {'precision': 0.5814176245210728, 'recall': 0.7503090234857849, 'f1': 0.6551538046411225, 'number': 809} | {'precision': 0.2, 'recall': 0.10084033613445378, 'f1': 0.1340782122905028, 'number': 119} | {'precision': 0.6699916874480466, 'recall': 0.7568075117370892, 'f1': 0.710758377425044, 'number': 1065} | 0.6177 | 0.7150 | 0.6628 | 0.7550 |
- | 0.6594 | 5.0 | 50 | 0.7015 | {'precision': 0.6331967213114754, 'recall': 0.7639060568603214, 'f1': 0.6924369747899161, 'number': 809} | {'precision': 0.2222222222222222, 'recall': 0.13445378151260504, 'f1': 0.16753926701570682, 'number': 119} | {'precision': 0.7241681260945709, 'recall': 0.7765258215962442, 'f1': 0.7494336202990485, 'number': 1065} | 0.6671 | 0.7331 | 0.6985 | 0.7820 |
- | 0.5617 | 6.0 | 60 | 0.6732 | {'precision': 0.6566844919786097, 'recall': 0.7589616810877626, 'f1': 0.7041284403669724, 'number': 809} | {'precision': 0.2, 'recall': 0.21008403361344538, 'f1': 0.20491803278688528, 'number': 119} | {'precision': 0.7147385103011094, 'recall': 0.8469483568075117, 'f1': 0.7752470992694457, 'number': 1065} | 0.6637 | 0.7732 | 0.7143 | 0.7867 |
- | 0.4814 | 7.0 | 70 | 0.6633 | {'precision': 0.6609989373007439, 'recall': 0.7688504326328801, 'f1': 0.7108571428571427, 'number': 809} | {'precision': 0.27586206896551724, 'recall': 0.2689075630252101, 'f1': 0.27234042553191484, 'number': 119} | {'precision': 0.7466442953020134, 'recall': 0.8356807511737089, 'f1': 0.7886575099689852, 'number': 1065} | 0.6865 | 0.7747 | 0.7280 | 0.7961 |
- | 0.4351 | 8.0 | 80 | 0.6481 | {'precision': 0.6829533116178067, 'recall': 0.7775030902348579, 'f1': 0.7271676300578035, 'number': 809} | {'precision': 0.2846153846153846, 'recall': 0.31092436974789917, 'f1': 0.29718875502008035, 'number': 119} | {'precision': 0.7567796610169492, 'recall': 0.8384976525821596, 'f1': 0.7955456570155902, 'number': 1065} | 0.6988 | 0.7822 | 0.7382 | 0.7985 |
- | 0.3819 | 9.0 | 90 | 0.6559 | {'precision': 0.6789473684210526, 'recall': 0.7972805933250927, 'f1': 0.7333712336554862, 'number': 809} | {'precision': 0.3170731707317073, 'recall': 0.3277310924369748, 'f1': 0.32231404958677684, 'number': 119} | {'precision': 0.7789566755083996, 'recall': 0.8272300469483568, 'f1': 0.802367941712204, 'number': 1065} | 0.7101 | 0.7852 | 0.7458 | 0.8087 |
- | 0.349 | 10.0 | 100 | 0.6553 | {'precision': 0.6794055201698513, 'recall': 0.7911001236093943, 'f1': 0.7310108509423187, 'number': 809} | {'precision': 0.33076923076923076, 'recall': 0.36134453781512604, 'f1': 0.34538152610441764, 'number': 119} | {'precision': 0.7757417102966842, 'recall': 0.8347417840375587, 'f1': 0.8041610131162371, 'number': 1065} | 0.7087 | 0.7888 | 0.7466 | 0.8055 |
- | 0.3137 | 11.0 | 110 | 0.6590 | {'precision': 0.6915584415584416, 'recall': 0.7898640296662547, 'f1': 0.7374495095210617, 'number': 809} | {'precision': 0.2971014492753623, 'recall': 0.3445378151260504, 'f1': 0.31906614785992216, 'number': 119} | {'precision': 0.7753496503496503, 'recall': 0.8328638497652582, 'f1': 0.8030783159800814, 'number': 1065} | 0.7103 | 0.7863 | 0.7464 | 0.8082 |
- | 0.3015 | 12.0 | 120 | 0.6652 | {'precision': 0.6789862724392819, 'recall': 0.7948084054388134, 'f1': 0.7323462414578588, 'number': 809} | {'precision': 0.3049645390070922, 'recall': 0.36134453781512604, 'f1': 0.3307692307692308, 'number': 119} | {'precision': 0.77117903930131, 'recall': 0.8291079812206573, 'f1': 0.7990950226244343, 'number': 1065} | 0.7026 | 0.7873 | 0.7425 | 0.7986 |
- | 0.2804 | 13.0 | 130 | 0.6745 | {'precision': 0.6993464052287581, 'recall': 0.7935723114956736, 'f1': 0.7434858135495078, 'number': 809} | {'precision': 0.31386861313868614, 'recall': 0.36134453781512604, 'f1': 0.3359375, 'number': 119} | {'precision': 0.7880143112701252, 'recall': 0.8272300469483568, 'f1': 0.8071461291800274, 'number': 1065} | 0.7207 | 0.7858 | 0.7518 | 0.8055 |
- | 0.2658 | 14.0 | 140 | 0.6757 | {'precision': 0.6935483870967742, 'recall': 0.7972805933250927, 'f1': 0.7418056354226568, 'number': 809} | {'precision': 0.31386861313868614, 'recall': 0.36134453781512604, 'f1': 0.3359375, 'number': 119} | {'precision': 0.7793468667255075, 'recall': 0.8291079812206573, 'f1': 0.8034576888080072, 'number': 1065} | 0.7141 | 0.7883 | 0.7493 | 0.8031 |
- | 0.2581 | 15.0 | 150 | 0.6801 | {'precision': 0.6865671641791045, 'recall': 0.796044499381953, 'f1': 0.7372638809387521, 'number': 809} | {'precision': 0.30714285714285716, 'recall': 0.36134453781512604, 'f1': 0.33204633204633205, 'number': 119} | {'precision': 0.7743634767339772, 'recall': 0.828169014084507, 'f1': 0.8003629764065335, 'number': 1065} | 0.7077 | 0.7873 | 0.7454 | 0.8029 |
 
 
  ### Framework versions
 
 
  This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.0048
+ - Ame: {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19}
+ - Andom number: {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19}
+ - Ather Name: {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19}
+ - Lace Of Birth: {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5}
+ - Other Name: {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19}
+ - Overall Precision: 1.0
+ - Overall Recall: 1.0
+ - Overall F1: 1.0
+ - Overall Accuracy: 1.0
 
  ## Model description
 
 
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Ame | Andom number | Ather Name | Itle | Lace Of Birth | Other Name | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------:|:-------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 2.1325 | 1.0 | 6 | 1.3047 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 19} | {'precision': 1.0, 'recall': 0.05263157894736842, 'f1': 0.1, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 19} | 0.0667 | 0.0123 | 0.0208 | 0.7927 |
+ | 0.956 | 2.0 | 12 | 0.5955 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 19} | 0.0 | 0.0 | 0.0 | 0.7967 |
+ | 0.5243 | 3.0 | 18 | 0.3361 | {'precision': 0.4358974358974359, 'recall': 0.8947368421052632, 'f1': 0.5862068965517242, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.875, 'recall': 0.3684210526315789, 'f1': 0.5185185185185185, 'number': 19}| 0.6515 | 0.5309 | 0.5850 | 0.9228 |
+ | 0.3127 | 4.0 | 24 | 0.1808 | {'precision': 0.76, 'recall': 1.0, 'f1': 0.8636363636363636, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 0.2631578947368421, 'f1': 0.4166666666666667, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.6296296296296297, 'recall': 0.8947368421052632, 'f1': 0.7391304347826088, 'number': 19}| 0.7895 | 0.7407 | 0.7643 | 0.9573 |
+ | 0.1878 | 5.0 | 30 | 0.1083 | {'precision': 0.8636363636363636, 'recall': 1.0, 'f1': 0.9268292682926829, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 0.9333333333333333, 'recall': 0.7368421052631579, 'f1': 0.8235294117647058, 'number': 19} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.782608695652174, 'recall': 0.9473684210526315, 'f1': 0.8571428571428571, 'number': 19}| 0.8861 | 0.8642 | 0.8750 | 0.9776 |
+ | 0.1243 | 6.0 | 36 | 0.0646 | {'precision': 0.8636363636363636, 'recall': 1.0, 'f1': 0.9268292682926829, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 0.95, 'recall': 1.0, 'f1': 0.9743589743589743, 'number': 19} | {'precision': 1.0, 'recall': 0.2, 'f1': 0.33333333333333337, 'number': 5}| {'precision': 1.0, 'recall': 0.9473684210526315, 'f1': 0.972972972972973, 'number': 19}| 0.95 | 0.9383 | 0.9441 | 0.9898 |
+ | 0.0885 | 7.0 | 42 | 0.0352 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0559 | 8.0 | 48 | 0.0190 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0356 | 9.0 | 54 | 0.0123 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0259 | 10.0 | 60 | 0.0091 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0209 | 11.0 | 66 | 0.0071 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0185 | 12.0 | 72 | 0.0060 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0169 | 13.0 | 78 | 0.0053 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0165 | 14.0 | 84 | 0.0049 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
+ | 0.0197 | 15.0 | 90 | 0.0048 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 5} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | 1.0 | 1.0 | 1.0 | 1.0 |
 
 
  ### Framework versions
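The per-entity cells in the training-results tables above are the raw dictionaries returned by the seqeval metric: precision, recall, F1, and support ('number') per label, plus the overall scores. The truncated entity names in the new card ("Ame", "Ather Name", "Lace Of Birth", and so on) are most likely an artifact of label strings that are not in strict B-/I- prefixed form, so seqeval treats the first character as the IOB tag. A minimal sketch of how such dictionaries are produced, assuming the `evaluate` and `seqeval` packages and using illustrative label names rather than the ones actually used for this model:

```python
# Sketch: reproduce the shape of the per-entity metric dictionaries shown in
# the tables. Label names below are illustrative placeholders.
import evaluate

seqeval = evaluate.load("seqeval")

# One toy document: gold labels vs. predicted labels, token by token.
references  = [["B-ANSWER", "I-ANSWER", "O", "B-QUESTION", "I-QUESTION"]]
predictions = [["B-ANSWER", "I-ANSWER", "O", "B-QUESTION", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results)
# e.g. {'ANSWER': {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 1},
#       'QUESTION': {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1},
#       'overall_precision': 0.5, 'overall_recall': 0.5, 'overall_f1': 0.5,
#       'overall_accuracy': 0.8}
```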
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0f2e0f95b8a5edebf305c3c253e8372e9d0f3404b7e597218bb13cebef84d395
+ oid sha256:a6a790f2952fbf43536f48e7992bf50026aff6e08f2eca5a8bbd07e0a4cbede8
  size 450607041
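Because pytorch_model.bin is stored through Git LFS, the repository itself only tracks the pointer file above: this commit swaps the SHA-256 object id while the size stays at 450607041 bytes. A small sketch for checking a locally downloaded copy against that pointer; the expected hash and size are taken from the new pointer in this commit, and the local file path is an assumption:

```python
# Sketch: verify a downloaded pytorch_model.bin against the LFS pointer above.
import hashlib
from pathlib import Path

EXPECTED_SHA256 = "a6a790f2952fbf43536f48e7992bf50026aff6e08f2eca5a8bbd07e0a4cbede8"
EXPECTED_SIZE = 450607041  # bytes, from the pointer file

def verify(path: str) -> bool:
    p = Path(path)
    if p.stat().st_size != EXPECTED_SIZE:
        return False
    h = hashlib.sha256()
    with p.open("rb") as f:
        # Hash in 1 MiB chunks to avoid loading the whole checkpoint into memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == EXPECTED_SHA256

print(verify("pytorch_model.bin"))  # assumed local path
```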