sreejith8100 committed on
Commit 85ca93c
1 Parent(s): e8c2dc3

End of training

Files changed (3)
  1. README.md +28 -41
  2. pytorch_model.bin +1 -1
  3. tokenizer_config.json +1 -0
README.md CHANGED
@@ -1,6 +1,5 @@
  ---
- license: cc-by-nc-sa-4.0
- base_model: microsoft/layoutlmv2-base-uncased
  tags:
  - generated_from_trainer
  model-index:
@@ -13,33 +12,16 @@ should probably proofread and complete it, then remove this comment. -->

  # layoutlm-funsd

- This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.3805
- - Ame Precision: 1.0
- - Ame Recall: 1.0
- - Ame F1: 1.0
- - Ame Number: 19
- - Andom number Precision: 1.0
- - Andom number Recall: 1.0
- - Andom number F1: 1.0
- - Andom number Number: 19
- - Ather Name Precision: 1.0
- - Ather Name Recall: 1.0
- - Ather Name F1: 1.0
- - Ather Name Number: 19
- - Lace Of Birth Precision: 1.0
- - Lace Of Birth Recall: 1.0
- - Lace Of Birth F1: 1.0
- - Lace Of Birth Number: 5
- - Other Name Precision: 1.0
- - Other Name Recall: 1.0
- - Other Name F1: 1.0
- - Other Name Number: 19
- - Overall Precision: 1.0
- - Overall Recall: 1.0
- - Overall F1: 1.0
- - Overall Accuracy: 1.0

  ## Model description

@@ -64,22 +46,27 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 10

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Ame Precision | Ame Recall | Ame F1 | Ame Number | Andom number Precision | Andom number Recall | Andom number F1 | Andom number Number | Ather Name Precision | Ather Name Recall | Ather Name F1 | Ather Name Number | Lace Of Birth Precision | Lace Of Birth Recall | Lace Of Birth F1 | Lace Of Birth Number | Other Name Precision | Other Name Recall | Other Name F1 | Other Name Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:-------------:|:----------:|:------:|:----------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:-----------------------:|:--------------------:|:----------------:|:--------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 1.2231 | 1.0 | 41 | 0.8784 | 0.3220 | 1.0 | 0.4872 | 19 | 1.0 | 1.0 | 1.0 | 19 | 0.0 | 0.0 | 0.0 | 19 | 0.0 | 0.0 | 0.0 | 5 | 0.0 | 0.0 | 0.0 | 19 | 0.4872 | 0.4691 | 0.4780 | 0.9126 |
- | 0.8256 | 2.0 | 82 | 0.6942 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |
- | 0.6803 | 3.0 | 123 | 0.5889 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |
- | 0.5863 | 4.0 | 164 | 0.5189 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |
- | 0.5261 | 5.0 | 205 | 0.4713 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |
- | 0.4835 | 6.0 | 246 | 0.4369 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |
- | 0.4519 | 7.0 | 287 | 0.4111 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |
- | 0.4287 | 8.0 | 328 | 0.3938 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |
- | 0.4142 | 9.0 | 369 | 0.3837 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |
- | 0.4079 | 10.0 | 410 | 0.3805 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 5 | 1.0 | 1.0 | 1.0 | 19 | 1.0 | 1.0 | 1.0 | 1.0 |

  ### Framework versions
 
  ---
+ base_model: microsoft/layoutlm-base-uncased
  tags:
  - generated_from_trainer
  model-index:

  # layoutlm-funsd

+ This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.6801
+ - Answer: {'precision': 0.6865671641791045, 'recall': 0.796044499381953, 'f1': 0.7372638809387521, 'number': 809}
+ - Header: {'precision': 0.30714285714285716, 'recall': 0.36134453781512604, 'f1': 0.33204633204633205, 'number': 119}
+ - Question: {'precision': 0.7743634767339772, 'recall': 0.828169014084507, 'f1': 0.8003629764065335, 'number': 1065}
+ - Overall Precision: 0.7077
+ - Overall Recall: 0.7873
+ - Overall F1: 0.7454
+ - Overall Accuracy: 0.8029

  ## Model description

  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
+ - num_epochs: 15

  ### Training results

+ | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 1.7872 | 1.0 | 10 | 1.5976 | {'precision': 0.020486555697823303, 'recall': 0.019777503090234856, 'f1': 0.02012578616352201, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.2535014005602241, 'recall': 0.1699530516431925, 'f1': 0.20348510399100617, 'number': 1065} | 0.1318 | 0.0988 | 0.1130 | 0.3743 |
+ | 1.4377 | 2.0 | 20 | 1.2582 | {'precision': 0.20262869660460023, 'recall': 0.22867737948084055, 'f1': 0.21486643437862948, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.4521497919556172, 'recall': 0.612206572769953, 'f1': 0.5201435979258078, 'number': 1065} | 0.3553 | 0.4200 | 0.3849 | 0.5972 |
+ | 1.0609 | 3.0 | 30 | 0.9282 | {'precision': 0.4720496894409938, 'recall': 0.5636588380716935, 'f1': 0.5138028169014084, 'number': 809} | {'precision': 0.08, 'recall': 0.01680672268907563, 'f1': 0.02777777777777778, 'number': 119} | {'precision': 0.5755947812739831, 'recall': 0.704225352112676, 'f1': 0.6334459459459459, 'number': 1065} | 0.5266 | 0.6061 | 0.5636 | 0.7029 |
+ | 0.8126 | 4.0 | 40 | 0.7805 | {'precision': 0.5814176245210728, 'recall': 0.7503090234857849, 'f1': 0.6551538046411225, 'number': 809} | {'precision': 0.2, 'recall': 0.10084033613445378, 'f1': 0.1340782122905028, 'number': 119} | {'precision': 0.6699916874480466, 'recall': 0.7568075117370892, 'f1': 0.710758377425044, 'number': 1065} | 0.6177 | 0.7150 | 0.6628 | 0.7550 |
+ | 0.6594 | 5.0 | 50 | 0.7015 | {'precision': 0.6331967213114754, 'recall': 0.7639060568603214, 'f1': 0.6924369747899161, 'number': 809} | {'precision': 0.2222222222222222, 'recall': 0.13445378151260504, 'f1': 0.16753926701570682, 'number': 119} | {'precision': 0.7241681260945709, 'recall': 0.7765258215962442, 'f1': 0.7494336202990485, 'number': 1065} | 0.6671 | 0.7331 | 0.6985 | 0.7820 |
+ | 0.5617 | 6.0 | 60 | 0.6732 | {'precision': 0.6566844919786097, 'recall': 0.7589616810877626, 'f1': 0.7041284403669724, 'number': 809} | {'precision': 0.2, 'recall': 0.21008403361344538, 'f1': 0.20491803278688528, 'number': 119} | {'precision': 0.7147385103011094, 'recall': 0.8469483568075117, 'f1': 0.7752470992694457, 'number': 1065} | 0.6637 | 0.7732 | 0.7143 | 0.7867 |
+ | 0.4814 | 7.0 | 70 | 0.6633 | {'precision': 0.6609989373007439, 'recall': 0.7688504326328801, 'f1': 0.7108571428571427, 'number': 809} | {'precision': 0.27586206896551724, 'recall': 0.2689075630252101, 'f1': 0.27234042553191484, 'number': 119} | {'precision': 0.7466442953020134, 'recall': 0.8356807511737089, 'f1': 0.7886575099689852, 'number': 1065} | 0.6865 | 0.7747 | 0.7280 | 0.7961 |
+ | 0.4351 | 8.0 | 80 | 0.6481 | {'precision': 0.6829533116178067, 'recall': 0.7775030902348579, 'f1': 0.7271676300578035, 'number': 809} | {'precision': 0.2846153846153846, 'recall': 0.31092436974789917, 'f1': 0.29718875502008035, 'number': 119} | {'precision': 0.7567796610169492, 'recall': 0.8384976525821596, 'f1': 0.7955456570155902, 'number': 1065} | 0.6988 | 0.7822 | 0.7382 | 0.7985 |
+ | 0.3819 | 9.0 | 90 | 0.6559 | {'precision': 0.6789473684210526, 'recall': 0.7972805933250927, 'f1': 0.7333712336554862, 'number': 809} | {'precision': 0.3170731707317073, 'recall': 0.3277310924369748, 'f1': 0.32231404958677684, 'number': 119} | {'precision': 0.7789566755083996, 'recall': 0.8272300469483568, 'f1': 0.802367941712204, 'number': 1065} | 0.7101 | 0.7852 | 0.7458 | 0.8087 |
+ | 0.349 | 10.0 | 100 | 0.6553 | {'precision': 0.6794055201698513, 'recall': 0.7911001236093943, 'f1': 0.7310108509423187, 'number': 809} | {'precision': 0.33076923076923076, 'recall': 0.36134453781512604, 'f1': 0.34538152610441764, 'number': 119} | {'precision': 0.7757417102966842, 'recall': 0.8347417840375587, 'f1': 0.8041610131162371, 'number': 1065} | 0.7087 | 0.7888 | 0.7466 | 0.8055 |
+ | 0.3137 | 11.0 | 110 | 0.6590 | {'precision': 0.6915584415584416, 'recall': 0.7898640296662547, 'f1': 0.7374495095210617, 'number': 809} | {'precision': 0.2971014492753623, 'recall': 0.3445378151260504, 'f1': 0.31906614785992216, 'number': 119} | {'precision': 0.7753496503496503, 'recall': 0.8328638497652582, 'f1': 0.8030783159800814, 'number': 1065} | 0.7103 | 0.7863 | 0.7464 | 0.8082 |
+ | 0.3015 | 12.0 | 120 | 0.6652 | {'precision': 0.6789862724392819, 'recall': 0.7948084054388134, 'f1': 0.7323462414578588, 'number': 809} | {'precision': 0.3049645390070922, 'recall': 0.36134453781512604, 'f1': 0.3307692307692308, 'number': 119} | {'precision': 0.77117903930131, 'recall': 0.8291079812206573, 'f1': 0.7990950226244343, 'number': 1065} | 0.7026 | 0.7873 | 0.7425 | 0.7986 |
+ | 0.2804 | 13.0 | 130 | 0.6745 | {'precision': 0.6993464052287581, 'recall': 0.7935723114956736, 'f1': 0.7434858135495078, 'number': 809} | {'precision': 0.31386861313868614, 'recall': 0.36134453781512604, 'f1': 0.3359375, 'number': 119} | {'precision': 0.7880143112701252, 'recall': 0.8272300469483568, 'f1': 0.8071461291800274, 'number': 1065} | 0.7207 | 0.7858 | 0.7518 | 0.8055 |
+ | 0.2658 | 14.0 | 140 | 0.6757 | {'precision': 0.6935483870967742, 'recall': 0.7972805933250927, 'f1': 0.7418056354226568, 'number': 809} | {'precision': 0.31386861313868614, 'recall': 0.36134453781512604, 'f1': 0.3359375, 'number': 119} | {'precision': 0.7793468667255075, 'recall': 0.8291079812206573, 'f1': 0.8034576888080072, 'number': 1065} | 0.7141 | 0.7883 | 0.7493 | 0.8031 |
+ | 0.2581 | 15.0 | 150 | 0.6801 | {'precision': 0.6865671641791045, 'recall': 0.796044499381953, 'f1': 0.7372638809387521, 'number': 809} | {'precision': 0.30714285714285716, 'recall': 0.36134453781512604, 'f1': 0.33204633204633205, 'number': 119} | {'precision': 0.7743634767339772, 'recall': 0.828169014084507, 'f1': 0.8003629764065335, 'number': 1065} | 0.7077 | 0.7873 | 0.7454 | 0.8029 |

  ### Framework versions
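The Overall F1 values in the table above are the harmonic mean of Overall Precision and Overall Recall, as seqeval-style token-classification evaluation reports them. A quick sanity check against the final row (nothing here depends on the model itself):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Final-epoch overall metrics from the table: P=0.7077, R=0.7873
print(round(f1_score(0.7077, 0.7873), 4))  # → 0.7454, matching the Overall F1 column
```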
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:a8e81ee82d4308783302c355496dbae21350baedb5d310611efc86dac796fcff
+ oid sha256:f676e76fff4af505f771fa455bc5574a6b4ebc40f616c063f109b4c22f89ac54
  size 450603969
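The pytorch_model.bin change only rewrites a git-lfs pointer file: the `oid` is the plain SHA-256 of the blob's contents and `size` is its byte length, which is why the size stays identical while the hash changes. A minimal sketch of how such a spec/v1 pointer is derived (hypothetical helper, not part of this repo):

```python
import hashlib


def lfs_pointer(data: bytes) -> str:
    """Build a git-lfs spec/v1 pointer for a blob: sha256 oid plus byte size."""
    oid = hashlib.sha256(data).hexdigest()
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{oid}\n"
        f"size {len(data)}\n"
    )


print(lfs_pointer(b"hello"))
```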
tokenizer_config.json CHANGED
@@ -42,6 +42,7 @@
42
  }
43
  },
44
  "additional_special_tokens": [],
 
45
  "clean_up_tokenization_spaces": true,
46
  "cls_token": "[CLS]",
47
  "cls_token_box": [
 
42
  }
43
  },
44
  "additional_special_tokens": [],
45
+ "apply_ocr": false,
46
  "clean_up_tokenization_spaces": true,
47
  "cls_token": "[CLS]",
48
  "cls_token_box": [
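With `"apply_ocr": false`, the tokenizer no longer runs OCR itself, so the caller must supply the words and their bounding boxes, and LayoutLM expects those boxes normalized to a 0–1000 coordinate scale. A minimal sketch of that normalization (hypothetical `normalize_box` helper, not part of this repo):

```python
def normalize_box(box, width, height):
    """Scale an absolute (x0, y0, x1, y1) pixel box to LayoutLM's 0-1000 range."""
    x0, y0, x1, y1 = box
    return [
        int(1000 * x0 / width),
        int(1000 * y0 / height),
        int(1000 * x1 / width),
        int(1000 * y1 / height),
    ]


# A word box on a 760x1000-pixel page
print(normalize_box((76, 100, 304, 120), width=760, height=1000))  # → [100, 100, 400, 120]
```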