---
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
datasets:
- funsd
model-index:
- name: layoutlm-funsd
  results: []
---
|
|
|
|
|
|
# layoutlm-funsd |
|
|
|
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6878
- Answer: precision 0.7163, recall 0.8084, F1 0.7596 (809 entities)
- Header: precision 0.2992, recall 0.3193, F1 0.3089 (119 entities)
- Question: precision 0.7821, recall 0.8357, F1 0.8080 (1065 entities)
- Overall precision: 0.7264
- Overall recall: 0.7938
- Overall F1: 0.7586
- Overall accuracy: 0.8119
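The per-entity numbers are entity-level precision/recall/F1 of the kind computed by seqeval in the standard token-classification training scripts. A minimal sketch of how such metrics are derived from BIO-tagged sequences; the exact label strings are an assumption based on FUNSD's question/answer/header scheme:

```python
# Sketch: entity-level metrics with seqeval. The label strings
# (B-QUESTION, B-ANSWER, ...) are assumptions based on FUNSD's BIO scheme.
from seqeval.metrics import classification_report, f1_score

y_true = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "I-ANSWER"]]
y_pred = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "O"]]

print(f1_score(y_true, y_pred))               # overall entity-level F1
print(classification_report(y_true, y_pred))  # per-type precision/recall/F1
```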
|
|
|
## Model description |
|
|
|
LayoutLM is a multimodal pre-trained Transformer for document image understanding that combines token embeddings with 2-D layout (bounding-box) embeddings. This checkpoint fine-tunes [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) for token classification on FUNSD: each word in a scanned form is labeled as part of a question, answer, or header entity (BIO scheme), with all remaining words labeled other.
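A minimal inference sketch, assuming the fine-tuned checkpoint is available locally in `./layoutlm-funsd` and that word bounding boxes are already normalized to the 0-1000 range LayoutLM expects:

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizerFast

# Assumption: the fine-tuned checkpoint lives in ./layoutlm-funsd.
model = LayoutLMForTokenClassification.from_pretrained("./layoutlm-funsd")
tokenizer = LayoutLMTokenizerFast.from_pretrained("./layoutlm-funsd")

# Toy input: one word per box, boxes normalized to 0-1000 (hypothetical values).
words = ["Date:", "01/01/2024"]
boxes = [[57, 24, 110, 40], [120, 24, 210, 40]]

# Tokenize pre-split words, then repeat each word's box for its sub-tokens;
# special tokens ([CLS]/[SEP]) get a dummy box.
encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
token_boxes = [boxes[i] if i is not None else [0, 0, 0, 0]
               for i in encoding.word_ids()]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits

# Map predicted class ids back to label strings per token.
predictions = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
print([(t, model.config.id2label[p]) for t, p in zip(tokens, predictions)])
```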
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for form understanding on scanned English documents similar to FUNSD, i.e. tagging words as question, answer, or header fields. Note the class imbalance in the results above: header entities (119 in the evaluation set) reach only about 0.31 F1, against roughly 0.76 for answers and 0.81 for questions, so header predictions in particular should be treated with caution. As an uncased English model fine-tuned on a small set of forms, it is unlikely to transfer well to other languages or to layouts far from FUNSD's.
|
|
|
## Training and evaluation data |
|
|
|
FUNSD (Form Understanding in Noisy Scanned Documents) contains 199 annotated scanned forms, conventionally split into 149 training and 50 evaluation documents. Each word carries a bounding box and an entity label from the question/answer/header/other scheme, which is what the per-entity metrics above are computed over.
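A quick way to inspect the data with the `datasets` library; the `nielsr/funsd` Hub mirror used below is an assumption, as the card does not record the exact dataset identifier:

```python
from datasets import load_dataset

# Assumption: the nielsr/funsd mirror on the Hub; any FUNSD loader that
# exposes words, boxes, and NER tags is used the same way.
dataset = load_dataset("nielsr/funsd")
print(dataset)                    # expect 149 train / 50 test forms
print(dataset["train"].features)  # word, bounding-box, and NER-tag columns
```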
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
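These settings map directly onto `TrainingArguments`; a sketch for reproduction, where the output directory and per-epoch evaluation are assumptions not recorded in the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",  # assumption: not recorded in the card
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,               # Adam betas/epsilon from the list above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    evaluation_strategy="epoch",  # assumption: the table below evaluates once per epoch
)
```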
|
|
|
### Training results |
|
|
|
Per-entity columns report entity-level precision (P), recall (R), and F1, rounded to four decimals.

| Training Loss | Epoch | Step | Validation Loss | Answer P | Answer R | Answer F1 | Header P | Header R | Header F1 | Question P | Question R | Question F1 | Overall P | Overall R | Overall F1 | Overall Acc. |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.791 | 1.0 | 10 | 1.5434 | 0.0238 | 0.0321 | 0.0274 | 0.0 | 0.0 | 0.0 | 0.1517 | 0.1333 | 0.1419 | 0.0829 | 0.0843 | 0.0836 | 0.4089 |
| 1.4241 | 2.0 | 20 | 1.2209 | 0.2151 | 0.2188 | 0.2169 | 0.0 | 0.0 | 0.0 | 0.5193 | 0.5305 | 0.5248 | 0.3883 | 0.3723 | 0.3801 | 0.6071 |
| 1.0737 | 3.0 | 30 | 0.9131 | 0.5522 | 0.6539 | 0.5988 | 0.1053 | 0.0336 | 0.0510 | 0.6550 | 0.7023 | 0.6778 | 0.5992 | 0.6427 | 0.6202 | 0.7260 |
| 0.8315 | 4.0 | 40 | 0.7824 | 0.5840 | 0.7775 | 0.6670 | 0.1600 | 0.0672 | 0.0947 | 0.6850 | 0.7249 | 0.7044 | 0.6251 | 0.7070 | 0.6635 | 0.7595 |
| 0.6888 | 5.0 | 50 | 0.7225 | 0.6346 | 0.7256 | 0.6770 | 0.2632 | 0.1681 | 0.2051 | 0.7260 | 0.7540 | 0.7398 | 0.6692 | 0.7075 | 0.6878 | 0.7735 |
| 0.5828 | 6.0 | 60 | 0.6844 | 0.6372 | 0.7750 | 0.6994 | 0.2716 | 0.1849 | 0.2200 | 0.6926 | 0.8019 | 0.7433 | 0.6540 | 0.7541 | 0.7005 | 0.7860 |
| 0.5049 | 7.0 | 70 | 0.6662 | 0.6692 | 0.7602 | 0.7118 | 0.2903 | 0.2269 | 0.2547 | 0.7250 | 0.7972 | 0.7594 | 0.6830 | 0.7481 | 0.7141 | 0.7889 |
| 0.456 | 8.0 | 80 | 0.6507 | 0.6636 | 0.7923 | 0.7223 | 0.2250 | 0.2269 | 0.2259 | 0.7309 | 0.8085 | 0.7677 | 0.6754 | 0.7672 | 0.7183 | 0.7962 |
| 0.3999 | 9.0 | 90 | 0.6468 | 0.6928 | 0.7973 | 0.7414 | 0.2500 | 0.2437 | 0.2468 | 0.7612 | 0.8169 | 0.7880 | 0.7050 | 0.7747 | 0.7382 | 0.8029 |
| 0.3618 | 10.0 | 100 | 0.6543 | 0.7034 | 0.7973 | 0.7474 | 0.2689 | 0.2689 | 0.2689 | 0.7707 | 0.8235 | 0.7962 | 0.7148 | 0.7797 | 0.7459 | 0.8043 |
| 0.3279 | 11.0 | 110 | 0.6608 | 0.7041 | 0.7911 | 0.7451 | 0.2773 | 0.2773 | 0.2773 | 0.7677 | 0.8254 | 0.7955 | 0.7142 | 0.7787 | 0.7451 | 0.8092 |
| 0.3085 | 12.0 | 120 | 0.6735 | 0.7003 | 0.8059 | 0.7494 | 0.3077 | 0.3025 | 0.3051 | 0.7726 | 0.8423 | 0.8059 | 0.7175 | 0.7953 | 0.7544 | 0.8084 |
| 0.2933 | 13.0 | 130 | 0.6795 | 0.7088 | 0.8035 | 0.7532 | 0.2868 | 0.3277 | 0.3059 | 0.7783 | 0.8141 | 0.7958 | 0.7180 | 0.7807 | 0.7481 | 0.8099 |
| 0.2742 | 14.0 | 140 | 0.6836 | 0.7134 | 0.8121 | 0.7595 | 0.3065 | 0.3193 | 0.3128 | 0.7840 | 0.8282 | 0.8055 | 0.7267 | 0.7913 | 0.7576 | 0.8115 |
| 0.2699 | 15.0 | 150 | 0.6878 | 0.7163 | 0.8084 | 0.7596 | 0.2992 | 0.3193 | 0.3089 | 0.7821 | 0.8357 | 0.8080 | 0.7264 | 0.7938 | 0.7586 | 0.8119 |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.33.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
|