---
license: gpl-3.0
tags:
- DocVQA
- Document Question Answering
- Document Visual Question Answering
datasets:
- MP-DocVQA
language:
- en
---

# T5 base fine-tuned on MP-DocVQA

This is the [pretrained](https://huggingface.co/t5-base) T5 base model, fine-tuned on the Multipage DocVQA (MP-DocVQA) dataset.

This model was used as a baseline in [Hierarchical multimodal transformers for Multi-Page DocVQA](https://arxiv.org/pdf/2212.05935.pdf).
- Results on the MP-DocVQA dataset are reported in Table 2.
- Training hyperparameters can be found in Table 8 of Appendix D.

## How to use

Here is how to use this model to answer a question about a given text context in PyTorch:

```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("rubentito/t5-base-mpdocvqa")
model = T5ForConditionalGeneration.from_pretrained("rubentito/t5-base-mpdocvqa")

context = "Huggingface has democratized NLP. Huge thanks to Huggingface for this."
question = "What has Huggingface done?"
input_text = "question: {:s} context: {:s}".format(question, context)

encoding = tokenizer(input_text, return_tensors="pt")
output = model.generate(**encoding)
answer = tokenizer.decode(output[0], skip_special_tokens=True)
```
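Since T5 consumes plain text, applying this model to document VQA requires serializing the document's OCR output into the `question: ... context: ...` input format first. As a minimal sketch (the exact page serialization used for MP-DocVQA training is an assumption here; check the paper for the format actually used), one way to build that input from OCR words in reading order:

```python
# Sketch: build a T5-style QA input string from OCR output.
# NOTE: the serialization below (space-joined words in reading order) is an
# assumption for illustration, not necessarily the format used in training.

def build_input(question, ocr_words):
    """Join OCR words into the 'question: ... context: ...' T5 input format."""
    context = " ".join(ocr_words)
    return "question: {:s} context: {:s}".format(question, context)

ocr_words = ["Invoice", "No.", "12345", "Total:", "$250.00"]
text = build_input("What is the total amount?", ocr_words)
print(text)
# question: What is the total amount? context: Invoice No. 12345 Total: $250.00
```

The resulting string can then be tokenized and passed to `model.generate` exactly as in the snippet above.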

## BibTeX entry

```tex
@article{tito2022hierarchical,
  title={Hierarchical multimodal transformers for Multi-Page DocVQA},
  author={Tito, Rub{\`e}n and Karatzas, Dimosthenis and Valveny, Ernest},
  journal={arXiv preprint arXiv:2212.05935},
  year={2022}
}
```