# Model Overview
This is an ELECTRA-Large QA model fine-tuned from https://huggingface.co/google/electra-large-discriminator in two stages. First, it is trained on synthetic adversarial data generated using a BART-Large question generator, and it is then further fine-tuned on SQuAD and AdversarialQA in a second stage.
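
As a minimal usage sketch, the model can be loaded with the Transformers question-answering pipeline. The model identifier below is a placeholder; replace it with this repository's ID on the Hugging Face Hub.

```python
from transformers import pipeline

# Hypothetical identifier: substitute the Hub ID of this repository.
model_id = "YOUR_USERNAME/electra-large-qa"

qa = pipeline("question-answering", model=model_id, tokenizer=model_id)

result = qa(
    question="What generator was used to create the synthetic data?",
    context="The synthetic adversarial training data was generated with a BART-Large question generator.",
)
print(result["answer"], result["score"])
```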
# Data
Training data: SQuAD + AdversarialQA
Evaluation data: SQuAD + AdversarialQA
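
For reference, below is a minimal sketch of assembling this training mix with the `datasets` library, assuming the public `squad` and `adversarial_qa` (config `adversarialQA`) Hub datasets correspond to the data named above.

```python
from datasets import concatenate_datasets, load_dataset

# Assumed Hub dataset IDs for the two sources listed above.
squad = load_dataset("squad", split="train")
aqa = load_dataset("adversarial_qa", "adversarialQA", split="train")

# Keep only the SQuAD-style columns shared by both datasets, align feature
# types, and concatenate into a single training set.
aqa = aqa.remove_columns([c for c in aqa.column_names if c not in squad.column_names])
combined = concatenate_datasets([squad, aqa.cast(squad.features)])
print(len(squad), len(aqa), len(combined))
```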
# Training Process
Approx. 1 training epoch on the synthetic data and 2 training epochs on the manually-curated data.
# Additional Information
Please refer to https://arxiv.org/abs/2104.08678 for full details. You can interact with the model on Dynabench here: https://dynabench.org/models/109