# gpt2-large-babi
---
datasets:
  - facebook/babi_qa
---

Fine-tune and evaluate a transformer model on Facebook's bAbI tasks.

Paper: *Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks*

Training code: p208p2002/bAbi-tasks-with-transformer-model

| task_no | task_name              | score |
|---------|------------------------|-------|
| qa1     | single-supporting-fact | 100   |
| qa2     | two-supporting-facts   | 99.4  |
| qa3     | three-supporting-facts | 62.0  |
| qa4     | two-arg-relations      | 100   |
| qa5     | three-arg-relations    | 96.5  |
| qa6     | yes-no-questions       | 100   |
| qa7     | counting               | 100   |
| qa8     | lists-sets             | 99.8  |
| qa9     | simple-negation        | 100   |
| qa10    | indefinite-knowledge   | 100   |
| qa11    | basic-coreference      | 100   |
| qa12    | conjunction            | 100   |
| qa13    | compound-coreference   | 100   |
| qa14    | time-reasoning         | 100   |
| qa15    | basic-deduction        | 100   |
| qa16    | basic-induction        | 100   |
| qa17    | positional-reasoning   | 100   |
| qa18    | size-reasoning         | 100   |
| qa19    | path-finding           | 100   |
| qa20    | agents-motivations     | 100   |
# Please use with the following template

```python
INPUT_TEMPLATE = """
Context:
{context}

Question:
{question}

Answer:
{answer}
"""

input_text = INPUT_TEMPLATE.format_map({
    "context": context,
    "question": question,
    "answer": answer,
}).strip()
```
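At inference time the same template can be filled with an empty answer slot, so that the prompt ends at `Answer:` and the model completes the answer text. A minimal sketch of the prompt-building step only (the `build_prompt` helper and the example context/question are illustrative, not part of this repo):

```python
# Same template as above, reused to build an inference prompt.
INPUT_TEMPLATE = """
Context:
{context}

Question:
{question}

Answer:
{answer}
"""


def build_prompt(context: str, question: str) -> str:
    # Fill the answer slot with an empty string; after strip() the prompt
    # ends exactly at "Answer:", which is where the model should continue.
    return INPUT_TEMPLATE.format_map({
        "context": context,
        "question": question,
        "answer": "",
    }).strip()


prompt = build_prompt(
    context="Mary moved to the bathroom. John went to the hallway.",
    question="Where is Mary?",
)
print(prompt.endswith("Answer:"))  # → True
```

The resulting string can then be passed to the model's tokenizer and `generate` call as usual; the text generated after `Answer:` is the predicted answer.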