# AgRoBERTa for QA
This is the `roberta-ft-on-agextcorpus` model, further fine-tuned in two stages:
- (1) with the @ukp/roberta-base_qa_squad1_pfeiffer adapter (parameter-efficient fine-tuning, PEFT), and
- (2) on the AgXQA v1 dataset.

It was trained on question-answer pairs, including unanswerable questions, for the extractive QA task.
## Overview
- Base language model: msu-ceco/roberta-ft-on-agextcorpus-2023-12-10_v2
- Adapter: AdapterHub/roberta-base_qa_squad1_pfeiffer
- Language: English
- Downstream task: Extractive QA
- Training data: AgXQA v1
- Evaluation data: AgXQA v1
- Infrastructure: Google Colab V100 (High-RAM)
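As a rough illustration of what the extractive QA head does at inference time, the sketch below selects the highest-scoring answer span from per-token start and end scores, and falls back to "no answer" when a no-answer score wins, mirroring SQuAD-style handling of unanswerable questions. This is a simplified, self-contained sketch, not the model's actual decoding code; the function name, score lists, and `no_answer_score` parameter are illustrative assumptions.

```python
def extract_answer(context_tokens, start_scores, end_scores,
                   no_answer_score=0.0, max_answer_len=15):
    """Pick the best (start, end) span from per-token scores.

    Returns the answer text, or None when no span beats the
    no-answer score (i.e. the question is judged unanswerable).
    NOTE: illustrative sketch only, not the model's real decoder.
    """
    best_span = None
    best_score = no_answer_score  # unanswerable unless a span beats this
    for i, s_start in enumerate(start_scores):
        # Only consider spans that end at or after their start,
        # capped at max_answer_len tokens.
        for j in range(i, min(i + max_answer_len, len(end_scores))):
            score = s_start + end_scores[j]
            if score > best_score:
                best_score = score
                best_span = (i, j)
    if best_span is None:
        return None
    start, end = best_span
    return " ".join(context_tokens[start:end + 1])


# Hypothetical scores for a four-token context.
tokens = ["apply", "lime", "in", "autumn"]
answer = extract_answer(tokens,
                        start_scores=[0.1, 2.0, 0.0, 0.0],
                        end_scores=[0.0, 1.5, 0.0, 0.0])
```

With these toy scores the span covering "lime" wins; raising `no_answer_score` above every span score makes the function return `None`, which is how unanswerable AgXQA questions would be surfaced.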