If you use the model, please consider citing the paper:
@misc{bhargava2021generalization,
title={Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics},
author={Prajjwal Bhargava and Aleksandr Drozd and Anna Rogers},
year={2021},
eprint={2110.01518},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
The original implementation and more information can be found in this GitHub repository.
RoBERTa-large fine-tuned on MNLI.
| Task    | Accuracy (%) |
|---------|--------------|
| MNLI    | 90.15        |
| MNLI-mm | 90.02        |
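
A minimal inference sketch with the transformers library (not part of the original card). The checkpoint id prajjwal1/roberta-large-mnli is taken from the list below, and the example sentence pair is illustrative; the label order varies between MNLI checkpoints, so verify it against the model config.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed checkpoint id; swap in any of the MNLI checkpoints listed below.
model_name = "prajjwal1/roberta-large-mnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Encode the premise/hypothesis pair as a single sequence-pair input.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# The mapping of indices to entailment/neutral/contradiction depends on the
# checkpoint; model.config.id2label is the authoritative source.
pred = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred, pred))
```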
You can also check out:

- prajjwal1/roberta-base-mnli
- prajjwal1/roberta-large-mnli
- prajjwal1/albert-base-v2-mnli
- prajjwal1/albert-base-v1-mnli
- prajjwal1/albert-large-v2-mnli