---
license: apache-2.0
tags:
- mdeberta-v3-base
- text-classification
- nli
- natural-language-inference
- multilingual
- multitask
- multi-task
- pipeline
- extreme-multi-task
- extreme-mtl
- tasksource
- zero-shot
- rlhf
datasets:
- xnli
- metaeval/xnli
- americas_nli
- MoritzLaurer/multilingual-NLI-26lang-2mil7
- stsb_multi_mt
- paws-x
- miam
- strombergnlp/x-stance
- tyqiangz/multilingual-sentiments
- metaeval/universal-joy
- amazon_reviews_multi
- cardiffnlp/tweet_sentiment_multilingual
- strombergnlp/offenseval_2020
- offenseval_dravidian
- nedjmaou/MLMA_hate_speech
- xglue
- ylacombe/xsum_factuality
- metaeval/x-fact
- pasinit/xlwic
- tasksource/oasst1_dense_flat
- papluca/language-identification
- wili_2018
- exams
- xcsr
- xcopa
- juletxara/xstory_cloze
- Anthropic/hh-rlhf
- universal_dependencies
- tasksource/oasst1_pairwise_rlhf_reward
- OpenAssistant/oasst1
language:
- multilingual
- zh
- ja
- ar
- ko
- de
- fr
- es
- pt
- hi
- id
- it
- tr
- ru
- bn
- ur
- mr
- ta
- vi
- fa
- pl
- uk
- nl
- sv
- he
- sw
- ps
pipeline_tag: zero-shot-classification
---
# Model Card for mDeBERTa-v3-base-tasksource-nli
Multilingual [mDeBERTa-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) with 30k steps of multi-task training on mtasksource.

This model can be used as a stable starting point for further fine-tuning, as a zero-shot NLI model, or in a zero-shot classification pipeline.
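For example, here is a minimal zero-shot sketch using the standard `transformers` pipeline API; the input text and candidate labels are illustrative:

```python
from transformers import pipeline

# Load the model as a zero-shot classifier; the NLI head scores
# entailment between the input text and each candidate label.
classifier = pipeline(
    "zero-shot-classification",
    model="sileod/mdeberta-v3-base-tasksource-nli",
)

result = classifier(
    "Angela Merkel ist eine Politikerin in Deutschland",
    candidate_labels=["politics", "economy", "sports"],
)
print(result["labels"][0])  # highest-scoring label
```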
In addition, you can use the provided adapters to directly load a model for hundreds of tasks:
```python
!pip install tasknet tasksource -q
import tasknet as tn

# Load the shared encoder with a task-specific adapter,
# here Spanish dialog-act classification (miam/dihana).
pipe = tn.load_pipeline(
    'sileod/mdeberta-v3-base-tasksource-nli',
    'miam/dihana')
pipe(['si', 'como esta?'])
```
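To use the checkpoint as a starting point for further fine-tuning, it can be loaded with the standard `transformers` classes. A minimal sketch, assuming a downstream task whose label count (here 4) is a placeholder:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sileod/mdeberta-v3-base-tasksource-nli")
model = AutoModelForSequenceClassification.from_pretrained(
    "sileod/mdeberta-v3-base-tasksource-nli",
    num_labels=4,                  # placeholder: set to your task's label count
    ignore_mismatched_sizes=True,  # replace the pretrained NLI head
)
```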
## Software
https://github.com/sileod/tasksource/
https://github.com/sileod/tasknet/
## Contact and citation
For help integrating tasksource into your experiments, please contact [email protected].
For more details, refer to this article:
```bib
@article{sileo2023tasksource,
  title={tasksource: Structured Dataset Preprocessing Annotations for Frictionless Extreme Multi-Task Learning and Evaluation},
  author={Sileo, Damien},
  url={https://arxiv.org/abs/2301.05948},
  journal={arXiv preprint arXiv:2301.05948},
  year={2023}
}
```