MoritzLaurer committed
Commit e8602b0
1 Parent(s): 4ef15ac

added easier zero-shot code

Files changed (1)
  1. README.md +11 -2
README.md CHANGED
@@ -149,8 +149,17 @@ This multilingual model can perform natural language inference (NLI) on 100 lang
  As of December 2021, mDeBERTa-v3-base is the best performing multilingual base-sized transformer model introduced by Microsoft in [this paper](https://arxiv.org/pdf/2111.09543.pdf).
 
 
- ## Intended uses & limitations
- #### How to use the model
+ ### How to use the model
+ #### Simple zero-shot classification pipeline
+ ```python
+ from transformers import pipeline
+ classifier = pipeline("zero-shot-classification", model="MoritzLaurer/mDeBERTa-v3-base-mnli-xnli")
+ sequence_to_classify = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
+ candidate_labels = ["politics", "economy", "entertainment", "environment"]
+ output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
+ print(output)
+ ```
+ #### NLI use-case
  ```python
  from transformers import AutoTokenizer, AutoModelForSequenceClassification
  import torch
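
For context on the added snippet: the zero-shot pipeline call returns a dict with the keys `sequence`, `labels`, and `scores`, with labels sorted by descending score. The hunk also cuts the NLI use-case off right after its imports (only three lines of trailing context are shown). Below is a hedged sketch of how such an NLI call is typically completed with this checkpoint; the hypothesis string, variable names, and softmax/label mapping are illustrative assumptions, not the verbatim continuation of the README.

```python
# Hedged sketch only: the commit's trailing context ends after these imports, so the
# rest of this block shows one typical way to finish the NLI example.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_name = "MoritzLaurer/mDeBERTa-v3-base-mnli-xnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Premise reused from the pipeline example above; the hypothesis is an illustrative placeholder.
premise = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
hypothesis = "Angela Merkel is active in politics"

# NLI input is a (premise, hypothesis) pair encoded together.
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the three NLI logits to label names via the checkpoint's config.
probs = torch.softmax(logits[0], dim=-1)
print({model.config.id2label[i]: round(p.item(), 3) for i, p in enumerate(probs)})
```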