# Pegasus for Paraphrasing

Pegasus model fine-tuned for paraphrasing.

## Model in Action 🚀
```
import torch
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

model_name = 'tuner007/pegasus_paraphrase'
torch_device = 'cuda' if torch.cuda.is_available() else 'cpu'
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name).to(torch_device)

def get_response(input_text, num_return_sequences):
    # prepare_seq2seq_batch is deprecated; calling the tokenizer directly is equivalent
    batch = tokenizer([input_text], truncation=True, padding='longest', max_length=60, return_tensors="pt").to(torch_device)
    translated = model.generate(**batch, max_length=60, num_beams=10, num_return_sequences=num_return_sequences, temperature=1.5)
    tgt_text = tokenizer.batch_decode(translated, skip_special_tokens=True)
    return tgt_text
```
#### Example 1:
```
context = "The ultimate test of your knowledge is your capacity to convey it to another."
get_response(context, 10)
# output:
['The test of your knowledge is your ability to convey it.',
 'The ability to convey your knowledge is the ultimate test of your knowledge.',
 'The ability to convey your knowledge is the most important test of your knowledge.',
 'Your capacity to convey your knowledge is the ultimate test of it.',
 'The test of your knowledge is your ability to communicate it.',
 'Your capacity to convey your knowledge is the ultimate test of your knowledge.',
 'Your capacity to convey your knowledge to another is the ultimate test of your knowledge.',
 'Your capacity to convey your knowledge is the most important test of your knowledge.',
 'The test of your knowledge is how well you can convey it.',
 'Your capacity to convey your knowledge is the ultimate test.']
```
#### Example 2: Question paraphrasing (the model was not trained on the Quora dataset)
```
context = "Which course should I take to get started in data science?"
get_response(context, 10)
# output:
['Which data science course should I take?',
 'Which data science course should I take first?',
 'Should I take a data science course?',
 'Which data science class should I take?',
 'Which data science course should I attend?',
 'I want to get started in data science.',
 'Which data science course should I enroll in?',
 'Which data science course is right for me?',
 'Which data science course is best for me?',
 'Which course should I take to get started?']
```
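As the outputs above show, beam search can return candidates that differ only trivially (casing, punctuation, or a swapped synonym). A minimal post-processing sketch that drops near-duplicates before presenting results — the `dedupe` helper below is illustrative only and not part of the model or its API:

```
def dedupe(paraphrases):
    """Keep the first of any paraphrases that differ only in case or trailing punctuation."""
    seen = set()
    unique = []
    for text in paraphrases:
        key = text.lower().rstrip('.!?')  # normalize for comparison only
        if key not in seen:
            seen.add(key)
            unique.append(text)  # original text is preserved in the output
    return unique

# Candidates shaped like those returned by get_response:
candidates = [
    'Which data science course should I take?',
    'which data science course should I take',
    'Which data science course should I take first?',
]
print(dedupe(candidates))
# → ['Which data science course should I take?', 'Which data science course should I take first?']
```

Stronger filtering (e.g. embedding similarity) is possible, but this string-level pass is often enough to tidy a `num_return_sequences=10` list.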
> Created by Arpit Rajauria
[![Twitter icon](https://cdn0.iconfinder.com/data/icons/shift-logotypes/32/Twitter-32.png)](https://twitter.com/arpit_rajauria)