julien-c (HF staff) committed on
Commit eed8e81
1 Parent(s): 17bb8a0

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/google/roberta2roberta_L-24_gigaword/README.md

Files changed (1)
  1. README.md +39 -0
README.md ADDED
@@ -0,0 +1,39 @@
---
language: en
license: apache-2.0
datasets:
- gigaword
tags:
- summarization
---

# Roberta2Roberta_L-24_gigaword EncoderDecoder model

The model was introduced in [this paper](https://arxiv.org/abs/1907.12461) by Sascha Rothe, Shashi Narayan and Aliaksei Severyn, and first released in [this repository](https://tfhub.dev/google/bertseq2seq/roberta24_gigaword/1).

The model is an encoder-decoder model whose encoder and decoder were both initialized from the `roberta-large` checkpoint and then fine-tuned on headline generation using the Gigaword dataset linked above.

Disclaimer: The model card has been written by the Hugging Face team.

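For context, a RoBERTa-to-RoBERTa encoder-decoder of this kind can be warm-started in `transformers` with `EncoderDecoderModel.from_encoder_decoder_pretrained`. The snippet below is only a minimal sketch of that warm-starting step, with assumed generation settings filled in from the tokenizer's special tokens; it is not the training setup of the released checkpoint.

```python
# Sketch: warm-start a RoBERTa-to-RoBERTa encoder-decoder (illustrative only,
# not the exact configuration used to train google/roberta2roberta_L-24_gigaword).
from transformers import EncoderDecoderModel, RobertaTokenizer

# Both encoder and decoder start from roberta-large; the decoder additionally
# gets cross-attention layers and a causal attention mask.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "roberta-large", "roberta-large"
)
tokenizer = RobertaTokenizer.from_pretrained("roberta-large")

# Assumed generation-related settings so that model.generate() can run;
# the released checkpoint ships its own configuration.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id
```
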
## How to use

You can use this model for extreme summarization, *e.g.*

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/roberta2roberta_L-24_gigaword")
model = AutoModelForSeq2SeqLM.from_pretrained("google/roberta2roberta_L-24_gigaword")

# Gigaword-style article; numbers are masked as "#.#" in the dataset.
article = """australian shares closed down #.# percent monday
following a weak lead from the united states and
lower commodity prices , dealers said ."""

# Tokenize the article, generate a headline, and decode it back to text.
input_ids = tokenizer(article, return_tensors="pt").input_ids
output_ids = model.generate(input_ids)[0]
print(tokenizer.decode(output_ids, skip_special_tokens=True))
# should output
# australian shares close down #.# percent.
```
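
For multiple articles at once, batched generation also works. The snippet below is a minimal sketch that reuses the tokenizer and model loaded above; the second input and the beam-search settings are illustrative rather than values recommended by this card.

```python
# Batched headline generation (illustrative settings).
articles = [
    article,  # the example article from above
    """oil prices fell in asian trade tuesday on renewed concerns
about slowing global demand , dealers said .""",  # made-up, Gigaword-style input
]
inputs = tokenizer(articles, padding=True, truncation=True, return_tensors="pt")
output_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    num_beams=4,
    early_stopping=True,
)
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True))
```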