com3dian committed on
Commit
7ed6c7d
1 Parent(s): 4bfecce

Update README.md

Files changed (1)
  1. README.md +6 -6
README.md CHANGED
@@ -36,7 +36,7 @@ This particular model, Bart-Large, is the larger version of the Bart model. It c
 To use this model, you can leverage the Hugging Face [Transformers](https://huggingface.co/transformers/) library. Here's an example of how to use it in Python:
 
 ```python
-from transformers import BartTokenizer, BartForConditionalGeneration
+from transformers import BartTokenizer, BartForConditionalGeneration, pipeline
 
 # Load the model and tokenizer
 model_name = "com3dian/Bart-large-paper2slides-expander"
@@ -49,13 +49,13 @@ input_ids = tokenizer.encode(input_text, return_tensors="pt")
 output = model.generate(input_ids)
 
 # Decode generated summaries
-summary = tokenizer.decode(output[0], skip_special_tokens=True)
-print(summary)
+expanded_text = tokenizer.decode(output[0], skip_special_tokens=True)
+print(expanded_text)
 
 # Or using the pipeline API
-summarizer = pipeline("summarization", model=model_name)
-summary = summarizer(text, max_length=50, min_length=30, do_sample=False)
-print(summary)
+expander = pipeline("text2text-generation", model=model_name)
+expanded_text = expander(input_text, max_length=50, min_length=30, do_sample=False)
+print(expanded_text)
 ```
 
 Ensure you have the `transformers` library installed before running the code. You can install it using `pip`:
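For reference, here is a minimal, self-contained sketch that assembles the updated snippet from this commit. The `from_pretrained` loading calls and the example `input_text` are assumptions added for illustration; only the import, decoding, and pipeline lines come directly from the diff:

```python
from transformers import BartTokenizer, BartForConditionalGeneration, pipeline

# Load the model and tokenizer
model_name = "com3dian/Bart-large-paper2slides-expander"
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Example input text (placeholder; any short slide-style sentence works)
input_text = "The model expands condensed slide text into full sentences."

# Tokenize and generate the expanded text
input_ids = tokenizer.encode(input_text, return_tensors="pt")
output = model.generate(input_ids)

# Decode the generated expansion
expanded_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(expanded_text)

# Or using the pipeline API
expander = pipeline("text2text-generation", model=model_name)
expanded_text = expander(input_text, max_length=50, min_length=30, do_sample=False)
print(expanded_text)
```

The switch from the `summarization` pipeline to `text2text-generation`, and the renaming of `summary`/`summarizer` to `expanded_text`/`expander`, bring the example in line with the model's actual task of expanding slide text rather than summarizing it.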