lysandre and amyeroberts committed
Commit 905a4b6
Parent: 397f71a

Update README.md (#25)


- Update README.md (ffe211cb9bfa2e6673b48eab2af621dcc1c20cda)


Co-authored-by: Amy Roberts <[email protected]>

Files changed (1):
README.md  +4 -4
README.md CHANGED
@@ -55,8 +55,8 @@ You can use this model directly with a pipeline for text generation.
 >>> from transformers import pipeline
 
 >>> generator = pipeline('text-generation', model="facebook/opt-2.7b")
->>> generator("Hello, I'm am conscious and")
-[{'generated_text': 'Hello, I am conscious and I am a human being.\nI am a human being, and'}]
+>>> generator("What are we having for dinner?")
+[{'generated_text': 'What are we having for dinner?\nI'm thinking pizza.\nI'm thinking tacos.\n'}]
 ```
 
 By default, generation is deterministic. In order to use the top-k sampling, please set `do_sample` to `True`.
@@ -66,8 +66,8 @@ By default, generation is deterministic. In order to use the top-k sampling, ple
 
 >>> set_seed(32)
 >>> generator = pipeline('text-generation', model="facebook/opt-2.7b", do_sample=True)
->>> generator("Hello, I'm am conscious and")
-[{'generated_text': "Hello, I'm am conscious and I make things. I'm in the creative community, which is"}]
+>>> generator("What are we having for dinner?")
+[{'generated_text': "What are we having for dinner?\nJust pizza?\nWell, I suppose that would suffice."}]
 ```
 
 ### Limitations and bias
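
For reference, the two README snippets touched by this commit can be run end to end roughly as follows. This is a minimal sketch that assumes a local `transformers` install with access to the `facebook/opt-2.7b` weights; the exact generated text will differ across library versions, seeds, and environments.

```python
# Minimal sketch of the updated README examples (assumption: transformers and the
# facebook/opt-2.7b weights are available; outputs will vary across environments).
from transformers import pipeline, set_seed

# Deterministic (greedy) generation, as in the first snippet of the diff.
generator = pipeline('text-generation', model="facebook/opt-2.7b")
print(generator("What are we having for dinner?"))

# Top-k sampling: pass do_sample=True and fix a seed for repeatable sampling.
set_seed(32)
sampled_generator = pipeline('text-generation', model="facebook/opt-2.7b", do_sample=True)
print(sampled_generator("What are we having for dinner?"))
```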