Tags: Text Generation · Transformers · PyTorch · Chinese · llama · text-generation-inference · Inference Endpoints
fzmnm committed
Commit 241e4d7 (1 parent: 4fbd199)

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED

@@ -42,7 +42,7 @@ from transformers import pipeline
 
 generator = pipeline('text-generation', model='fzmnm/TinyStoriesAdv_92M')
 story_prompt = "问:什么是鹦鹉?"
-generated_story = generator(story_prompt, max_length=256)
+generated_story = generator(story_prompt, max_length=256, truncation=True)
 
 print(generated_story[0]['generated_text'])
 ```
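For reference, here is a runnable sketch of the updated README example once this diff is applied. The `from transformers import pipeline` line comes from the hunk header context above; the rest matches the `+` side of the diff, and the comments are added here for explanation (the note on what `truncation=True` does is the presumed intent, since the commit message is just "Update README.md").

```python
from transformers import pipeline

# Build a text-generation pipeline for the model in this repo.
generator = pipeline('text-generation', model='fzmnm/TinyStoriesAdv_92M')

story_prompt = "问:什么是鹦鹉?"  # "Q: What is a parrot?"

# truncation=True is the parameter added in this commit; it is forwarded to the
# tokenizer so over-long prompts are truncated to the model's input limit
# instead of triggering a truncation warning (presumed intent).
generated_story = generator(story_prompt, max_length=256, truncation=True)

print(generated_story[0]['generated_text'])
```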